4. The Ollama Course - Using the CLI

Matt Williams

4 months ago

10,208 views

Comments:

@ABatorfi - 14.08.2024 02:23

Thank you for this awesome course, I'm enjoying it!

@romulopontual6254 - 14.08.2024 02:47

Very nice! Thank you.

@federicoarg00 - 14.08.2024 03:05

this is amazing, super clear, thank you!

@shuntera - 14.08.2024 04:45

Love this. I think we've all been doing just-in-time learning to run it and keep up to date with what's happening every couple of weeks. Great to take it back to the foundations, Matt.

@vulcan4d - 14.08.2024 05:13

Removing models is the most annoying part because you have to name them exactly. I wish they made it easier: just select and delete via a GUI, or list the models and remove one by a number.
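In the meantime, "remove by number" can be scripted on top of the CLI. A minimal sketch, assuming `ollama list` prints a header row followed by one model per line with the name in the first column (the helper names below are made up for illustration, not part of Ollama):

```python
import subprocess

def parse_model_names(listing: str) -> list[str]:
    """Pull model names out of `ollama list` output: skip the header
    row, take the first whitespace-separated column of each line."""
    rows = [ln for ln in listing.strip().splitlines() if ln.strip()]
    return [row.split()[0] for row in rows[1:]]

def rm_command(names: list[str], number: int) -> str:
    """Build the `ollama rm` command for a 1-based menu number."""
    return f"ollama rm {names[number - 1]}"

def interactive_remove() -> None:
    """Show a numbered menu of installed models and remove the chosen one."""
    listing = subprocess.run(
        ["ollama", "list"], capture_output=True, text=True
    ).stdout
    names = parse_model_names(listing)
    for i, name in enumerate(names, 1):
        print(f"{i}. {name}")
    choice = int(input("Remove which number? "))
    subprocess.run(rm_command(names, choice).split())
```

Calling `interactive_remove()` then gives a pick-by-number prompt, so you never retype a tag like `llama3.1:8b-instruct-q4_K_M` by hand.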

@NLPprompter - 14.08.2024 05:24

Ah... the ollama serve... LOL. I wasted a week until I realized it was a user issue on Linux. I felt so stupid having duplicate models and things... This is a really good video; anyone new to Ollama should watch it. If I had watched this first, I wouldn't have wasted a week just to realize it was a simple user issue.

@build.aiagents - 14.08.2024 05:41

More cool stuff please!

@sammcj2000 - 14.08.2024 06:08

I wouldn't recommend creating models with the legacy Q4_0 quant types; they're deprecated and worse quality than K quants (or IQ quants if you're running with CUDA).
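For anyone quantizing their own GGUFs, llama.cpp's quantize tool can emit a K quant instead. A sketch, assuming you have llama.cpp built and an f16 GGUF on hand (the filenames are placeholders, and the binary is called plain `quantize` in older builds):

```shell
# Quantize an f16 GGUF to Q4_K_M (a K quant) instead of legacy Q4_0.
./llama-quantize my-model-f16.gguf my-model-Q4_K_M.gguf Q4_K_M

# Then point a Modelfile at the K-quant file and create the model.
echo "FROM ./my-model-Q4_K_M.gguf" > Modelfile
ollama create my-model -f Modelfile
```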

@PBrioschi - 14.08.2024 06:15

Hi Matt, thank you for another piece of amazing content.
I'm working with Ollama and other tools available from the community to develop some solutions for my company.
I need help from a professional consultant for this job.
Could you work with me, or maybe recommend a person who can help me do it?

@marianoarganaraz - 14.08.2024 06:58

I love the way you explain. Thanks

@artur50 - 14.08.2024 07:46

Excellent content Matt! Congrats! Keep on going.

@JNET_Reloaded - 14.08.2024 12:36

Anyone wanna swap code for tokens?

@JNET_Reloaded - 14.08.2024 12:43

Where do you run that command to download a Hugging Face model? And where does it download to? The same location as the other models? Where is that?

@jimlynch9390 - 14.08.2024 12:46

I'm really enjoying this series. Thanks.

@AliAlias - 14.08.2024 14:01

🙏🙏🙏 Please, how do I add the vision model MiniCPM-V 2.6 to Ollama?
openbmb/MiniCPM-V-2_6-gguf
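Not Matt, but the generic GGUF-import flow looks like this. A sketch with an assumed filename (use whichever quant you actually downloaded from the openbmb/MiniCPM-V-2_6-gguf repo); note that vision models like MiniCPM-V also ship a separate projector (mmproj) file, and a plain single-file import may not wire up the vision part, so check whether the official Ollama library already carries a minicpm-v build first:

```shell
# Write a minimal Modelfile pointing at the downloaded GGUF.
cat > Modelfile <<'EOF'
FROM ./ggml-model-Q4_K_M.gguf
EOF

# Register it under a local name, then run it.
ollama create minicpm-v2.6 -f Modelfile
ollama run minicpm-v2.6
```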

@MichaPiotrKozowski - 14.08.2024 15:21

Thank you!

@derekf1111 - 14.08.2024 15:44

I love your videos! Your explanations are amazing, thank you!

@fabriai - 14.08.2024 16:36

Wonderful video, Matt. Thanks so much for sharing this.

@ISK_VAGR - 14.08.2024 16:51

Man, I love it. I already subscribed. Something I'd really love to know is how to store my local Ollama models on an external hard drive on a Mac. As you know, Macs don't have much space, so I bought a special hard drive that runs at 40G/sec to hold models and other stuff, and I'd love to have the models there rather than on my internal hard drive. Thanks for the great content and explanations.
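Ollama reads the OLLAMA_MODELS environment variable to locate its model store, so pointing it at an external volume should work. A sketch for macOS, assuming zsh and a volume named FastSSD (adjust both to your setup):

```shell
# Move the existing store to the external drive once.
mkdir -p /Volumes/FastSSD/ollama
mv ~/.ollama/models /Volumes/FastSSD/ollama/models

# Point Ollama at the new location, now and in future shells.
export OLLAMA_MODELS="/Volumes/FastSSD/ollama/models"
echo 'export OLLAMA_MODELS="/Volumes/FastSSD/ollama/models"' >> ~/.zshrc

# Restart the server so it picks up the variable.
ollama serve
```

If you launch Ollama from the macOS menu-bar app rather than a shell, the variable may need to be set with launchctl setenv instead, since GUI apps don't read your shell profile.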

@mpesakapoeta - 14.08.2024 19:38

Please share the link to the video about reducing the model size for specific tasks. For example, only weather: you wouldn't need the whole context for that.

@pythonantole9892 - 15.08.2024 11:06

I have a noob question. If anybody can upload a model to Ollama, is it possible for a malicious user to upload malware disguised as a model? And are there measures to prevent such a scenario?

@engineermajidd - 21.08.2024 02:33

Looking forward to your next video.

@AricRastley - 23.09.2024 07:27

Exactly what I was looking for! THANK YOU!

@jjolla6391 - 15.11.2024 22:03

thanks, m :)

@hassanaoude5668 - 18.11.2024 16:47

I'm trying to use ollama serve to integrate my app with Ollama, but a lot of functionality isn't working. Using serve I can list, pull, and rm models, but when I try to load a model into memory via the API, or just run a model in the terminal, it crashes. Based on the log, it just says Ollama crashed and restarted. For now I'm starting the full application with ollama app.exe to unblock development, but I can't find any documentation to help troubleshoot the issue. I would really appreciate a video just about the ollama serve command and how you'd use it.

@brandonmelloy2387 - 24.11.2024 22:20

For those using a Mac: type nano Modelfile in the command line to create the Modelfile.
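Once the editor is open, a minimal Modelfile only needs a FROM line; the extra directives below are optional, and the base model, parameter value, and local name are just example choices. You can also skip the editor entirely with a heredoc:

```shell
# Write a small Modelfile: base model, a sampling parameter,
# and a system prompt.
cat > Modelfile <<'EOF'
FROM llama3.1
PARAMETER temperature 0.3
SYSTEM You are a concise assistant.
EOF

# Build a local model from it and try it out.
ollama create concise-llama -f Modelfile
ollama run concise-llama
```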

@AnandVimalanathan - 17.12.2024 17:17

Thanks
