The HARD Truth About Hosting Your Own LLMs

Cole Medin

1 month ago

13,275 views

Hosting your own LLMs like Llama 3.1 requires INSANELY good hardware - often making it completely unrealistic to run them yourself. But in this video I reveal a strategy for starting cheap with self-hostable LLMs and then scaling with them infinitely as your app/business grows...

00:00 - 02:58 - The Problem with Local LLMs
02:59 - 03:35 - The Strategy for Local LLMs
03:36 - 08:02 - Exploring Groq's Amazingness
08:03 - 13:59 - The Groq to Local LLM Quick Maths
14:00 - 14:43 - Outro
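The "quick maths" chapter boils down to comparing a pay-per-token API (like Groq) against a fixed-price GPU rental (like RunPod). A minimal sketch of that break-even calculation is below - all prices are hypothetical placeholders I picked for illustration, not real quotes from either service, so plug in current rates before relying on it:

```python
# Hedged sketch: when does renting a GPU beat paying per token?
# PRICE_PER_M and GPU_MONTHLY are made-up example numbers, not real pricing.

def monthly_api_cost(tokens_per_month: float, price_per_million: float) -> float:
    """Monthly cost of serving via a metered, pay-per-token API."""
    return tokens_per_month / 1_000_000 * price_per_million

def breakeven_tokens(gpu_monthly_cost: float, price_per_million: float) -> float:
    """Monthly token volume at which a fixed-price GPU rental
    becomes cheaper than paying per token."""
    return gpu_monthly_cost / price_per_million * 1_000_000

if __name__ == "__main__":
    PRICE_PER_M = 0.60   # $/1M tokens - hypothetical API rate
    GPU_MONTHLY = 950.0  # $/month     - hypothetical GPU rental

    print(f"API cost at 100M tokens/mo: ${monthly_api_cost(100e6, PRICE_PER_M):.2f}")
    print(f"Break-even: {breakeven_tokens(GPU_MONTHLY, PRICE_PER_M):,.0f} tokens/mo")
```

The takeaway matches the video's strategy: at low volume the metered API is far cheaper, and only past the break-even volume does self-hosting on rented hardware start to pay off.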

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Services I mentioned in this video (I am not sponsored by any of them):

Groq: https://groq.com/
RunPod: https://www.runpod.io/
DigitalOcean: https://www.digitalocean.com/

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Artificial Intelligence is no doubt the future of not just software development but the whole world. And I'm on a mission to master it - focusing first on mastering AI Agents.

Join me as I push the limits of what is possible with AI. I'll be uploading videos at least twice a week - Sundays and Wednesdays at 7:00 PM CDT - covering everything AI with a focus on insane, practical educational value. I'll also sometimes post on Fridays at 7:00 PM CDT, specifically for platform showcases - sometimes sponsored, always creative in approach!

Tags:

#ai #artificial_intelligence #ai_agents #software_engineering #software_development #coding #automation #saas #development #local_ai #local_llms #ai_cost #local_ai_cost #self-hosting_llms #llama_3.1 #llama_3.1_70b #llama_3.1_8b #price_of_hosting_llms #mixtral #mistral #falcon_llm #open_source_llms #groq #groq_pricing #groq_speed #fast_llm #llm_speed