should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: Thriller university jew
no lol, there is still a huge disparity in speed/quality between frontier models and what can be self-hosted. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted LLMs are great for niche purposes (like they can basically solve the problem of OS search being awful), but the models for those jobs are going to be so small you won't need much hardware to run them.
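to illustrate the point: here's a minimal sketch of that kind of local semantic file search, assuming sentence-transformers and the tiny all-MiniLM-L6-v2 embedding model (both illustrative choices, not a specific recommendation). nothing in it needs anything close to a $50k rig:

```python
# Minimal sketch: local semantic file search with a small embedding model.
# Assumes sentence-transformers is installed; all-MiniLM-L6-v2 (~80 MB)
# runs fine on an ordinary CPU.
from pathlib import Path

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Toy corpus: plain-text files under the home directory (illustrative glob).
paths = list(Path.home().glob("**/*.txt"))
docs = [p.read_text(errors="ignore")[:2000] for p in paths]
doc_emb = model.encode(docs, convert_to_tensor=True)

def search(query: str, top_k: int = 5):
    """Return the top_k files most semantically similar to the query."""
    q_emb = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(q_emb, doc_emb, top_k=top_k)[0]
    return [(paths[h["corpus_id"]], h["score"]) for h in hits]

# e.g. search("tax documents from last year")
```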
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2betting#49516844)