should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: mentally impaired piazza
no lol, there's still a huge disparity in the speed and quality of frontier models vs anything you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool

self-hosted llms are great for niche purposes (like they can basically solve the problem of your OS's built-in search being Awful), but the models for those jobs are small enough that you won't need much hardware to run them.
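to make the point concrete, here's a minimal sketch of the search use case, not anyone's actual setup: it assumes the sentence-transformers package and its small all-MiniLM-L6-v2 embedding model (~80 MB, runs fine on CPU), and the ~/Documents path and query string are made-up examples:

```python
# minimal sketch: local semantic search over text files with a tiny
# self-hosted embedding model. everything here runs on CPU -- the model
# is ~80 MB, nowhere near a $50k rig.
# assumptions: sentence-transformers is installed; the ~/Documents path
# and the query string below are made-up examples.
from pathlib import Path

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small local model

# index: embed the first ~2k chars of every .txt under ~/Documents
paths = list(Path.home().glob("Documents/**/*.txt"))
texts = [p.read_text(errors="ignore")[:2000] for p in paths]
doc_emb = model.encode(texts, convert_to_tensor=True)

# query: embed the search phrase, then rank files by cosine similarity
query = "that tax memo from last spring"
query_emb = model.encode(query, convert_to_tensor=True)

for hit in util.semantic_search(query_emb, doc_emb, top_k=5)[0]:
    print(f"{paths[hit['corpus_id']]}  score={hit['score']:.3f}")
```

point being: the whole index fits in RAM on a normal laptop, no 50k required.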
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2,#49516844)