should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM
Author: prospero ano y luicidad
no lol, there's still a huge disparity in speed/quality between frontier models and what you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted LLMs are great for niche purposes (they can basically solve the problem of searching your OS being awful), but the models for those jobs are going to be so small you won't need much hardware to run them.
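rough sketch of that search use case, assuming the sentence-transformers package and a tiny embedding model (all-MiniLM-L6-v2, ~80MB, runs fine on CPU); the ~/notes path is just a placeholder, not a real recommendation of a stack:

from pathlib import Path
from sentence_transformers import SentenceTransformer, util

# tiny embedding model -- no $50k rig required
model = SentenceTransformer("all-MiniLM-L6-v2")

# index some local text files (placeholder directory)
docs = {p: p.read_text(errors="ignore")[:2000]
        for p in Path("~/notes").expanduser().glob("*.txt")}
paths = list(docs)
doc_emb = model.encode([docs[p] for p in paths], convert_to_tensor=True)

def search(query, k=5):
    # cosine-similarity search over the indexed files
    q_emb = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(q_emb, doc_emb, top_k=k)[0]
    return [(str(paths[h["corpus_id"]]), round(float(h["score"]), 3)) for h in hits]

print(search("that note about the lease dispute"))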
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2,#49516844)
Date: December 17th, 2025 2:06 PM
Author: https://i.imgur.com/chK2k5a.jpeg
For coding there's no reason to do it locally. You need a pile of extra RAM just for the long-ass context windows coding requires.
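back-of-envelope numbers on the context window point, assuming roughly Llama-3-70B-like architecture figures (80 layers, 8 KV heads via GQA, head dim 128, fp16 cache); exact sizes vary by model and quantization:

# memory for the KV cache alone, on top of the model weights
layers, kv_heads, head_dim = 80, 8, 128   # assumed 70B-class figures
ctx_tokens = 128_000                      # long coding session
bytes_per_elem = 2                        # fp16 cache

kv_bytes = 2 * layers * kv_heads * head_dim * ctx_tokens * bytes_per_elem  # 2 = K and V
print(f"{kv_bytes / 2**30:.1f} GiB of KV cache")   # ~39 GiB at full context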
For anything else, there are good reasons to keep it off the cloud and not have it linked to your credit card.
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2,#49516971)