The most prestigious law school admissions discussion board in the world.

we can't ban open source AI so we'll remove the ability to run it locally


Date: December 17th, 2025 9:16 AM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516104)

Date: December 17th, 2025 1:26 PM
Author: Cordovan mildly autistic site dog poop

are you talking about the RAM shortage

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516818)

Date: December 17th, 2025 1:31 PM
Author: lime codepig mediation

GPUs, RAM, SSDs



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516836)

Date: December 17th, 2025 1:32 PM
Author: vibrant pale pervert church

it's over

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516841)

Date: December 17th, 2025 1:36 PM
Author: Ruby harsh heaven



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516856)

Date: December 17th, 2025 9:22 AM
Author: federal national

if i flicked nickels at your deformed oblong skull what kind of noise would it make

like a "ping" or would it be more of a squelch

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516121)

Date: December 17th, 2025 9:25 AM
Author: lime codepig mediation

why don't you "ping" your pencil neck you weird little freak



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516133)

Date: December 17th, 2025 9:40 AM
Author: federal national

my neck is huge faggot how bout i rape you with it? show armpit

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516174)

Date: December 17th, 2025 9:52 AM
Author: lime codepig mediation

lol what a fag

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516208)

Date: December 17th, 2025 9:23 AM
Author: vibrant pale pervert church

the commercial hardware rugpull happening right now is extremely radicalizing

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516126)

Date: December 17th, 2025 9:24 AM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516127)

Date: December 17th, 2025 9:44 AM
Author: vibrant pale pervert church

seriously this shit is very concerning

i thought this would happen at some point in the future but it's happening already. things are accelerating very fast

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516186)

Date: December 17th, 2025 11:25 AM
Author: lime codepig mediation

NSAM was right

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516456)

Date: December 17th, 2025 11:29 AM
Author: know-it-all dull stage

name a time NSAM was wrong, chud

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516465)

Date: December 17th, 2025 11:33 AM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516483)

Date: December 17th, 2025 1:09 PM
Author: Ruby harsh heaven

I’m on full fucking tilt rn

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516748)

Date: December 17th, 2025 9:27 AM
Author: know-it-all dull stage

This is exactly why I set up a local lm studio, ollama and my own custom build that uses publicly available llms
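
For reference, a "custom build" like that can just be a script pointed at whatever local server LM Studio or Ollama exposes. A minimal sketch in Python, assuming LM Studio's built-in server is running on its default port 1234 with a model already loaded (the model string below is a placeholder):

import requests

# Minimal sketch: query a local LM Studio server through its OpenAI-compatible
# /v1/chat/completions endpoint. Assumes the server is running on the default
# port 1234 with a model already loaded; adjust host, port, and model to taste.
resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; the server answers with whatever model is loaded
        "messages": [{"role": "user", "content": "Why does local inference matter?"}],
        "temperature": 0.7,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

The same request shape generally works against an Ollama or llama.cpp server too, since they also expose OpenAI-style endpoints.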

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516143)

Date: December 17th, 2025 9:28 AM
Author: lime codepig mediation

18000

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516147)

Date: December 17th, 2025 9:41 AM
Author: spruce roast beef

how's the performance?

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516178)

Date: December 17th, 2025 9:45 AM
Author: know-it-all dull stage

completely fine on 30B, shit after

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516189)

Date: December 17th, 2025 9:52 AM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516210)

Date: December 17th, 2025 11:58 AM
Author: Mischievous crotch

Is it a Mac? How many GBs of ram do you use?

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516550)

Date: December 17th, 2025 12:04 PM
Author: know-it-all dull stage

im on a 5090 w a 9950X3D but im currently in the process of making a pretty powerful frankenstein machine with a bunch of gpus i got free

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516564)

Date: December 17th, 2025 12:06 PM
Author: vibrant pale pervert church

how tf do you get a bunch of gpus for free

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516570)

Date: December 17th, 2025 12:10 PM
Author: know-it-all dull stage

Large company upgrading and throwing them away and I just took them. All are pretty powerful on their own.

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516583)

Date: December 17th, 2025 11:57 AM
Author: know-it-all dull stage

i literally just found out about Big-Tiger-Gemma and its by far the best unchained/uncensored LLM ive found yet

holy shit this thing just helped me remove all shadows in an online game and its completely undetectable

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516548)

Date: December 17th, 2025 1:08 PM
Author: spruce roast beef

how large is the database? how does this work? facebook scraped data legally and illegally, transformed them into matrices, and now they're available to anyone?

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516746)

Date: December 17th, 2025 1:13 PM
Author: know-it-all dull stage

here are the normie instructions for low iq mos

https://lmstudio.ai/

+

https://huggingface.co/TheDrummer/Big-Tiger-Gemma-27B-v1

also if you cant answer these questions on your own you probably dont even have a strong enough setup to begin with
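
If the GUI route is too normie even for that, roughly the same thing works from a script: grab a GGUF quant of the model off Hugging Face and load it with llama-cpp-python. A rough sketch; the repo id and quant filename below are placeholders, so check the actual GGUF listing for the model and pick a quant that fits your VRAM:

from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Rough sketch: download one GGUF quant and run it locally with llama-cpp-python.
# repo_id and filename are placeholders, not verified paths; browse the model's
# GGUF repo on Hugging Face and substitute a quant that actually fits your card.
model_path = hf_hub_download(
    repo_id="TheDrummer/Big-Tiger-Gemma-27B-v1-GGUF",  # placeholder repo id
    filename="Big-Tiger-Gemma-27B-v1-Q4_K_M.gguf",     # placeholder quant filename
)

llm = Llama(
    model_path=model_path,
    n_ctx=8192,        # context window; longer contexts cost more memory
    n_gpu_layers=-1,   # offload every layer to the GPU if it fits, otherwise lower this
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "hello"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])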

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516771)

Date: December 17th, 2025 1:20 PM
Author: spruce roast beef

i poast on a laptop. i'm just curious.

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516803)

Date: December 17th, 2025 1:29 PM
Author: vibrant pale pervert church



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516824)

Date: December 17th, 2025 1:38 PM
Author: Light Histrionic Hissy Fit Public Bath

LM Studio pushes glitchy, unoptimized models. Ollama curates so you only get optimized models. You can connect to an Ollama server with OpenWebUI or AnythingLLM and do whatever
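
The front end really is optional here: a default Ollama install listens on port 11434 and answers plain REST calls, which is roughly all OpenWebUI and AnythingLLM are doing underneath. A minimal sketch, assuming the server is up and the model tag has already been pulled:

import requests

# Minimal sketch: hit a local Ollama server directly over its REST API.
# Assumes Ollama is running on its default port 11434 and that the model tag
# below has already been pulled (swap in any tag you actually have locally).
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "gemma2:27b",   # any locally pulled model tag
        "prompt": "Explain in two sentences why long context eats VRAM.",
        "stream": False,         # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])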

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516871)

Date: December 17th, 2025 1:39 PM
Author: know-it-all dull stage

thanks NSAM, if you read the first line you'd see

"here are the normie instructions for low iq mos"

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516880)

Date: December 17th, 2025 1:41 PM
Author: know-it-all dull stage

also they all pull them from huggingface so no clue wtf idiocy ur talking about

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516888)

Date: December 17th, 2025 9:40 AM
Author: naked ruddy den

thought this would be NSAM

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516175)

Date: December 17th, 2025 9:49 AM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516200)

Date: December 17th, 2025 9:58 AM
Author: sick spectacular hominid



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516223)

Date: December 17th, 2025 9:58 AM
Author: Provocative Legend Business Firm



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516226)

Date: December 17th, 2025 1:30 PM
Author: lilac duck-like range

this whole thread sounds like nsam talking to himself.

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516830)

Date: December 17th, 2025 1:11 PM
Author: Ruby harsh heaven



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516759)

Date: December 17th, 2025 1:11 PM
Author: Floppy Bat Shit Crazy Center



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516762)

Date: December 17th, 2025 1:12 PM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516764)

Date: December 17th, 2025 1:12 PM
Author: vibrant pale pervert church

i agree this is some really concerning shit ricky i'm not even playing around

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516765)

Date: December 17th, 2025 1:16 PM
Author: Ruby harsh heaven



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516788)

Date: December 17th, 2025 1:17 PM
Author: lime codepig mediation



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516792)

Date: December 17th, 2025 1:25 PM
Author: vibrant pale pervert church



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516814)

Date: December 17th, 2025 1:29 PM
Author: sick spectacular hominid



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516825)

Date: December 17th, 2025 1:34 PM
Author: Cordovan mildly autistic site dog poop

I was lucky I built my 5070 Ti, 48gb RAM build recently but that is pretty weak for LLMs afaik

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516849)

Date: December 17th, 2025 1:35 PM
Author: Ruby harsh heaven



(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516851)

Date: December 17th, 2025 7:49 PM
Author: Light Histrionic Hissy Fit Public Bath

You need 24gb VRAM to do inference. 16gb isn't enough. However, unless you are coding there's little reason to go above 24gb. The reason coding can use more VRAM is that going through each iteration of the code generates long context windows. If you run out of context window the AI will forget what it was doing earlier. This is also why you can't run 15gb models on 16gb of VRAM. The context window spills into system RAM and slows everything down

48gb VRAM lets you do more with image and video generation, but will not give you measurable gains in inference. You can put bigger models on the system but they probably won't give you better results.
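
The arithmetic behind that, roughly: quantized weights take about params times bytes-per-weight, and the KV cache grows linearly with context length, which is why long coding sessions are what push a card past its VRAM and into system RAM. A back-of-the-envelope sketch; the layer and head counts are illustrative assumptions, not any particular model's specs:

# Back-of-the-envelope VRAM estimate: quantized weights plus KV cache.
# Every number here is an illustrative assumption, not a real model's spec sheet.
def estimate_vram_gb(params_b, bits_per_weight, n_layers, n_kv_heads, head_dim,
                     context_len, kv_bytes=2):
    """Rough total in GiB: weight storage plus an fp16 KV cache at full context."""
    weights = params_b * 1e9 * bits_per_weight / 8
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * context_len * kv_bytes  # keys + values
    return (weights + kv_cache) / 1024**3

# A hypothetical ~27B model at 4-bit with 46 layers, 16 KV heads, head_dim 128:
print(round(estimate_vram_gb(27, 4, 46, 16, 128, context_len=8192), 1))    # short chat
print(round(estimate_vram_gb(27, 4, 46, 16, 128, context_len=65536), 1))   # long coding session

In this toy example the short-context run fits on a 24gb card with room to spare while the long-context run does not, which is the spillover effect described above.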

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49517900)

Date: December 17th, 2025 1:36 PM
Author: Floppy Bat Shit Crazy Center

I waited on my RTX pro 6000 and now I'm pissed

(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516855)