Edward Snowden doing GPU reviews? This timeline is becoming weirder every day.
“Whistleblows” as if he’s some kind of NVIDIA insider.
Intel Insider, now that would’ve made for great whistleblowing headlines.
Legitimately thought this was a hard-drive.net post
I bet he just wants a card to self-host models and not give companies his data, but the amount of VRAM is indeed ridiculous.
Exactly, I’m in the same situation now, and the 8 GB in those cheaper cards doesn’t even let you run a 13B model. I’m trying to research whether I can run a 13B one on a 3060 with 12 GB.
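For a rough sanity check, napkin math on the weights alone suggests it should fit (a sketch; 4-bit quantization is assumed, and KV cache / activation overhead comes on top):

```python
# Rough VRAM needed for the weights of a 13B model at 4-bit quantization
# (a sketch; real usage adds KV cache, activations, and framework overhead)
params = 13e9               # 13 billion parameters
bytes_per_param = 0.5       # 4-bit quant ~ half a byte per parameter
weights_gib = params * bytes_per_param / 1024**3
print(f"{weights_gib:.1f} GiB")  # ~6.1 GiB, so a Q4 13B leaves headroom in 12 GB
```

By the same math, 8 GB cards get tight once the context window grows, which matches your experience with the cheaper cards.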
You can. I’m running a 14B deepseek model on mine. It achieves 28 t/s.
You need a pretty large context window to fit all the reasoning; Ollama forces 2048 tokens by default, and a larger window uses more memory.
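If it helps, the default can be raised with a Modelfile (a sketch; the `deepseek-r1:14b` tag and the 8192 value are assumptions, pick whatever fits your VRAM):

```
# Modelfile: raise the context window above Ollama's 2048-token default
FROM deepseek-r1:14b
PARAMETER num_ctx 8192
```

Build and run it with `ollama create deepseek-14b-longctx -f Modelfile` and then `ollama run deepseek-14b-longctx`.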
I also have a 3060. Can you detail which framework (SGLang, Ollama, etc.) you’re using and how you got that speed? I’m having trouble reaching that level of performance. Thx
Oh nice, that’s faster than I imagined.
Swear next he’s gonna review hentai games
Oh wait… https://www.youtube.com/watch?v=fAf1Syz17JE
“Some hentai games are good” -Edward Snowden
Note that this is from 2003
I’ll keep believing this is a The Onion post