awesomesauce309@midwest.social to Selfhosted@lemmy.world • Using Mac M2 Ultra 192GB to Self-Host LLMs? (English)
19 days ago

For context length, VRAM is what matters: you can't split a single context across separate memory pools, so on a typical consumer GPU you'd be capped at maybe 16 GB. With M-series chips you get a lot more room, since RAM and VRAM come out of the same unified pool, but it's RAM at Apple prices. Even so, you can get a setup with 24 GB+ of usable memory way cheaper than some Nvidia server card.
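To see why long contexts eat memory, here's a back-of-the-envelope sketch of KV-cache size. The formula (2 tensors per layer, K and V, times heads, head dimension, context length, and bytes per element) is the standard one; the model shape used in the example (32 layers, 32 KV heads, head dim 128, roughly Llama-2-7B) is illustrative, not taken from the thread.

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   ctx_len: int, bytes_per_elem: int = 2) -> int:
    """Rough KV-cache size: 2 tensors (K and V) per layer, fp16 by default."""
    return 2 * layers * kv_heads * head_dim * ctx_len * bytes_per_elem

# A Llama-2-7B-shaped model at a 4096-token context needs about 2 GiB
# for the KV cache alone, on top of the ~13 GiB of fp16 weights --
# and the cache grows linearly with context length.
print(kv_cache_bytes(32, 32, 128, 4096) / 2**30)  # → 2.0
```

Double the context and the cache doubles too, which is why a 16 GB card runs out of headroom fast while a unified-memory Mac keeps going.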