ikt | 7 months ago
Maybe 128 GB of VRAM becomes the new mid-tier and most LLMs can fit into this nicely and do everything one wants in an LLM.
Given how fast LLMs are progressing, it wouldn't surprise me if we reach this point by 2030.
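As a rough back-of-the-envelope check of the 128 GB claim (the model shapes, bit widths, and context length below are illustrative assumptions, not figures from this thread):

    # Rough memory estimate for running a quantized LLM locally.
    # Every number here is an illustrative assumption.

    def weights_gb(params_billion: float, bits: int) -> float:
        """Memory for the weights alone, in GB."""
        return params_billion * 1e9 * bits / 8 / 1e9

    def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                    ctx_len: int, bytes_per_elem: int = 2) -> float:
        """KV cache: one K and one V tensor per layer, fp16 by default."""
        return 2 * layers * kv_heads * head_dim * ctx_len * bytes_per_elem / 1e9

    # Hypothetical 70B model at 4-bit with Llama-70B-like shapes
    # (80 layers, 8 KV heads, head dim 128) and a 32k context.
    w = weights_gb(70, 4)                  # 35.0 GB
    kv = kv_cache_gb(80, 8, 32_768 and 32_768 or 0, 0) if False else kv_cache_gb(80, 8, 128, 32_768)  # ~10.7 GB
    print(f"weights ~{w:.0f} GB + KV cache ~{kv:.1f} GB")
    # Well under 128 GB; even 8-bit weights (~70 GB) would still fit.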
oxcidized | 7 months ago
I hope I'm wrong though, and we see a large bump soon. Even just 32GB in the mid tier would be huge.
I'm really tempted to try out a Mac Studio with 256+ GB of unified memory (192 GB usable as VRAM), but it is sadly out of my budget at the moment. I know there's a memory-bandwidth penalty compared to a dedicated GPU, but being able to run huge models with huge contexts locally would be quite nice.
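For context on the 192 GB figure: macOS reportedly caps GPU-wirable memory at a fraction of unified memory by default, roughly 3/4 on large configurations, which is where 256 GB -> 192 GB comes from. A minimal sketch of that heuristic; the exact split is an OS policy and an assumption here, not documented Apple behavior:

    # Sketch of the commonly cited default GPU memory cap on Apple
    # Silicon: ~2/3 of unified memory on smaller machines, ~3/4 on
    # larger ones. This ratio is an assumption, not a documented API.

    def default_gpu_limit_gb(unified_gb: float) -> float:
        frac = 2 / 3 if unified_gb <= 36 else 3 / 4
        return unified_gb * frac

    for ram in (64, 128, 256):
        print(f"{ram} GB unified -> ~{default_gpu_limit_gb(ram):.0f} GB GPU-addressable")
    # 256 GB unified -> ~192 GB, matching the parenthetical above.

The cap can reportedly be raised with the iogpu.wired_limit_mb sysctl on recent macOS, but that's worth verifying for your OS version.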