(no title)
pinusc | 1 year ago
Unless of course you were talking about VRAM, in which case 16GB is still not great for ML (to be fair, the 24GB on an RTX 4090 isn't either, but there isn't much more you can get in consumer hardware). I don't think the other commenter was talking about VRAM, though, because 16GB of VRAM is massive overkill for everyday computing... and pretty decent for most gaming.
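As a rough back-of-the-envelope: the weights alone dominate the VRAM footprint, so you can estimate it from parameter count and precision. A sketch (ignoring KV cache and activations, which add more on top):

```python
# Bytes per parameter at common precisions.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_vram_gb(n_params_billion: float, dtype: str) -> float:
    """Approximate VRAM for the weights alone, in GB (1 GB = 1e9 bytes)."""
    return n_params_billion * BYTES_PER_PARAM[dtype]

# A 13B model in fp16 needs ~26 GB (too big for a 4090's 24 GB),
# while int4 quantization brings it down to ~6.5 GB, well under 16 GB.
for dtype in ("fp16", "int8", "int4"):
    print(f"13B model, {dtype}: ~{weight_vram_gb(13, dtype):.1f} GB")
```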
tosh | 1 year ago
You don’t need a GPU for LLM inference. It might not be as fast as it could be, but it's usable.
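For example, llama.cpp runs inference entirely on the CPU. A minimal sketch using its Python bindings (llama-cpp-python); the model path is just a placeholder for any locally downloaded GGUF file:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="./models/7b.Q4_K_M.gguf",  # hypothetical local GGUF file
    n_gpu_layers=0,  # 0 = offload nothing to a GPU, run pure CPU
    n_ctx=2048,      # context window size
)

# Generates slower than on a GPU, but works on an ordinary laptop.
out = llm("Q: Name three uses of a heatsink. A:", max_tokens=48)
print(out["choices"][0]["text"])
```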
999900000999 | 1 year ago
I don't want a laptop over 3 pounds, and I'm not spending over $1,100, so a dedicated GPU isn't really an option.