I wonder what your suggestion would be for someone who wants to build their first ML machine (and actually their first computer as well). I would like to start small, add (graphics) cards, and then maybe upgrade them one by one later. Should I go with high-end cards? The latest? Thanks for any good pointers as well. (Sorry for the off-topic question.)
dagw|3 years ago
As for the rest of the components, don't sweat it too much. Which part matters most depends a lot on exactly what you are doing, but 32 GB of RAM, basically any 'mid range' CPU from the past few years, and an SSD and you're good to go.
One thing worth considering, however, is getting a latest-gen CPU/motherboard so that you can upgrade the CPU in the future. For example, while you can probably find great deals on AMD AM4 CPUs and motherboards today, you won't be able to upgrade to the latest AMD CPU in a couple of years, since AM4 is end of life. An AM5 CPU/motherboard will cost more up front, but you'll have the option to upgrade the CPU down the line.
Also, as a final point, ask yourself whether you really need an ML machine. If you're just 'playing around', consider sites like Colab and others that offer free or cheap GPUs for training ML models. I know lots of people who have done real ML research and published actual ML papers using only Colab's free tier, so you can get quite far on it.
rjh29|3 years ago
For the GPU, it depends on your VRAM needs. The Stable Diffusion stuff seems fine with 8 GB, but Textual Inversion (adding yourself to the Stable Diffusion model, etc.) and text/transformer work needs more. I wanted the latter, so I bought a used 3090, which was relatively cheap (compared to before, anyway!).
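A rough way to sanity-check VRAM needs before buying is to estimate the training footprint from the parameter count. This is a back-of-envelope sketch, not a precise rule: the 4x factor assumes fp32 weights, gradients, and Adam's two moment buffers, and it ignores activation memory, which scales with batch size and often dominates.

```python
def training_vram_gb(n_params: float, bytes_per_param: int = 4) -> float:
    """Rough floor on training VRAM: weights + gradients + 2 Adam moments.

    Ignores activation memory, which scales with batch size and can
    easily dominate -- treat the result as a lower bound.
    """
    tensors_per_param = 4  # weights, gradients, Adam m and v
    return n_params * bytes_per_param * tensors_per_param / 1024**3

# A ~900M-parameter model already needs well over 8 GB before activations:
print(f"{training_vram_gb(900e6):.1f} GB")  # ~13.4 GB
```

This is why fine-tuning (which needs the full weights + grads + optimizer state) wants much more VRAM than inference, where only the weights need to fit.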
useful|3 years ago
Get a 3080 Ti, or a 3060 with 12 GB of VRAM. If you want to experiment with very high-memory models, then pay for Colab Pro.
joshvm|3 years ago
The next question is: what are you training?
The latest cards are almost certainly faster, but you're normally more constrained by VRAM than by absolute speed. The 3090 (or the older Titan series) is unique among consumer cards in that you get almost double the VRAM of the next card down. I've had no problem training models on a 1080 Ti, but I wouldn't want to go smaller than 8 GB. System RAM is cheap in comparison (you can easily spec a 128 GB machine), M.2 SSDs are a must for fast data loading, etc. Most CPUs are quad-core or better now, and more cores plus lots of RAM means you can run more parallel dataloaders.
Otherwise, there are plenty of guides that do a cost analysis of which cards to buy. If money is no object, go for the 3090/Titan series, but maybe you don't need that much VRAM. I saw a sibling comment mention issues with heat: the stock 3090 has terrible VRAM cooling. I water-cooled mine and it works well, but it's not cheap at all.
mengibar10|3 years ago
Actually, I am more into RL than DL, but DRL does use DL.
I don't need a machine at the moment; I'm still in the learning stage. But eventually I will need something, and along the way I want to learn how to build my own machine, so I'm trying to avoid costly mistakes.
It will be NVIDIA cards for sure. It doesn't have to be the latest, but I would like to be able to upgrade the cards without replacing the rest of the machine, or at least not more than 10% of the initial cost. I'm very ignorant about this, which is why I asked how to build a machine starting with cheaper graphics cards that can later be upgraded to the latest and best if I need to. I guess the motherboard is the most important component, but which one? Also, the size of the case. Any articles that discuss these in depth? Not videos, I prefer reading.
Money is not an issue, as long as I don't make the mistake of buying wrong components that don't fit, either physically or in terms of compatibility.
I would like to build a compact machine that is still upgradable, to two graphics cards eventually, for example.
unknown|3 years ago
[deleted]