top | item 33329868

mengibar10 | 3 years ago

I wonder what your suggestion would be for someone who wants to build their first ML machine (and actually their first computer as well). I would like to start small, add (graphics) cards, and then maybe upgrade them later one by one. Should I go with high-end cards? The latest? Thanks for any good pointers as well.

(sorry for the off topic question)


dagw|3 years ago

GPU pricing is kind of all over the place at the moment, but, assuming you are on a budget, I would look for either an RTX 2070 (Super if you can) or a GTX 1080 Ti as great 'starter' cards for ML. Both should be available second hand for a fairly reasonable price. If you want a 'new' card, I would go for the 3070 or 3060. The 3060 is interesting since, while it has fewer and slower cores than the 3070, it has more VRAM, which is important for certain workloads.

As for the rest of the components, don't sweat it too much. Which ones actually matter most depends a lot on exactly what you are doing, but with 32 GB of RAM, basically any 'mid range' CPU from the past few years, and an SSD, you're good to go.
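Once a box like that is built, a quick sanity check of the 'mid range CPU plus 32 GB' baseline can be done from Python with only the standard library (a sketch, assuming a Linux machine; `os.sysconf` is not available on Windows, so the RAM figure degrades gracefully to `None` there):

```python
import os

def machine_baseline():
    """Report logical core count and, on Linux, total physical RAM in GiB,
    to sanity-check the 'mid-range CPU + 32 GB RAM' starting point."""
    cores = os.cpu_count() or 1
    ram_gb = None
    try:
        # sysconf-based RAM query; Linux-specific assumption.
        ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3
    except (AttributeError, ValueError, OSError):
        pass
    return cores, ram_gb

cores, ram_gb = machine_baseline()
print(f"cores: {cores}, RAM: {ram_gb if ram_gb is None else round(ram_gb, 1)} GiB")
```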

One thing worth considering however is getting a latest gen CPU/motherboard so that you can upgrade the CPU in the future. For example, while you will probably find great deals on AMD AM4 CPUs and Motherboards today, you won't be able to upgrade to the latest AMD CPU in a couple of years since AM4 is end of life. Getting an AM5 CPU/motherboard will cost more up front but you will have the option to upgrade the CPU down the line.

Also as a final point, ask yourself do you really need an ML machine? If you're just 'playing around' consider sites like Colab and others that offer free or cheap GPUs for training ML models. I know lots of people who have done real ML research and publish actual ML papers using only the free tier Colab, so you can get quite far on it.

rjh29|3 years ago

You have to pay more for the CPU, mobo, and DDR5 RAM, so it's quite a tax right now. It depends on your workload of course, but if it's mostly GPU-bound (ML, gaming) I think the AM4 5800X3D is a really good choice that is destroying benchmarks and should last 4-5 years.

For GPU it depends on your VRAM needs: the Stable Diffusion stuff seems fine with 8 GB, but Textual Inversion (adding yourself to the Stable Diffusion model, etc.) and text/transformer work needs more. I wanted the latter, so I bought a used 3090, which was relatively cheap (compared to before, anyway!)
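The VRAM sizing question can be roughed out with a back-of-envelope formula: full training keeps weights, gradients, and optimizer state resident at once, which is why transformer work outgrows an 8 GB card so quickly. A minimal sketch (the multipliers and the flat activation allowance are my own ballpark assumptions, not measured numbers):

```python
def training_vram_gb(params_billion, bytes_per_param=4, optimizer_states=2,
                     activation_overhead_gb=2.0):
    """Rough VRAM estimate for full training: weights + gradients +
    optimizer states (Adam keeps ~2 extra fp32 copies per weight),
    plus a flat allowance for activations. Ballpark figures only."""
    params = params_billion * 1e9
    weights = params * bytes_per_param
    grads = params * bytes_per_param
    opt_state = params * bytes_per_param * optimizer_states
    return (weights + grads + opt_state) / 1024**3 + activation_overhead_gb

# A ~1B-parameter model in fp32 with Adam lands well past a 16 GB card,
# while a ~100M-parameter model fits comfortably on an 8 GB one.
print(round(training_vram_gb(1.0), 1))
print(round(training_vram_gb(0.1), 1))
```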

useful|3 years ago

Both my 3090s have heat problems. I would avoid the 3090 and 4090.

Get a 3080 Ti or 3060 with 12 GB of VRAM. If you want to experiment with very high memory models, then pay for Colab Pro.

rjh29|3 years ago

Which manufacturers are your 3090s from? I heard the Founders Edition has memory issues, but they can be fixed by replacing the thermal pads?

joshvm|3 years ago

Is there a reason you can't use Colab? That would be my first suggestion. A Pro license costs about 600 a year.

Next question is what are you training?

The latest cards are almost certainly faster, but you're normally more concerned with VRAM than absolute speed. The 3090 (or the older Titan series) is unique among consumer cards in that you get almost double the VRAM of the next card down. I've had no problem training models on a 1080 Ti 8 GB card, but I wouldn't want to go smaller than that. System RAM is cheap in comparison (you can easily spec a 128 GB machine), M.2 SSDs are a must for fast data loading, etc. Most CPUs are quad core now, and more cores plus lots of RAM means you can run more parallel dataloaders.
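The point about cores and RAM feeding parallel dataloaders can be sketched with the standard library alone (a toy stand-in for something like PyTorch's `num_workers`; `decode_sample` here is a made-up placeholder for reading and decoding one training sample):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def decode_sample(i):
    # Stand-in for reading and decoding one training sample from disk.
    return i * i

# More cores -> more parallel loader workers, as long as each worker's
# decode buffers still fit in system RAM. Leave one core for the trainer.
workers = max(1, (os.cpu_count() or 2) - 1)
with ThreadPoolExecutor(max_workers=workers) as pool:
    batch = list(pool.map(decode_sample, range(8)))
print(batch)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

`pool.map` preserves input order, so the batch comes back in the same order regardless of how many workers ran.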

Otherwise, there are plenty of guides that do a cost analysis of which cards to buy. If money is no object, then the 3090/Titan series, but maybe you don't need that much VRAM. I saw a sibling comment mentions issues with heat: the stock 3090 has terrible VRAM cooling. I water cooled mine and it works well, but it's not cheap at all.

mengibar10|3 years ago

Thanks for the reply. Also, I tend to become very self-conscious when I use a paid service, and it kind of messes with my mind. I think in the long run it will be cheaper to own a machine.

Actually I am more into RL than DL. But DRL uses DL.

I don't need a machine at the moment; I am still in the learning stage. But eventually I will need something, and along the way I want to learn about building my own machine. So I'm trying to avoid costly mistakes.

It will be NVIDIA cards for sure. They don't have to be the latest, but I would like to be able to upgrade the cards without needing to change the rest of the machine, or at least not more than 10% of the initial cost. I am very ignorant about this; that's why I asked how to build a machine starting with cheaper graphics cards that can later be upgraded to the latest and best if I need to. I guess the motherboard is the most important component, but which one? Also, the size of the case. Any articles that discuss these in depth? Not videos, I prefer reading.

Money is not an issue as long as I don't make the mistake of buying the wrong components that don't fit, physically or in terms of compatibility.

I would like to build a compact machine that can still be upgraded, for example to two graphics cards eventually.

WithinReason|3 years ago

You should go for lots of memory, and Nvidia for CUDA.