top | item 34735583


kingcai | 3 years ago

Very uncommon for people to do ML locally nowadays. Almost everyone uses either the cloud or a cluster they ssh into. NVIDIA is still king for ML; very few people use AMD or Apple's M1, although hopefully that will change.

If you want to buy a GPU for ML this is a good resource: https://timdettmers.com/2023/01/30/which-gpu-for-deep-learni...

Basically buy the NVIDIA GPU with the largest amount of VRAM that's in your budget.
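That heuristic is easy to sketch in code. A minimal example, with made-up illustrative prices and a hypothetical `pick_gpu` helper (not current market data):

```python
# Hypothetical price/VRAM table -- illustrative numbers only.
gpus = [
    {"name": "RTX 3060", "vram_gb": 12, "price": 329},
    {"name": "RTX 4070", "vram_gb": 12, "price": 599},
    {"name": "RTX 3090", "vram_gb": 24, "price": 1099},
    {"name": "RTX 4090", "vram_gb": 24, "price": 1599},
]

def pick_gpu(budget):
    """Return the affordable GPU with the most VRAM; cheapest wins a VRAM tie."""
    affordable = [g for g in gpus if g["price"] <= budget]
    return max(affordable, key=lambda g: (g["vram_gb"], -g["price"]), default=None)

print(pick_gpu(1200)["name"])  # RTX 3090: most VRAM under $1200
```

Dettmers' guide goes deeper (tensor cores, memory bandwidth, cooling), but VRAM-per-dollar is the first-order filter.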

