
Transformer Lab Now Works with AMD GPUs

2 points | aliasaria | 9 months ago | old.reddit.com

8 comments


aliasaria | 9 months ago

Getting ROCm working was... an adventure. We documented the entire (painful) journey in a detailed blog post because honestly, nothing went according to plan. If you've ever wrestled with ROCm setup for ML, you'll probably relate to our struggles.

The good news? Everything works smoothly now! We'd love for you to try it out and see what you think.

latchkey | 9 months ago

Reading your post now, half the article feels like it is just installing PyTorch. Next time, just use the pre-built docker containers. It is the recommended way and much easier.

https://rocm.docs.amd.com/projects/install-on-linux/en/lates...
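For context, the container route looks roughly like this — a minimal sketch assuming the `rocm/pytorch` image on Docker Hub; the tag and device flags are illustrative, so check the linked install guide for the currently recommended ones:

```shell
# Sketch: run AMD's pre-built ROCm PyTorch container instead of
# installing ROCm + PyTorch manually on the host.
docker pull rocm/pytorch:latest

# --device=/dev/kfd exposes the ROCm kernel fusion driver,
# --device=/dev/dri exposes the GPU render nodes, and
# --group-add video grants the container permission to use them.
docker run -it \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  --ipc=host \
  --shm-size 8G \
  rocm/pytorch:latest \
  python3 -c "import torch; print(torch.version.hip, torch.cuda.is_available())"
```

The final `python3 -c` line is just a sanity check that the containerized PyTorch was built against ROCm (`torch.version.hip` is non-None) and can see the GPU.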

Additionally, our MI300x VMs/machines come with ROCm installed and configured already. We also apply all the recommended default BIOS settings.

latchkey | 9 months ago

No need to build your own box, we've got 1xMI300x VMs, for FREE (thanks to AMD), for development exactly like this. Reach out and we can get you set up.

Someone left a comment accusing me of advertising my business, then deleted it. If that’s how it came across, I apologize, but my intention was to offer something genuinely useful, for free, and directly relevant to helping the OP. Those credits weren’t easy to get; it required going all the way up to Lisa. I’m committed to making supercompute accessible to developers. Yes, it’s free the way GitHub is free. But this isn’t a sales pitch.