top | item 37195543

cschmid | 2 years ago

Can I also interpret this as: 'AMD's pytorch support is so abysmal that inference is 10x slower than it should be'?

croes | 2 years ago

Shouldn't it say PyTorch's AMD support?

dannyw | 2 years ago

It takes two to tango. AMD is always welcome to contribute patches.

You also have to keep in mind that some latest-gen AMD GPUs aren’t even officially supported by ROCm on Linux. That’s absurd.

AMD has the option of investing more staff in ML support; they’re choosing not to.