Dear community,
I am currently exploring the feasibility of replacing Nvidia GPUs with AMD GPUs for LLM inference. Can anyone share their experience or point me to any relevant research on the performance differences between these GPUs for this task? Are there any particular hardware or software limitations that may affect the feasibility of such a switch? Thank you for your insights!
brudgers|2 years ago
Could that rationale be addressed by increasing revenue instead, as an alternative to taking on the technical risks implicit in your question?
I mean it is one thing if this is a hobby project and using AMD will provide interesting challenges to keep yourself occupied. How you spend your own time and money is entirely up to you.
But spending other people's money is another thing entirely (unless they are mom and dad). And even more so spending other people's time, particularly when it comes to paychecks.
Finally, I am not saying there aren't business cases where AMD makes sense. Just that Nvidia is the default for good reasons and is the simplest thing that might work. Good luck.