Super annoying. I have an RX 6600 XT and can't get ROCm to work on Linux.
Vulkan ML, however, worked perfectly out of the box, so at least I got something.
The caveat is that PyTorch has a lot of dependencies, a couple of which are not yet available in Debian Unstable. For folks wanting to use Stable Diffusion, that's a problem. However, the available packages are more than sufficient for llama-cpp, as you point out.
suprjami|1 year ago
https://github.com/superjamie/rocswap
slavik81|1 year ago