top | item 46355340


Lapel2742 | 2 months ago

> Only issue I have found with llama.cpp is trying to get it working with my AMD GPU.

I had no problems with ROCm 6.x but couldn't get it to run with ROCm 7.x. I switched to Vulkan, and the performance seems OK for my use cases.
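For anyone hitting the same ROCm issue, a minimal sketch of switching llama.cpp to its Vulkan backend (flag names follow llama.cpp's CMake options; assumes CMake and the Vulkan SDK/drivers are installed — model path and layer count are illustrative):

```shell
# Build llama.cpp with the Vulkan backend instead of ROCm/HIP.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Then offload layers to the GPU as usual, e.g.:
# ./build/bin/llama-cli -m path/to/model.gguf -ngl 99
```

The Vulkan backend works through the standard graphics driver stack, so it tends to sidestep ROCm version mismatches, usually at some cost in peak throughput versus a working HIP build.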
