top | item 37033104


MediumOwl | 2 years ago

The website only mentions Nvidia GPUs; what about AMD?


lhl | 2 years ago

On Windows, llama.cpp has OpenCL support (CLBlast) and MLC LLM (https://mlc.ai/mlc-llm/docs/) has Vulkan acceleration.

On Linux, ExLlama and MLC LLM have native ROCm support, and there is a HIPified fork of llama.cpp as well.
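For the llama.cpp route mentioned above, a minimal sketch of what a CLBlast build looked like at the time. The `LLAMA_CLBLAST` flag and the `-ngl` (GPU layer offload) option are taken from the llama.cpp docs of that era; the repo's build flags have changed since, and the model path is a placeholder, so check the current README before relying on this:

```shell
# Hedged sketch: building llama.cpp with the OpenCL (CLBlast) backend,
# which runs on AMD GPUs without CUDA. Requires CLBlast and OpenCL
# headers to be installed first.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DLLAMA_CLBLAST=ON
cmake --build build --config Release

# At run time, -ngl sets how many model layers are offloaded to the GPU
# (model path below is a placeholder):
./build/bin/main -m models/7B/ggml-model.gguf -ngl 32 -p "Hello"
```

The same pattern applies to the ROCm route on Linux: the HIPified fork builds with a HIP flag instead of the CLBlast one, and the run-time options are unchanged.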

zzbzq | 2 years ago

They don't really work for AI. There might be a weird experimental driver for Linux or something; I never got it to work.

MediumOwl | 2 years ago

Why not?