sieve | 4 months ago
Getting them to work and recognize my GPU without passing arcane flags was a problem. With llama-cpp I could at least avoid the pain thanks to its Vulkan support, but PyTorch apparently doesn't have a Vulkan backend. So I decided to roll my own with wgpu-py.