top | item 44769113 (no title)

om8 | 7 months ago
To have GPU inference, you need a GPU. I have a demo that runs an 8B Llama on any computer with 4 GB of RAM: https://galqiwi.github.io/aqlm-rs/about.html

    adastra22 | 7 months ago
    Any computer with a display has a GPU.

        om8 | 7 months ago
        Sure, but integrated graphics usually lacks VRAM for LLM inference.
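A back-of-envelope calculation shows why an 8B-parameter model can plausibly fit in 4 GB of RAM once aggressively quantized. The 2-bit figure below is an assumption for illustration (AQLM-style quantization typically targets around 2 bits per weight); the demo's actual bit width and runtime overhead are not stated in the thread.

```python
# Rough memory estimate for the weights of an 8B-parameter model at
# various bit widths. Ignores activations, KV cache, and runtime
# overhead, so real usage is somewhat higher. The 2-bit case is an
# assumption based on AQLM's typical target, not a spec of the demo.

def model_weight_bytes(n_params: float, bits_per_weight: float) -> float:
    """Approximate size in bytes of the weight tensors alone."""
    return n_params * bits_per_weight / 8

n_params = 8e9  # "8B llama"

for bits in (16, 4, 2):
    gb = model_weight_bytes(n_params, bits) / 1e9
    print(f"{bits:>2}-bit weights: ~{gb:.1f} GB")
# 16-bit weights: ~16.0 GB
#  4-bit weights: ~4.0 GB
#  2-bit weights: ~2.0 GB
```

At ~2 bits per weight the tensors alone come to ~2 GB, leaving headroom within a 4 GB budget for activations and the KV cache, which is consistent with the claim in the parent comment.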