item 40444005


ultrasaurus | 1 year ago

+100 to this. I don't think many people reading this thread realize how easy the llamafile project has made it to run an LLM locally. It's a great way to kick the tires on multiple models (just be careful to clean up afterwards; the gigabytes add up).

> wget https://huggingface.co/jartine/TinyLlama-1.1B-Chat-v1.0-GGUF...

> chmod +x TinyLlama-1.1B-Chat-v1.0.Q5_K_M.llamafile

> ./TinyLlama-1.1B-Chat-v1.0.Q5_K_M.llamafile -ngl 999
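On the cleanup point: each llamafile is a single multi-gigabyte executable, so a quick sketch like this (the path and depth are assumptions about where you downloaded them) makes leftovers easy to spot and delete:

```shell
# List any .llamafile binaries under the current directory with their
# sizes, so forgotten multi-gigabyte downloads stand out.
find . -maxdepth 2 -name '*.llamafile' -exec du -h {} +
```

Pipe it to `sort -h` if you've collected enough models that ordering by size helps.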

https://euri.ca/blog/2024-llm-self-hosting-is-easy-now/
