item 39841837 (no title)

mattwad | 1 year ago
I run a home media server and can't wait to be able to add my own LLM service. It's just a matter of time before it's something I can install over a weekend with the proper hardware.

    proaralyst | 1 year ago
    Have you tried https://ollama.com/ ? You may find you already can.

    gryfft | 1 year ago
    git clone https://github.com/ggerganov/llama.cpp
    cd llama.cpp
    make
    ./server -m models/7B/ggml-model.gguf -c 2048

    I don't think it'll take you the whole weekend :)
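Once the `./server` binary from the comment above is running, it serves an HTTP completion API on localhost. A minimal client sketch, assuming the default port 8080 and the `/completion` endpoint of llama.cpp's server example (the prompt text and `n_predict` value here are placeholders):

```python
import json
import urllib.request

def build_completion_request(prompt, n_predict=128):
    # JSON body shape accepted by llama.cpp's /completion endpoint:
    # the prompt to continue and how many tokens to generate.
    return {"prompt": prompt, "n_predict": n_predict}

def complete(prompt, host="http://127.0.0.1:8080", n_predict=128):
    # POST the prompt to a locally running ./server and return
    # the generated text from the JSON response.
    body = json.dumps(build_completion_request(prompt, n_predict)).encode()
    req = urllib.request.Request(
        host + "/completion",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]

if __name__ == "__main__":
    # Requires ./server to be running with a loaded model.
    print(complete("Running an LLM on a home media server is"))
```

This keeps the whole stack self-hosted: the model never leaves the machine, and any service on the home network can call the same endpoint.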