Yet another reminder why you should not use Ollama (github.com)
HN item 46895014 | 6 points | dcreater | 25 days ago | 5 comments

dcreater | 25 days ago
Georgi's relevant comment: https://github.com/ggml-org/llama.cpp/pull/19324#issuecommen...

    kermatt | 24 days ago
    Can someone add some context as to what that diff is showing?

dcreater | 25 days ago
...and use the original llama.cpp directly. It's infinitely easier to set up and use now.

    tmtvl | 25 days ago
    Setting up Ollama is 2 steps:
    1. yay -S ollama
    2. systemctl enable --now ollama
    How is llama.cpp infinitely easier to set up?
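For comparison, the llama.cpp route the thread alludes to is roughly the following sketch. The build commands come from the llama.cpp repository's standard CMake workflow; the model path and port are illustrative placeholders, not details from the thread:

```shell
# Build llama.cpp from source (requires git, cmake, and a C++ toolchain).
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Serve a local GGUF model over llama.cpp's built-in HTTP server.
# The model path is a placeholder for a GGUF file you have downloaded.
./build/bin/llama-server -m ./models/your-model.gguf --port 8080
```

Unlike the two-step Ollama install above, this path requires a compile step and manual model management, which is the trade-off the comparison hinges on.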