item 36977146


joebiden2 | 2 years ago

Ollama forks llama.cpp. The value-add is marginal. Still, I see no attribution on https://ollama.ai/.

Instead of downvoting, please consider whether this seems fine from your point of view. I have no affiliation at all; I just don't like this kind of marketing.

See also https://news.ycombinator.com/item?id=36806448



joshstrange | 2 years ago

It would be nice to add some attribution, but llama.cpp is MIT-licensed, so what Ollama is doing is perfectly acceptable. Also, Ollama itself is open source (also MIT). You can bet any for-profit outfit using llama.cpp under the hood isn't going to mention it, and while I think we should hold open source projects to a slightly higher standard, this isn't really beyond the pale for me.

While you find the value-add to be "marginal", I wouldn't agree. In the linked comment you say "setting up llama.cpp locally is quite easy and well documented". OK, but it's still nowhere near as fast or easy to set up as Ollama. I know; I've done both.

jokethrowaway | 2 years ago

Running make vs. go build? I don't see much difference.

I personally settled on text-generation-webui.
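
For comparison, the two setup workflows look roughly like this. This is an illustrative sketch based on each project's public instructions around that time; the model filename is a placeholder and details may have changed since:

```shell
# llama.cpp: clone and compile, then source model weights yourself.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
# You still have to obtain a quantized model file separately
# (e.g. download it from Hugging Face) before you can run inference:
./main -m models/llama-2-7b.Q4_K_M.gguf -p "Hello"

# Ollama: one install script; model download and setup are handled for you.
curl -fsSL https://ollama.ai/install.sh | sh
ollama run llama2
```

The compile step itself is similar either way; the difference argued above is mostly in the model-management and packaging around it.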