I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI, which can train its models on your data. Running a local model lets you keep what you write private.
I like that.
That's super useful for people who run, say, Ollama behind an nginx reverse proxy that adds authentication.
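For anyone curious what that setup looks like, here is a minimal sketch of an nginx config that proxies to a local Ollama instance and gates it with HTTP basic auth. The hostname and htpasswd path are placeholders; 11434 is Ollama's default listen port.

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.com;          # hypothetical hostname

    # Basic auth in front of Ollama; create the file with:
    #   htpasswd -c /etc/nginx/.htpasswd youruser
    auth_basic "Ollama";
    auth_basic_user_file /etc/nginx/.htpasswd;

    location / {
        proxy_pass http://127.0.0.1:11434;   # Ollama's default port
        proxy_set_header Host $host;
    }
}
```

Clients would then send an `Authorization: Basic …` header (or embed credentials in the URL) when pointing a tool at the proxy instead of at Ollama directly.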