fprotthetarball | 2 years ago
After installing and launching the application, you just run "ollama run <model name>" and it handles everything, dropping you into a chat with the LLM. There are no dependencies for you to manage, just one application.
Check out the README: https://github.com/jmorganca/ollama#readme