top | item 46708077

pploug | 1 month ago

Purely curious, but why did you go with Ollama instead of the built-in LLM runner in Docker, since you are already using Docker?

exasol_nerd | 1 month ago

Great idea! I went with Ollama because I found the setup slightly easier, but technically both should offer the same experience, and hosting both in Docker makes a lot of sense. That will be the next iteration of my write-up!
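
Since the reply mentions hosting Ollama in Docker as the next iteration, here is a minimal sketch of what that could look like. The image name, API port, and model directory come from Ollama's official Docker image documentation; the service and volume names are illustrative, and GPU passthrough (which needs extra configuration) is omitted:

```yaml
# docker-compose.yml — minimal sketch, assuming the official ollama/ollama image
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # Ollama's default API port
    volumes:
      - ollama:/root/.ollama # persist downloaded models across restarts

volumes:
  ollama:
```

After `docker compose up -d`, models can be pulled and run inside the container, e.g. `docker compose exec ollama ollama run llama3.2` (the model name here is just an example), and other containers on the same network can reach the API at `http://ollama:11434`.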