
Self-hosted AI data workflow: DB and Ollama and SQL

11 points | exasol_nerd | 1 month ago | exasol.github.io

3 comments


exasol_nerd | 1 month ago

I wrote a tutorial on invoking the Mistral 7B model directly from SQL using Python UDFs in Exasol and Ollama. It demonstrates a fully self-hosted AI pipeline where data never leaves your infrastructure: no API fees, no vendor lock-in. Takes ~15 minutes to set up with Docker.
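For anyone wondering what the UDF side looks like: a minimal sketch, assuming Ollama is running on its default port (11434) with the `mistral` model pulled. The `run(ctx)` shape follows Exasol's Python UDF convention, but the input column name `prompt` and the surrounding script definition are illustrative, not copied from the tutorial.

```python
import json
import urllib.request

# Ollama's default local HTTP endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False makes Ollama return a single JSON object instead of
    a stream of chunks, which is simpler to handle inside a UDF.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

def ask_llm(prompt, model="mistral"):
    """POST the prompt to the local Ollama server and return the text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the generated text
        # in the "response" field.
        return json.loads(resp.read())["response"]

def run(ctx):
    # In an Exasol scalar UDF, ctx exposes the input columns and
    # emit()/return produces the output row. "prompt" is an assumed
    # column name for this sketch.
    return ask_llm(ctx.prompt)
```

From SQL you would then call it like any scalar function, e.g. `SELECT ask_llm(review_text) FROM reviews;`, with the exact `CREATE ... SCRIPT` wrapper depending on your Exasol version.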

pploug | 1 month ago

Purely curious, but why did you go with Ollama instead of the built-in LLM runner in Docker, since you're already using Docker?