top | item 43920740 (no title)

jt_b | 9 months ago

I haven't/wouldn't use it because I have a decent K8s ollama/open-webui setup, but Docker announced this a month ago: https://www.docker.com/blog/introducing-docker-model-runner
nicce | 9 months ago

Hmm, I guess that is not actually running inside a container, so there is no isolation. It seems to be some new approach that mixes llama.cpp, the OCI format, and the Docker CLI.