top | item 43188349

EarlyOom | 1 year ago

You can! It works with Ollama: https://github.com/vlm-run/vlmrun-hub

At the end of the day, it's just schemas. You can decide for yourself if it's worth upgrading to a larger, more expensive model.
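A minimal sketch of the point above: structured extraction boils down to a JSON schema that any capable model, hosted or local, can be asked to fill. The `invoice_schema` fields, the model name, and the prompt below are illustrative assumptions, not details from this thread.

```python
# Illustrative sketch: "it's just schemas" -- the same JSON schema works
# regardless of which model fills it in.
import json

# Hypothetical extraction target (not from the thread): an invoice record.
invoice_schema = {
    "type": "object",
    "properties": {
        "vendor": {"type": "string"},
        "total": {"type": "number"},
    },
    "required": ["vendor", "total"],
}

print(json.dumps(invoice_schema, indent=2))

# With a local model via the `ollama` Python client (assumed installed,
# server running), the schema is passed as the `format` argument to
# constrain the model's output to valid JSON matching it:
#
#   import ollama
#   resp = ollama.chat(
#       model="llama3.2",  # assumed model name
#       messages=[{"role": "user", "content": "Extract: ACME owes $12.50"}],
#       format=invoice_schema,
#   )
#   data = json.loads(resp.message.content)
```

Swapping in a larger model changes only the `model` argument; the schema and validation step stay the same, which is why the cost/quality trade-off is the only real decision.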
