

matula | 9 months ago

Very nice. I tried with Ollama and it works well.

The biggest issue is that the Ollama models are hardcoded to Qwen3 and Llama 3.1. I imagine most Ollama users have their own favorites, and these probably vary quite a bit. My main model is usually Gemma 3 12B, which does support images.

It would be a nice feature to allow custom model entries on the Ollama settings page, save them to Chrome storage, and use them in the 'getAvailableModels' method alongside the hardcoded models.
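A rough sketch of what that merge could look like, in TypeScript. Everything here is an assumption about the extension's internals: the `OllamaModel` shape, `HARDCODED_MODELS`, the `customOllamaModels` storage key, and the `mergeModels` helper are all hypothetical names, not the project's actual API.

```typescript
// Hypothetical model shape; field names are assumptions, not the extension's real types.
interface OllamaModel {
  id: string;             // e.g. "gemma3:12b"
  supportsImages: boolean;
}

// Stand-in for the extension's hardcoded list.
const HARDCODED_MODELS: OllamaModel[] = [
  { id: "qwen3", supportsImages: false },
  { id: "llama3.1", supportsImages: false },
];

// Pure helper: combine defaults with user-defined entries.
// A custom entry with the same id overrides the hardcoded one.
function mergeModels(defaults: OllamaModel[], custom: OllamaModel[]): OllamaModel[] {
  const byId = new Map<string, OllamaModel>();
  for (const m of [...defaults, ...custom]) byId.set(m.id, m);
  return [...byId.values()];
}

// Inside the extension, getAvailableModels could then read the user's list
// from Chrome storage before merging, e.g. (sketch only):
//   const { customOllamaModels = [] } =
//     await chrome.storage.sync.get("customOllamaModels");
//   return mergeModels(HARDCODED_MODELS, customOllamaModels);

const merged = mergeModels(HARDCODED_MODELS, [
  { id: "gemma3:12b", supportsImages: true },
]);
console.log(merged.map((m) => m.id)); // ["qwen3", "llama3.1", "gemma3:12b"]
```

Keeping the merge a pure function makes it easy to test without a browser, and letting custom entries override hardcoded ids means a user can also correct metadata (like image support) for the defaults.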


parsabg | 9 months ago

Great suggestion, will add custom Ollama configurations to the next release