zhobbs | 7 months ago

Possible I'm misunderstanding what you're trying to do, but ollama works well for me for local inference with qwen on my MacBook Pro (32GB).

nateb2022 | 7 months ago

Yup, also using Ollama on a MacBook Pro. Ollama is #1.

p0w3n3d | 7 months ago

But isn't ollama only local chat? Or am I missing something? I'd like to set it up as a server for my own use from another laptop (use it as my local AI hub) and would love to integrate it with some IDE using MCP.
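
For anyone landing here with the same question: Ollama isn't chat-only. It runs an HTTP server (port 11434 by default), and if you bind it to the LAN instead of localhost, other machines can call it. A minimal sketch of that setup, where the server IP (192.168.1.50) and model tag are just assumptions for illustration:

    # On the server machine, make Ollama listen on all interfaces
    # instead of just localhost, then start it (shell):
    #   OLLAMA_HOST=0.0.0.0:11434 ollama serve
    #
    # Minimal Python client run from the other laptop; stdlib only.
    import json
    import urllib.request

    OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # hypothetical LAN IP

    payload = {
        "model": "qwen2.5:7b",   # any model already pulled on the server
        "prompt": "Explain MCP in one sentence.",
        "stream": False,         # return one JSON object instead of a stream
    }

    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

Most IDE integrations that speak the Ollama (or OpenAI-compatible) API can then be pointed at that same base URL instead of localhost.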