top | item 46717847

replete | 1 month ago

Run server with ollama, use Continue extension configured for ollama
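A minimal sketch of that setup, assuming Ollama's default port (11434) and Continue's JSON config format; the model name `llama3` is just an example:

```shell
# Start the Ollama server (listens on localhost:11434 by default)
ollama serve &

# Pull a model for Continue to use (model name is an example)
ollama pull llama3

# Then point a model entry in Continue's config
# (~/.continue/config.json) at the Ollama provider, e.g.:
#   { "models": [ { "title": "Llama 3",
#                   "provider": "ollama",
#                   "model": "llama3" } ] }
```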


BoredomIsFun | 1 month ago

I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.
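For comparison, a minimal sketch of serving a model with llama.cpp's bundled HTTP server (`llama-server` is the binary shipped with llama.cpp; the model path is a placeholder):

```shell
# Start llama.cpp's OpenAI-compatible HTTP server on a local GGUF model
llama-server -m ./models/model.gguf --port 8080

# Continue (or any OpenAI-compatible client) can then point at
# http://localhost:8080/v1
```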

mika6996 | 1 month ago

But you can't just switch between installed models like in ollama, can you?