(no title)
BoredomIsFun | 7 days ago
That would be suboptimal, as Gemini's knowledge cutoff is too old. I am long past the need for such advice anyway, as I've been using local models since mid-2024.
seanmcdirmid | 7 days ago
It's only a very low-level form of model access where search isn't used. Local models also need to be configured to use search, and I haven't had a use case for that yet.
Gemini seems to call this "Grounding with Google Search". If you have Gemini installed in your enterprise, it will also search internal data sources for context.
BoredomIsFun | 7 days ago
If it decides to do so, and even then its baked-in knowledge would influence the result.
In any case, I do not need Gemini or any other LLM to figure out settings for my llama.cpp, thank you very much.