BoredomIsFun | 7 days ago
If it decides to do so, and even then baked-in knowledge would influence the result.
In any case, I do not need Gemini or any other LLM to figure out settings for my llama.cpp, thank you very much.
seanmcdirmid | 7 days ago
If you are able to figure out the right settings for a model that was released last week, then great for you! But it sounds like you just don't trust LLMs to use current knowledge, and have some misconceptions about how they handle requests that require recent knowledge.