ggerganov|1 year ago
I highly recommend taking a look at the technical details of the server implementation that enables large-context usage with this plugin - I think it is interesting and has some cool ideas [0].
Also, the same plugin is available for VS Code [1].
Let me know if you have any questions about the plugin - happy to explain. Btw, the performance has improved compared to what is seen in the README videos thanks to client-side caching.
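The client-side caching mentioned above could look roughly like the sketch below (hypothetical, not the plugin's actual code): FIM results are memoized by their surrounding context, so re-triggering a completion at the same cursor position can skip a round trip to the server.

```python
from collections import OrderedDict

class CompletionCache:
    """LRU cache of FIM completions keyed by (prefix, suffix) context.
    A simplified illustration of client-side caching; names and sizes
    are made up for the example."""

    def __init__(self, max_entries=256):
        self.entries = OrderedDict()  # (prefix, suffix) -> completion text
        self.max_entries = max_entries

    def get(self, prefix, suffix):
        key = (prefix, suffix)
        if key in self.entries:
            self.entries.move_to_end(key)  # mark as most recently used
            return self.entries[key]
        return None  # cache miss -> caller asks the server

    def put(self, prefix, suffix, completion):
        self.entries[(prefix, suffix)] = completion
        if len(self.entries) > self.max_entries:
            self.entries.popitem(last=False)  # evict least recently used

cache = CompletionCache()
cache.put("def add(a, b):\n    ", "", "return a + b")
```

A second completion request with the identical prefix/suffix then returns instantly from `cache.get`, which is one way repeated triggers at the same spot become cheap.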
halyconWays|1 year ago
[deleted]
bangaladore|1 year ago
Is this because of a max-latency setting, or the internal prompt, or am I doing something wrong? Or is it only really meant to autocomplete lines and not blocks like Copilot does?
Thanks :)
ggerganov|1 year ago
- Generation time exceeded (configurable in the plugin config)
- Number of tokens exceeded (not the case since you increased it)
- Indentation - stops generating if the next line has shorter indent than the first line
- Small probability of the sampled token
Most likely you are hitting the last criterion. It's something that should be improved in some way, but I am not very sure how. Currently, it uses a very basic token-sampling strategy with custom threshold logic to stop generating when the token probability is too low. Likely this logic is too conservative.
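The low-probability stop criterion described above can be sketched like this (a simplified illustration, not llama.vim's actual code; the threshold value and function names are made up):

```python
import math

def should_stop(token_logprob, threshold=math.log(0.1)):
    """Stop generating once the sampled token's log-probability falls
    below a fixed threshold - i.e. the model is no longer confident
    about the continuation. The 0.1 cutoff is an arbitrary example."""
    return token_logprob < threshold

def generate(sample_fn, max_tokens=64):
    """Greedy generation loop: sample_fn returns (token, logprob).
    The suggestion is truncated at the first low-confidence token."""
    out = []
    for _ in range(max_tokens):
        token, logprob = sample_fn()
        if should_stop(logprob):
            break  # cut the suggestion here instead of emitting noise
        out.append(token)
    return "".join(out)
```

With a fixed cutoff like this, a single uncertain token (say, at the start of a new block) ends the whole suggestion, which is one way such logic can end up too conservative for multi-line completions.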
liuliu|1 year ago
Just curious: how much of your code nowadays is completed by an LLM?
ggerganov|1 year ago
I think a fairly large amount, though I can't give a good number. I have been using GitHub Copilot from the very early days, and with the release of Qwen Coder last year I have fully switched to using local completions. I don't use the chat workflow to code though, only FIM.
attentive|1 year ago
If so, what's ollama missing?
mistercheph|1 year ago
There is also https://github.com/olimorris/codecompanion.nvim which doesn't have text completion but supports a lot of other AI editor workflows that I believe are inspired by Zed, and it supports Ollama out of the box.
nacs|1 year ago
You can use C or Vimscript, but programs like Neovim support Lua as well, which makes it really easy to write plugins.
unknown|1 year ago
[deleted]