Lanedo | 1 year ago
I hacked up a slim alternative localpilot.js layer that uses llama-server instead of the Copilot API, so copilot.el can be used with local LLMs, but I find the copilot.el overlays kinda buggy... It'd probably be better to write a llamapilot.el for local LLMs from scratch for Emacs instead.
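The core of such a layer is just translating the editor's completion request (the text before and after point) into a llama-server call. Here's a minimal sketch of that translation in Node, not my actual localpilot.js: it assumes llama-server is running locally on port 8080, uses its /completion endpoint, and formats a CodeLlama-style fill-in-the-middle prompt (the template and stop token are model-dependent).

```javascript
// Sketch only: forwards an editor completion request to a local llama-server.
// Assumes Node 18+ (global fetch) and llama.cpp's llama-server on port 8080.
const LLAMA_SERVER = "http://127.0.0.1:8080";

async function completeAtPoint(prefix, suffix) {
  // CodeLlama-style fill-in-the-middle prompt; adjust to whatever model you serve.
  const prompt = `<PRE> ${prefix} <SUF>${suffix} <MID>`;
  const res = await fetch(`${LLAMA_SERVER}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      n_predict: 128,
      temperature: 0.2,
      stop: ["<EOT>"],        // model-dependent stop token
    }),
  });
  const data = await res.json();
  return data.content;        // llama-server returns the generated text as `content`
}

// Example: ask for the body of a half-written function.
completeAtPoint("function add(a, b) {\n  return ", "\n}\n").then(console.log);
```

The rest of the shim is just handing that string back to copilot.el over the agent protocol it already speaks, which is where the overlay quirks show up.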