item 46716773

jedisct1 | 1 month ago
Really cool. But how do I use it instead of Copilot in VSCode?
flanked-evergl | 1 month ago
Would love to know myself. I recall there was a VSCode plugin that did next-edit suggestions and accepted a custom model, but I can't remember which one it was now.
replete | 1 month ago
Run a server with ollama and use the Continue extension configured for ollama.
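A minimal sketch of that setup, assuming ollama is installed; the model tag `qwen2.5-coder:7b` is just an example, substitute whatever model you want to serve:

```shell
# Pull a local coding model (example tag) and start the ollama server.
ollama pull qwen2.5-coder:7b
ollama serve   # exposes an API on http://localhost:11434 by default
```

In Continue's model configuration you would then point a model entry at the `ollama` provider with the tag you pulled, and the extension talks to the local server instead of a hosted API.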
BoredomIsFun | 1 month ago
I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.
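The llama.cpp route looks roughly like this; a sketch assuming llama.cpp is already built and a GGUF model file is on disk (the model path and port here are placeholders):

```shell
# Serve a local GGUF model; llama-server speaks an OpenAI-compatible API.
./llama-server -m ./models/my-model.gguf --port 8080
# Point your editor extension at http://localhost:8080/v1 as the API base.
```

Because the endpoint is OpenAI-compatible, extensions that let you override the API base URL can use it without an ollama-specific provider.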