igor_akhmetov | 2 years ago
I guess the main reason the plugin is enabled by default is to promote it. AI tools are becoming an important part of the development workflow, with a majority of developers using a chat LLM or Copilot (https://www.jetbrains.com/lp/devecosystem-2023/ai/). So the functionality offered by the plugin might be interesting to a large part of the user base, and a non-bundled plugin would not be discovered by most users. For organizations, it looks like the AI Assistant is disabled by default.
Regarding the direction: LLMs are usually consumed as a service right now, so there's not much choice. At the same time, JetBrains is also working on local smart-completion workflows like https://plugins.jetbrains.com/plugin/14823-full-line-code-co.... Hopefully at some point there'll also be an option to use a local LLM with the AI Assistant.
donmcronald | 2 years ago
> Hopefully at some point there'll also be an option to use a local LLM with the AI Assistant.
I would love something where I could pick the inputs, or at least give it inputs that skew the results. Right now I think the big problem with AI is bad input: most of the publicly available data is junk, and a lot of "answers" are people guessing at stuff they don't understand.
Honestly, I don't even know whether (or how well) LLMs work if you train them on a smaller, more accurate dataset.
mardifoufs | 2 years ago
igor_akhmetov | 2 years ago
Nobody forces anything onto you. You can disable everything with a single checkbox.