item 43713362

thenickdude | 10 months ago

The free tier now supports connecting to local AI models running on LM Studio or Ollama, but it still doesn't actually function without an internet connection.

If you block access to the internet or to their AI API servers [1], it refuses to start a new chat invocation. But if you block access halfway through a conversation, the conversation continues just fine — so there's no technical barrier to it actually running offline; they simply don't allow it.
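For anyone who wants to reproduce this, null-routing the endpoints in /etc/hosts is enough to simulate being offline. The hostname below is a placeholder — the actual server list is in the JetBrains help page at [1]:

```
# /etc/hosts — null-route the AI endpoint to simulate being offline.
# Hostname is a placeholder; see [1] for JetBrains' actual API servers.
0.0.0.0 ai-api.example.jetbrains.com
```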

Their settings page also says they can't guarantee the offline toggle is implemented properly — a flag that should be the easiest thing in the world to enforce:

>Prevents most remote calls, prioritizing local models. Despite these safeguards, rare instances of cloud usage may still occur.

So you can't even block access to the very servers that they say their faulty offline toggle would leak data to.

[1] https://www.jetbrains.com/help/ai-assistant/disable-ai-assis...

eclectric | 10 months ago

I disconnect from the internet sometimes and noticed this morning that my previous night's chat was invisible. I could only see it once I connected again.

This puts me off a bit from finally trying local models. Anyone know what kind of data is collected in those rare instances of cloud usage?