
ajuhasz | 9 days ago

Agreed. While we've tried to think through this and build in protections, we can't pretend there's a magical, perfect solution. We do have strong conviction that doing this inside the walls of your home is much safer than doing it within any company's datacenter (I accept that some people just don't want this to exist, period, and we won't be able to appease them).

Some of our decisions in this direction:

  - Minimize how long we have "raw data" in memory
  - Tune the memory extraction to be very discriminating and err on the side of forgetting (https://juno-labs.com/blogs/building-memory-for-an-always-on-ai-that-listens-to-your-kitchen)
  - Encrypt storage with hardware-protected keys (we're building on top of the Nvidia Jetson SOM)

We're always open to criticism on how to improve our implementation here.
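The first bullet above, minimizing how long raw data sits in memory, can be sketched roughly as follows. This is a hypothetical illustration, not Juno's actual code: the class and method names (`RawAudioBuffer`, `extract_and_forget`) are ours, and the summarizer is a stand-in for whatever memory extraction the device really runs.

```python
import time

class RawAudioBuffer:
    """Hypothetical sketch: hold raw capture data only briefly.

    Raw chunks live in mutable bytearrays so they can be overwritten
    (zeroized) as soon as a summary has been extracted from them.
    """

    def __init__(self, max_age_s=5.0):
        self.max_age_s = max_age_s
        self._chunks = []  # list of (timestamp, bytearray)

    def append(self, chunk: bytes):
        self._chunks.append((time.monotonic(), bytearray(chunk)))

    def extract_and_forget(self, summarize):
        """Run `summarize` over each raw chunk, then zeroize and drop it."""
        summaries = []
        for _, buf in self._chunks:
            summaries.append(summarize(bytes(buf)))
            for i in range(len(buf)):  # overwrite the raw bytes in place
                buf[i] = 0
        self._chunks.clear()
        return summaries

    def purge_expired(self):
        """Zeroize and drop any chunk older than max_age_s, even if unread."""
        now = time.monotonic()
        kept = []
        for ts, buf in self._chunks:
            if now - ts > self.max_age_s:
                for i in range(len(buf)):
                    buf[i] = 0
            else:
                kept.append((ts, buf))
        self._chunks = kept
```

The point of the bytearray (rather than an immutable `bytes`) is that the buffer can be destroyed in place instead of waiting for garbage collection, so the window in which raw data exists is bounded by `max_age_s` plus the extraction time.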


sixtyj | 3 days ago

The device should have shipped with plenty of examples so people really understand how stored data could be misused. Users of Alexa or any similar device are largely technically illiterate. Remember the leaks of movie stars' iPhone photos? Multiply that by thousands. Court orders, burglars, hackers: every bad actor imaginable.

For you, as the producer, those situations can be a nightmare if they're not well described in the operating conditions. And devices should not come pre-configured (don't be "Google-evil": they track everything unless you set it up differently, and the setting is always hidden deep in a third-level menu behind two-step verification).

bossyTeacher | 8 days ago

> - Minimize how long we have "raw data" in memory

I believe you should let people configure how long the raw data is stored, and also offer dead man's switches.
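That suggestion could look something like the sketch below: a store with a user-chosen retention window for raw data, plus a dead man's switch that wipes everything if the owner hasn't checked in (e.g. unlocked the companion app) for too long. All names here (`MemoryStore`, `sweep`, `checkin`) are illustrative assumptions, not an existing API.

```python
import time

class MemoryStore:
    """Hypothetical sketch: user-configurable retention + dead man's switch."""

    def __init__(self, retention_s, deadman_s):
        self.retention_s = retention_s  # user-chosen max age for stored data
        self.deadman_s = deadman_s      # wipe everything if no check-in this long
        self.records = []               # list of (timestamp, data)
        self.last_checkin = time.monotonic()

    def add(self, data, now=None):
        self.records.append((now if now is not None else time.monotonic(), data))

    def checkin(self, now=None):
        """Owner-presence signal that resets the dead man's switch."""
        self.last_checkin = now if now is not None else time.monotonic()

    def sweep(self, now=None):
        """Periodic maintenance: fire the switch or enforce retention."""
        now = now if now is not None else time.monotonic()
        if now - self.last_checkin > self.deadman_s:
            self.records.clear()  # dead man's switch fired: wipe everything
            return
        self.records = [(ts, d) for ts, d in self.records
                        if now - ts <= self.retention_s]
```

The `now` parameters exist only to make the sketch testable; a real device would use the clock directly and run `sweep` on a timer.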