nuwandavek | 1 year ago
> Are your LLMs running entirely locally on your own hardware, and if not, how can you say the data is not shared with third parties? (EDIT: you mentioned GPT-4o in another comment so this statement cannot be correct.)
We're currently only using API providers (OpenAI + Anthropic) whose policies state that they do not train on data sent through their APIs. They are technically third parties, but not third parties that harvest data.
I recognize that even this may just be empty talk. We're currently working on two efforts that I think will further help here:
- open-sourcing the entire extension, so users can see exactly what data is used as LLM context (and extend the app further)
- supporting local models, so your data never leaves your computer
(ETA for both is ~1-2 weeks)
We are genuinely motivated by both the excitement and the concerns you may have. We want to offer an assistant-in-the-browser alternative for people who don't want to move to AI-native, data-locked-in platforms. I regret that this was not transparent in our copy.
Thanks for pointing out the error in the FAQs — we somehow missed it. It is fixed now!