So this article is telling us that the iPhone app for ChatGPT is communicating with the ChatGPT APIs and therefore has the same privacy problems as the website or a third party app?
Someone1234 | 2 years ago:
Feels kind of like someone was forced to write an article about this and had nothing really to say that hasn't been said dozens of times in other articles about ChatGPT already.
PS - The biggest thing ChatGPT is currently missing is the ability to read out answers.
tailspin2019 | 2 years ago:
> So this article is telling us that the iPhone app for ChatGPT is communicating with the ChatGPT APIs and therefore has the same privacy problems as the website or a third party app?
I would frame it like this:
It's not telling us, but it's telling our less tech-savvy friends and family to be cautious.
Not everyone will be as clear yet on how this all works and what it actually means to type text into ChatGPT, compared to the understanding that many of us here have. I think it's a point worth repeating at this early stage.
prophesi | 2 years ago:
And I'd also argue that the official ChatGPT app is still better than users searching for ChatGPT on an app store and downloading an unofficial app with an even sketchier privacy policy / ads / malware.
skilled | 2 years ago:
TechRadar should be auto-ghosted on HN. It's a trash publication run by incompetent editors who don't bother acknowledging criticism. According to TR, the best web development tools in 2023 are Sketch and InVision. Neither is related to development, and Sketch is a macOS-only app on top of that.
These are the same idiots you can thank for Google search results being as dogshit as they are. The said tools article ranks on the first page of Google.
AndrewKemendo | 2 years ago:
1. I agree with your PS. If it had voice-to-text and text-to-voice it would legit be Jarvis.
2. There's a lot of discussion about data staying "on your device" lately, and non-technical people may not understand the nuances. So a lot of people may not realize that data is going off-device, and they might benefit from the article.
tagyro | 2 years ago:
ChatGPT is free, but users pay with their data.
Using the OpenAI API, on the other hand, costs money, but users' data (the prompts they send and the replies they receive) is not used for further model training.
Users can enter their own API key (this implies they have an OpenAI account with a payment method connected) and pay OpenAI directly, based on their usage. In this case the app has no tracking, monitoring, data gathering, etc., and all prompts are sent directly to the OpenAI API.
Users that don't have an OpenAI account (or can't create one) can alternatively buy in-app tokens; to cover the "Apple tax", those are a bit more expensive than the OpenAI tokens.
In this scenario, the prompt is sent to a (very minimal) proxy that appends my API key to the request.
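A minimal sketch of such a key-appending proxy, assuming the standard OpenAI chat completions endpoint (the function name, model name, and structure here are illustrative, not the app's actual code):

```python
# Sketch of a "minimal proxy": the client sends only the prompt; the proxy
# attaches its own API key before forwarding the request to OpenAI, so the
# key is never shipped to clients.
import json
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_forwarded_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Wrap a bare prompt in an authenticated chat-completion request."""
    body = json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        OPENAI_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # key lives only on the proxy
        },
    )

# The proxy would then send the request on, e.g.:
#   with urllib.request.urlopen(build_forwarded_request(prompt, MY_KEY)) as r:
#       reply = json.load(r)
```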
thund | 2 years ago:
I wonder if Apple will have anything to show with on-device LLMs at WWDC in a couple of weeks, or if that's a feature for another year?
wlesieutre | 2 years ago:
Definitely feels like something a lot of apps will want to take advantage of, and there's some space saving, and approachability for small developers, in building one in as a system service rather than everybody bringing their own.
valine | 2 years ago:
Useful LLMs are probably still a tad too large to run locally on device. Loading and unloading the model takes several seconds, so ideally the model would just sit in RAM at all times.
In my experience playing with local LLaMA variants, ChatGPT-level responses are really only possible with 13B-parameter models and up. With 4-bit quantization that's 6.5GB of RAM usage just for the model weights. You could totally run that on a phone today, but you'd be getting 2 tokens/s and using the majority of the phone's RAM.
A responsive, local LLM on a phone is going to happen someday soon, but we need some serious hardware improvements first. I'd speculate you'll need at least 24GB of shared memory, and considerably more dedicated tensor multiplication cores.
Alternatively, some sort of GPT architecture change that reduces the memory overhead could also drive local LLMs. Who knows, maybe Apple has a trick or two up their sleeves.
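The 6.5GB figure is just parameter count times bits per weight. A quick back-of-the-envelope helper (weights only; activations and KV cache add more on top):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """RAM needed just to hold the model weights, in GB (10^9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(weight_memory_gb(13, 4))   # 13B params at 4-bit -> 6.5 GB
print(weight_memory_gb(13, 16))  # same model unquantized (fp16) -> 26.0 GB
```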
tailspin2019 | 2 years ago:
I think this is a sensible kind of warning for the average consumer (i.e. not necessarily the tech-savvy audience here) who may not realise the full implications of this app and how their data could be used.
For many, this app may be their first exposure to ChatGPT, and the fact that it looks like any other chat app on your phone may lead some to incorrectly assume some degree of privacy over what they share.
The article is also right to point out that OpenAI's anonymisation can only go so far if users unwittingly (or deliberately) enter personal details into the chat messages directly.
antijava | 2 years ago:
What about the privacy issue that you can’t use it without giving them your phone number? I downloaded the app hoping I could finally use it since the app supports Sign in with Apple ID, but it still wants my phone number.
meatjuice | 2 years ago:
It’s fairly understandable that they collect user data, and I don’t really care about that.
Using free services always comes with some personal info extraction.
tagyro | 2 years ago:
Additionally, when using ChatGPT, one can select to disable chat history, which prevents the data from being used for model training - https://help.openai.com/en/articles/7730893-data-controls-fa...
OpenAI does add some (imo, unnecessary) barriers to this, but that's another story.
Disclaimer: I developed AKME - https://apps.apple.com/app/akme-ai-knowledge/id6446436196 - specifically for users who want to use the OpenAI API but would rather not have their data used to further train models.
dghlsakjg | 2 years ago:
Although, it would be pretty cool to run Siri locally. And also if she didn’t suck.
dghlsakjg | 2 years ago:
It isn’t exclusive to the app, and, frankly, it’s kind of nice that they warn you up front instead of burying it in the terms.