> if Apple is providing raw eye tracking streams to app developers
Apple is not doing that. As the article describes, the issue is that your avatar (during a FaceTime call, for example) accurately reproduces your eye movements.
makeitdouble|1 year ago
Isn't that a distinction without a difference? Apple isn't providing your real eye movements, but a 1-to-1 reproduction of what it tracks as your eye movements.
The exploit requires analysing the avatar's eyes, but since those are replicated rather than natural movements, there should be a lot less noise. And since you need to intentionally focus on specific UI targets, these movements are even less natural and fuzzy than if you were looking at your keyboard while typing.
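The gaze-to-key inference being discussed can be sketched crudely as nearest-key classification of fixation points. This is an illustrative toy, not the researchers' actual method; the keyboard layout coordinates below are made up:

```python
# Toy sketch: if you can estimate where an avatar's gaze lands on a
# known virtual keyboard layout, each fixation can be classified to
# its nearest key center.
from math import dist

# Assumed normalized key-center coordinates for a few keys (invented).
KEY_CENTERS = {
    "q": (0.05, 0.1), "w": (0.15, 0.1), "e": (0.25, 0.1),
    "a": (0.07, 0.3), "s": (0.17, 0.3), "d": (0.27, 0.3),
}

def classify_fixation(gaze_point):
    """Return the key whose center is closest to the estimated gaze point."""
    return min(KEY_CENTERS, key=lambda k: dist(KEY_CENTERS[k], gaze_point))

# A sequence of noisy fixation estimates recovered from the avatar's eyes.
fixations = [(0.06, 0.12), (0.16, 0.28), (0.26, 0.11)]
print("".join(classify_fixation(p) for p in fixations))  # → "qse"
```

The point the comment makes is that deliberate, target-directed fixations are much easier to classify this way than natural, noisy eye movements would be.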
The difference is that you can't generalize the attack outside of using Personas, a feature which is specifically supposed to share your gaze with others. Apps on the device still have no access to what you're looking at, and even this attack can only make an educated guess.
This is a great example of why ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications: because this bypasses the security layer while lulling devs into a false sense of security.
simondw|1 year ago
> ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications
I don't think that's an accurate description, either. The SharePlay "Persona" avatar is a system service just like the front-facing camera stream. Any app can opt into using either of them.
simondw|1 year ago
The technology to reproduce eye movements has been around since motion pictures were invented. I'm sure even a flat video stream of the user's face would leak similar information.
Apple should have been more careful about allowing any eye motion information (including simple video) to flow out of a system where eye movements themselves are used for data input.
dialup_sounds|1 year ago
The problem is the edge case where it's used for two different things with different demands at the same time, and the fix is to...not do that.
> Apple fixed the flaw in a Vision Pro software update at the end of July, which stops the sharing of a Persona if someone is using the virtual keyboard.
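The fix described in that quote amounts to gating the avatar stream on keyboard state. A minimal sketch of the idea, under the assumption that the system can simply drop avatar frames while the virtual keyboard is in use (all names here are illustrative, not Apple's actual API):

```python
# Toy model of the mitigation: suppress Persona avatar frames while
# the virtual keyboard is active, so remote viewers never see
# keyboard-directed eye movements. Not Apple's real implementation.
class PersonaStream:
    def __init__(self):
        self.keyboard_active = False  # toggled by the system keyboard

    def next_frame(self, avatar_frame):
        """Forward frames normally; drop them while the user is typing."""
        if self.keyboard_active:
            return None  # stream paused: nothing gaze-related leaks
        return avatar_frame

stream = PersonaStream()
print(stream.next_frame("frame-1"))  # forwarded
stream.keyboard_active = True
print(stream.next_frame("frame-2"))  # suppressed
```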