top | item 46947321

mrbigbob | 20 days ago

Reminds me of when Google introduced radar for their Pixel phones, called Project Soli: https://research.google/blog/soli-radar-based-perception-and.... I have a feeling these will be about as successful. It's a solution in search of a problem, in my eyes.

Someone1234 | 20 days ago

I still own a Google device with that tech in it (Home Display), and, yeah, it isn't useful. They just hide certain UI elements until your hand gets close, which is obnoxious and feels like they invented something and then invented a usage for it to justify it.

UI should be consistent, because consistency lets users build muscle memory; this "hide stuff until you're 20 cm away" behavior is the antithesis of that (and of all good design in general).

jpalomaki | 20 days ago

Would be quite handy for gesture control. When wearing thick gloves, you need to take them off to operate the current AirPods.

yesfitz | 20 days ago

This was a solved problem in the 1st and 2nd generations of AirPods with tap controls[1]. I'm still surprised that they removed that feature in favor of pressure, although now that I'm reflecting more on it, I wonder if it's part of Apple using their manufacturing and engineering as a moat[2]: tap controls are relatively easy, so once wireless earbuds became commodities, they had to find some other way to differentiate themselves.

That said, as someone who does pottery (messy hands), wears gloves/hats (stuff in the way), and has relatively poor fine motor control, I guess I welcome any solution that doesn't mean getting clay or cold air in my hair/ear.

The battery consumption and latency of the IR cameras will be interesting, though. Sample too often, and you'll eat up your battery. Not often enough, and the UX suffers.
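That tradeoff can be sketched with a back-of-envelope model. This is purely hypothetical (the `poll_cost_uj` figure and the polling architecture are my assumptions, not anything Apple has published): a proximity sensor polled at a fixed interval, where a shorter interval lowers worst-case gesture latency but multiplies energy use.

```python
# Hypothetical model: a proximity sensor polled every `interval_ms` milliseconds.
# `poll_cost_uj` (energy per poll, in microjoules) is an assumed figure for
# illustration only, not a measured value for any real device.

def sensing_tradeoff(interval_ms, poll_cost_uj=50):
    """Return (worst_case_latency_ms, energy_mj_per_hour) for a polling interval.

    Worst-case latency: a gesture that begins just after a poll waits a full
    interval before the next sample can detect it.
    """
    polls_per_hour = 3_600_000 / interval_ms          # ms in an hour / interval
    energy_mj = polls_per_hour * poll_cost_uj / 1000  # microjoules -> millijoules
    return interval_ms, energy_mj

for interval in (10, 50, 200):
    latency, energy = sensing_tradeoff(interval)
    print(f"poll every {interval:>3} ms -> worst-case latency {latency} ms, "
          f"~{energy:.0f} mJ/h")
```

The inverse relationship is the point: cutting the interval 20x (200 ms to 10 ms) cuts worst-case latency 20x but also costs 20x the sensing energy, which is why real designs tend to duty-cycle a cheap wake-up sensor in front of the expensive one.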

1: https://support.apple.com/en-us/102628

2: https://news.ycombinator.com/item?id=45186975