All the VR/AR/XR demos are so insanely trivial and yet still manage to be much more difficult than current methods of doing things. Like, really, cooking?
Normal method:
* Search for a recipe
* Leave my phone on a stand and glance at it if I forget a step
Meta glasses:
* Put glasses on (there's a reason I got LASEK: wearing glasses sucks)
* Talk into the void, trying to figure out how to describe my problem as well as the format I want the LLM to use for its response
* Correct it when it misreads one of my ingredients
* Hope that the rng gods give me a decent recipe
Or basically any of the things shown off for Apple's headset. Strap on a giant headset just so I can... browse photos? or take a video call where the other person can't even see my face?
I dunno, if these worked perfectly I don't think it'd be awful to be able to open my fridge and say "what can I make with this" and have it rattle off some suggestions based on my known preferences and even show me images in their new display.
Hands-free while cooking (not having to touch my phone with messy hands) is not a bad thing either.
Watching the announcement, every feature felt like something my phone already does—better.
With glasses, you have to aim your head at whatever you want the AI to see. With a phone, you just point the camera while your hands stay free. Even in Meta’s demo, the presenter had to look back down at the counter because the AI couldn’t see the ingredients.
It feels like the same dead end we saw with Rabbit and the Humane pin—clever hardware that solves nothing the phone doesn’t already do. Maybe there’s a niche if you already wear glasses every day, but beyond that it’s hard to see the case.
On the other hand, having to constantly consult a recipe on my phone while I cook is the main quality of life aspect of home cooking that could be improved.
You're missing the part where I'm reminded that my phone autolocks so I have to go into the settings in the middle of cooking to make it never autolock (or be lazy and unlock it every time I need it). And then I have to find a clean knuckle to scroll the ingredient list and the recipe steps every time I'm trying to remember what step I'm at.
You could do some killer recipe UX with a HUD on some glasses.
These companies are reaching really hard for use cases while ignoring the only ones VR actually works well for. If they just went all in on gaming it would be a much better product than trying to push AI slop cooking help.
Voice input is just too annoying but with the display and wristband I think the dream is there. Your hands are deep in messy food prep, you have a recipe up, you can still pause your music or take a call with the wristband and without stopping to wash up or getting oil or batter on everything.
I wear my glasses all the time. If I could just talk to the void and get help with things I’m directly seeing reliably that would be a game changer. I’ve used Gemini’s video mode and we’re not all that far away.
People don't realise how amazingly efficient touch interfaces already are.
There is no need for these stupid glasses. Some refuse to accept it - especially Zuckerberg, who relies on folks like Apple to make his money. That's really what's driving this project if you tear away all the BS.
If you watch it carefully, he preempts the AI with "What do I do first" before it even answered the first time. That strongly suggests to me that it did this in rehearsal, and hence that it was far more than just "bad luck" or bad connectivity. Perhaps the bad connectivity stopped the override from working and it just kept repeating the previous response. Either way, it suggests some troubling early implications to me about how well Meta's AI work is going, given that they got stuck on something this simple in the main live demo for their flagship product.
I think preempting the AI the first time was meant to be a feature (it's not trivial to implement and is something people often ask for). Failing from there definitely wasn't great, although it's kind of what I'd expect from an(y) LLM.
The way he clung to "what do I do first" makes me think that the whole conversation was scripted in the prompt and the AI was asked to reply in specific ways to specific sentences. Possibly not even actually connected to the camera?
> Either way it suggests some troubling early implications about how well Meta's AI work is going
I fully expect the AI to suck initially and then over many months of updates evolve to mostly annoying and only occasionally mildly useful.
However, the live stage demo failing isn't necessarily supporting evidence. Live stage demos involving Wi-Fi are just hard because, in addition to the normal device functionality they're demoing, they need to simultaneously compress and transmit a screen share of the final output back over Wi-Fi so the audience can see it. And they have to do all that in a highly challenging RF environment that's basically impossible to simulate in advance. Frankly, I'd be okay with them using a special headset that has a hard-wired data link for the stage demo.
I've done live demos of AI. Even with the same queries, I got different answers than in my 4 previous practice attempts. My demos keep me on my toes, and I try to limit the scope much more now.
(I didn't have control over temperature settings.)
> (I didn't have control over temperature settings.)
That's... interesting. You'd think they'd dial the temperature to 0 for you before the demo, at least. Regardless, if the tech is good, I'd hope all the answers are at least decent and you could roll with it. If not... then maybe it needs to stay in R&D.
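For anyone unfamiliar with why temperature matters here: it rescales the model's output distribution before a token is sampled, and as it approaches 0 nearly all probability mass collapses onto the single highest-scoring token, which is why low-temperature runs are far more repeatable in a demo. A minimal sketch of the math (the logit values are made up for illustration, not anything from Meta's actual stack):

```python
import math

def temperature_softmax(logits, temperature):
    """Softmax over logits divided by temperature (temperature > 0)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate tokens
logits = [2.0, 1.0, 0.5]

# At temperature 1.0 the distribution keeps real spread,
# so repeated sampling can pick different tokens.
print(temperature_softmax(logits, 1.0))

# Near 0, almost all mass lands on the top token, making
# repeated runs effectively deterministic.
print(temperature_softmax(logits, 0.01))
```

This is why "dial the temperature to 0" is the standard advice for live demos: the model still isn't guaranteed to be correct, but at least it stops being a dice roll.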
If you’ve ever used the current Meta Ray Ban and AI, this almost exactly happens when the connection is bad. Pure confusion but the AI still tries to give you an answer.
I bet the device hardware is small/cheap and susceptible to interference
I have the Meta glasses and I've never noticed this, and don't even understand why it could be the connection's fault. The AI gets your audio and your image, if it gives the wrong answer, it's because the AI went wrong. How would the bad connection ever affect it?
Yeah I was also cringing at that cop out. It doesn’t appear connectivity related. Plus even if it was, it beautifully highlights the connectivity requirement which sucks for so many reasons.
Ouch. Kudos for trying, though. I miss the days of live demos at Apple events, instead of all these polished videos of people standing in silly poses around the Apple campus.
I have mad respect for them for actually attempting this on the fly - especially as a public company. Nothing really to gain versus a scripted demo, and absolutely everything to lose. Admirable.
I saw Jobs give a demo of some NeXT technology and the system crashed and rebooted right in the middle of it. He just said “oops” and talked around it until the system came back up.
"At least it's not faked" was my main reaction, too. Some other big-tech AI-related demos the last couple years have been caught being faked.
Zuckerberg handling it reasonably well was nice.
(Though the tone at the end of "we'll go check out what he made later" sounded dismissive. The blame-free post-mortem will include each of the personnel involved in the failure, in a series of one-on-one MMA sparring rounds. "I'm up there, launching a milestone in a trillion-dollar strategic push, and you left me @#$*&^ my @#*$&^@#( like a #@&#^@! I'll show you post-mortem!")
Typical Meta product. I used to believe, and wasted money on multiple generations of Quest and Ray-Bans. I expect this device to be unsupported at launch, just like the Quest Pro was.
Imagine it’s 1992:
Cookbook: Open book, follow steps.
PC: Turn on tower, wait for DOS, fiddle with floppies, pray the printer works, hope the shareware recipe isn’t weird.
Not saying you're wrong, but it's easy to miss the big picture.
I think there’s some respect to give because they’re doing it live and unscripted.
But hey, at least it's not all faked
Pitches can be spun, data is cherry picked. But the proof is always in the pudding.
This is embarrassing for sure, but from the ashes of this failure we find the resolve to make the next version better.
Live demonstrations are tough. I wish Apple would go back to them.