
wronglebowski | 5 months ago

The live demo of this is brutal. https://x.com/ns123abc/status/1968469616545452055

llmthrow0827|5 months ago

All the VR/AR/XR demos are so insanely trivial and yet still manage to be much more difficult than current methods of doing things. Like, really, cooking?

Normal method:

* Search for a recipe

* Leave my phone on a stand and glance at it if I forget a step

Meta glasses:

* Put glasses on (there's a reason I got LASEK: wearing glasses sucks)

* Talk into the void, trying to figure out how to describe my problem as well as the format that I want the LLM to structure the response

* Correct it when it misreads one of my ingredients

* Hope that the rng gods give me a decent recipe

Or basically any of the things shown off for Apple's headset. Strap on a giant headset just so I can... browse photos? or take a video call where the other person can't even see my face?

hdjrudni|5 months ago

I dunno, if these worked perfectly I don't think it'd be awful to be able to open my fridge and say "what can I make with this" and have it rattle off some suggestions based on my known preferences, and even show me images on their new display.

Hands-free while cooking (not having to touch my phone with messy hands) is not a bad thing either.

jackbrookes|5 months ago

This reads a bit like a pre-PC take: "Why use a computer when a cookbook works fine?"

Imagine it’s 1992:

Cookbook: Open book, follow steps.

PC: Turn on tower, wait for DOS, fiddle with floppies, pray the printer works, hope the shareware recipe isn’t weird.

Not saying you're wrong, but it's easy to miss the big picture.

twalichiewicz|5 months ago

Watching the announcement, every feature felt like something my phone already does—better.

With glasses, you have to aim your head at whatever you want the AI to see. With a phone, you just point the camera while your hands stay free. Even in Meta’s demo, the presenter had to look back down at the counter because the AI couldn’t see the ingredients.

It feels like the same dead end we saw with Rabbit and the Humane pin—clever hardware that solves nothing the phone doesn’t already do. Maybe there’s a niche if you already wear glasses every day, but beyond that it’s hard to see the case.

hombre_fatal|5 months ago

On the other hand, having to constantly consult a recipe on my phone while I cook is the main quality of life aspect of home cooking that could be improved.

You're missing the part where my phone autolocks, so I have to go into the settings in the middle of cooking to make it never autolock (or be lazy and unlock it every time I need it). And then I have to find a clean knuckle to scroll the ingredient list and the recipe steps every time I'm trying to remember which step I'm at.

You could do some killer recipe UX with a HUD on some glasses.

SchemaLoad|5 months ago

These companies are reaching really hard for use cases while ignoring the only ones VR actually works well for. If they just went all in on gaming it would be a much better product than trying to push AI slop cooking help.

jayd16|5 months ago

Voice input is just too annoying, but with the display and wristband I think the dream is there. Your hands are deep in messy food prep, you have a recipe up, and you can still pause your music or take a call with the wristband, without stopping to wash up or getting oil or batter on everything.

dyauspitr|5 months ago

I wear my glasses all the time. If I could just talk to the void and get help with things I’m directly seeing reliably that would be a game changer. I’ve used Gemini’s video mode and we’re not all that far away.

rhetocj23|5 months ago

People don't realise how amazingly efficient touch interfaces already are.

There is no need for these stupid glasses. Some refuse to accept it - especially Zuckerberg, who relies on folks like Apple to make his money. That's really what's driving this project if you tear away all the BS.

zmmmmm|5 months ago

If you watch it carefully, he preempts the AI with "What do I do first" before it even answered the first time. That strongly suggests to me it did the same thing in rehearsal, and hence that this was far more than just "bad luck" or bad connectivity. Perhaps the bad connectivity stopped the override from working and it just kept repeating the previous response. Either way, it suggests some troubling early implications about how well Meta's AI work is going, that they got stuck on something this simple in the main live demo for their flagship product.

daemonologist|5 months ago

I think preempting the AI the first time was meant to be a feature (it's not trivial to implement and is something people often ask for). Failing from there definitely wasn't great, although it's kind of what I'd expect from an(y) LLM.

exitb|5 months ago

The way he clung to "what do I do first" makes me think that the whole conversation was scripted in the prompt, and the AI was asked to reply in a specific way to specific sentences. Possibly not even actually connected to the camera?

mrandish|5 months ago

> Either way it suggests some troubling early implications about how well Meta's AI work is going

I fully expect the AI to suck initially and then over many months of updates evolve to mostly annoying and only occasionally mildly useful.

However, the live stage demo failing isn't necessarily supporting evidence. Live stage demos involving Wi-Fi are just hard because in addition to the normal device functionality they're demoing, they need to simultaneously compress and transmit a screen share of the final output back over Wi-Fi so the audience can see it. And they have to do all that in a highly challenging RF environment that's basically impossible to simulate in advance. Frankly, I'd be okay with them using a special headset that has a hard-wired data link for the stage demo.

explorigin|5 months ago

I've done live demos of AI. Even with the same queries, I got different answers than in my 4 previous practice attempts. My demos keep me on my toes, and I try to limit the scope much more now.

(I didn't have control over temperature settings.)

hdjrudni|5 months ago

> (I didn't have control over temperature settings.)

That's... interesting. You'd think they'd dial the temperature down to 0 for you before the demo, at least. Regardless, if the tech is good, I'd hope all the answers are at least decent and you could roll with it. If not... then maybe it needs to stay in R&D.
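Roughly, temperature-0 decoding collapses to argmax over the model's logits, which is why it's repeatable across runs. A toy sketch of that (not any vendor's actual decoding code; `sample_token` and the logits are made up for illustration):

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw logits; temperature 0 means greedy (argmax)."""
    if temperature == 0:
        # Greedy: always return the highest-scoring token, no randomness.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Otherwise: softmax with temperature scaling, then sample.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.9, 0.5]
# At temperature 0 the same input yields the same token every time.
greedy = [sample_token(logits, 0) for _ in range(100)]
assert all(t == 0 for t in greedy)
```

At higher temperatures the scaled distribution flattens and sampling can land on the runner-up token, which is exactly the run-to-run variance that bites live demos.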

TIPSIO|5 months ago

If you’ve ever used the current Meta Ray Ban and AI, this almost exactly happens when the connection is bad. Pure confusion but the AI still tries to give you an answer.

I bet the device hardware is small/cheap and susceptible to interference

stavros|5 months ago

I have the Meta glasses and I've never noticed this, and don't even understand why it could be the connection's fault. The AI gets your audio and your image, if it gives the wrong answer, it's because the AI went wrong. How would the bad connection ever affect it?

m3kw9|5 months ago

Next time they need one public and one private router, and shut the public one off right before the demo.

krustyburger|5 months ago

Even if it’s small/cheap, if the item is scanned multiple times this will prevent any electrical infetterence.

303uru|5 months ago

It’s the WiFi, ya sure.

klabb3|5 months ago

Yeah I was also cringing at that cop out. It doesn’t appear connectivity related. Plus even if it was, it beautifully highlights the connectivity requirement which sucks for so many reasons.

losvedir|5 months ago

Ouch. Kudos for trying, though. I miss the days of live demos at Apple events, instead of all these polished videos of people standing in silly poses around the Apple campus.

HaZeust|5 months ago

I have mad respect for them for actually attempting this on the fly - especially for a public company. Nothing really to gain versus a scripted demo, and absolutely everything to lose. Admirable.

xandrius|5 months ago

Obviously scripted, just the LLM didn't follow its part of the script.

anal_reactor|5 months ago

Hearing this AI-generated voice awakens some primal aggression in me.

klik99|5 months ago

This is why Jobs spent months prepping for each presentation.

But hey, at least it's not all faked

gretch|5 months ago

When I was at Meta (then Facebook), people lived and died by the live-demo credo.

Pitches can be spun, data is cherry picked. But the proof is always in the pudding.

This is embarrassing for sure, but from the ashes of this failure we find the resolve to make the next version better.

SoftTalker|5 months ago

I saw Jobs give a demo of some NeXT technology and the system crashed and rebooted right in the middle of it. He just said “oops” and talked around it until the system came back up.

postalcoder|5 months ago

i love jobs but i do remember the “everybody please turn off your laptops” presentation.

live demonstrations are tough - i wish apple would go back to them.

neilv|5 months ago

"At least it's not faked" was my main reaction, too. Some other big-tech AI-related demos the last couple years have been caught being faked.

Zuckerberg handling it reasonably well was nice.

(Though the tone at the end of "we'll go check out what he made later" sounded dismissive. The blame-free post-mortem will include each of the personnel involved in the failure, in a series of one-on-one MMA sparring rounds. "I'm up there, launching a milestone in a trillion-dollar strategic push, and you left me @#$*&^ my @#*$&^@#( like a #@&#^@! I'll show you post-mortem!")

garbawarb|5 months ago

I appreciate the live demo, but I'm surprised they didn't at least have a prerecorded backup. I wanted to see how video calls work!

jjfoooo4|5 months ago

It was painful even before it started malfunctioning

baby|5 months ago

The demo gods were not present that day

anonu|5 months ago

It was the WiFi though

herval|5 months ago

Typical Meta product. I used to believe, and wasted money on multiple generations of Quest & Ray-Bans. I expect this device to be unsupported at launch, just like the Quest Pro was.

hattmall|5 months ago

The portal was like their best product and they just abandoned it.

m3kw9|5 months ago

So when I talk, but not to it, it may respond, like when I accidentally say "Siri"? Except it's every time?

joshdavham|5 months ago

For those who didn't pick up on it, they were being sarcastic about the issue being wifi related haha

bigtones|5 months ago

That was not sarcasm. They were being serious.

dxxmxnd|5 months ago

I’m surprised everyone is saying they weren’t sarcastic. They were even being MORE sarcastic about it being the Wi-Fi after the failed WhatsApp call.

stavros|5 months ago

It didn't sound like sarcasm at all to me?