Teleoflexuous | 5 months ago
You ask the AI how to do something. The AI generates steps for doing that thing. It has a concept of steps, so that when you say 'back' it goes back to the start of the current step. When you ask how to do something, it finishes explaining the general idea and moves on to the first step. You interrupt it. It assumes it already went through the first step and won't let you go back to it.
The first step here was mixing some sauces. That's it. It's a dumb way to make a tool, but if I wanted to make one that would work for a demo, that's how I'd do it. Have you ever tried any voice tool to guide you through something? Convincing Gemini that something it described didn't happen takes a direct 'X didn't happen' statement, and even that doesn't work perfectly.
It still didn't work, it absolutely wasn't a wi-fi issue, and lmao, technology of the future at a $2T company, but it just doesn't seem rigged.
timmytokyo | 5 months ago
Except, no. He hadn't.
Teleoflexuous | 5 months ago
The system started doing Step 1, believed it was finished, so it moved on to Step 2, and when asked to go back, it kept going back to the start of Step 2.
It also works if 'Step 1' was actually Step 0 and Step 1 combined.
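A minimal sketch of the bug being hypothesized here (all names and structure are my own invention, not Gemini's actual implementation): if the step pointer advances as soon as narration of a step begins, an interruption still leaves the system believing that step is done, and 'back' only replays the step it already thinks it's on.

```python
class GuidedTask:
    """Toy step-tracking state machine with the hypothesized bug."""

    def __init__(self, steps):
        self.steps = steps
        self.current = 0  # index of the step the system believes it is on

    def narrate_next(self):
        step = self.steps[self.current]
        # Bug: the pointer advances when narration *starts*, not when the
        # user confirms the step is done. An interruption mid-step still
        # leaves `current` incremented.
        self.current += 1
        return step

    def go_back(self):
        # "Back" replays the current step from its beginning, which is
        # already one past the step the user was interrupted in.
        return self.steps[self.current]


task = GuidedTask(["Step 1: mix the sauces", "Step 2: add noodles"])
task.narrate_next()    # starts narrating Step 1; pointer now sits on Step 2
print(task.go_back())  # replays "Step 2: add noodles", not Step 1
```

With this structure, no amount of saying 'back' during Step 1 ever returns to Step 1, which matches the behavior described above.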
Again, it's also a weird way to prerecord. If you're prerecording, you're prerecording all the steps and rehearsing with them prerecorded. I can't imagine anyone going through even a single rehearsal with prerecorded audio and not figuring this out; we have the technology.
rsynnott | 5 months ago
Barbing | 5 months ago