top | item 46545393

wrl | 1 month ago

> Ideally the alert would only happen if the comment seemed important but it would readily discard short or nonsensical input. That is really difficult to do in traditional software but is something an LLM could do with low effort.

I read this post yesterday and this specific example kept coming back to me because something about it just didn't sit right. And I finally figured it out: Glancing at the alert box (or the browser-provided "do you want to navigate away from this page" modal) and considering the text that I had entered takes... less than 5 seconds.

Sure, 5 seconds here and there adds up over the course of a day, but I really feel like this example is grasping at straws.

FridgeSeal|1 month ago

It’s also trivially solvable with, idk, a length check, or any number of other things which don’t need 100B parameters to calculate.
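A minimal sketch of the kind of heuristic meant here, assuming a plain web form. The function name, the 20-character threshold, and the `#comment` selector are all illustrative, not anything from the thread:

```typescript
// Hypothetical heuristic: warn on navigation only when the draft looks
// substantial. Thresholds are illustrative assumptions.
function seemsWorthKeeping(draft: string): boolean {
  const trimmed = draft.trim();
  // Discard empty input and very short fragments.
  if (trimmed.length < 20) return false;
  // Require a few distinct multi-character words, so mashed keys
  // like "aaaaaaaaaaaaaaaaaaaaaaaa" don't trigger the prompt.
  const words = trimmed.split(/\s+/).filter((w) => w.length > 1);
  return words.length >= 3;
}

// Browser wiring (skipped outside a browser environment):
const g = globalThis as any;
if (g.window && g.document) {
  g.window.addEventListener("beforeunload", (e: any) => {
    const box = g.document.querySelector("#comment");
    if (box && seemsWorthKeeping(box.value)) {
      e.preventDefault(); // modern browsers then show a generic prompt
    }
  });
}
```

A handful of lines, no model inference, and it covers the "short or nonsensical input" case from the quoted post.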

zdragnar|1 month ago

This was a problem at my last job. Boss kept suggesting shoving AI into features, and I kept pointing out we could make the features better with less effort using simple heuristics in a few lines of code, and skip adding AI altogether.

So much of it nowadays is like the blockchain craze, trying to use it as a solution for every problem until it sticks.

9rx|1 month ago

The problem isn't so much the five seconds, it is the muscle memory. You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone. I have been bitten before. Something like the parent described would be a huge improvement.

Granted, it seems the even better UX is to save what the user inputs and let them recover if they lost something important. That would also help for other things, like crashes, which have also burned me in the past. But tradeoffs, as always.
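The save-and-recover approach described above can be sketched as follows. This is a hypothetical shape, with the storage interface abstracted so the same logic works with `localStorage` in a browser; all names are illustrative:

```typescript
// Minimal draft-keeping sketch: persist what the user types so a crash,
// accidental close, or reflexive "Yes" loses nothing.
interface DraftStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

class DraftKeeper {
  constructor(private store: DraftStore, private key: string) {}

  // Call on every input event (ideally debounced).
  save(text: string): void {
    if (text.trim().length > 0) this.store.setItem(this.key, text);
    else this.store.removeItem(this.key);
  }

  // Call on page load to offer the saved draft back to the user.
  recover(): string | null {
    return this.store.getItem(this.key);
  }

  // Call after a successful submit.
  clear(): void {
    this.store.removeItem(this.key);
  }
}
```

In a browser you would pass `window.localStorage` as the store; the tradeoff, as the comment notes, is extra writes and the chore of expiring stale drafts.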

addaon|1 month ago

> You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone.

Wouldn't you just hit undo? Yeah, it's a bit obnoxious that Chrome for example uses cmd-shift-T to undo in this case instead of the application-wide undo stack, but I feel like the focus for improving software resilience to user error should continue to be on increasing the power of the undo stack (like it's been for more than 30 years so far), not trying to optimize what gets put in the undo stack in the first place.
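The "more powerful undo stack" direction can be sketched generically; this is an illustrative toy, not how any particular browser implements it:

```typescript
// A minimal undo/redo stack over arbitrary states.
class UndoStack<T> {
  private past: T[] = [];
  private future: T[] = [];

  constructor(private present: T) {}

  // Record a new state; any redo history is discarded.
  push(state: T): void {
    this.past.push(this.present);
    this.present = state;
    this.future = [];
  }

  undo(): T {
    const prev = this.past.pop();
    if (prev !== undefined) {
      this.future.push(this.present);
      this.present = prev;
    }
    return this.present;
  }

  redo(): T {
    const next = this.future.pop();
    if (next !== undefined) {
      this.past.push(this.present);
      this.present = next;
    }
    return this.present;
  }

  current(): T {
    return this.present;
  }
}
```

The argument above is that "what was in that closed tab's text box" should simply be another state on this stack, reachable the same way any other edit is.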

fckgw|1 month ago

Which is fine! That's me making the explicit choice that yes, I want to close this box and yes, I want to lose this data. I don't need an AI evaluating how important it thinks my input is and second-guessing my judgment call.

I tell the computer what to do, not the other way around.

officeplant|1 month ago

> You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone.

I'm not sure we need even local AIs reading everything we do for what amounts to a skill issue.

pavel_lishin|1 month ago

I have the exact opposite muscle memory.

th0ma5|1 month ago

I think this is covered in the Bainbridge automation paper https://en.wikipedia.org/wiki/Ironies_of_Automation ... when the user doesn't have the practiced context you described, expecting them to suddenly have it and do the right thing in a surprise moment is untenable.

johnnyanmac|1 month ago

A rarer-ish chance to use this XKCD: https://xkcd.com/1205/

I'd put this in the "save 5 seconds daily" cell, and that's being generous. Remember that the chart's figures are total time saved over 5 years.
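The arithmetic behind that cell, using the strip's five-year horizon:

```typescript
// xkcd 1205 framing for this case: 5 seconds saved daily,
// totaled over five years.
const secondsSavedPerDay = 5;
const days = 5 * 365;
const totalHours = (secondsSavedPerDay * days) / 3600;
// Roughly 2.5 hours over five years — the ceiling on how much effort
// the "smart" dialog is worth building at all.
```

About two and a half hours, total, is the budget before the feature costs more than it saves.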