item 47183253

ctoth | 2 days ago

I'm sorry, what?

> Previously, you were able to have _some_ reasonable expectation of security in that trained engineers were the ones building these things

When was this? What world? Did I skip worldlines? Is this a new Universe?

The world I remember is that anybody could write a program and put it on the Internet. Is this not the world you remember?

Further, when those engineers were "trained" ... were there no data breaches before 2022?

carlgreene | 2 days ago

Of course there were. Don't be pedantic. Anybody could write a program and put it on the internet. But to get a reasonably polished version with decent features and an enjoyable enough UX for someone to sign up and even pay money for, it generally took people who kind of knew what they were doing.

Of course shortcuts were taken. They always were and always will be. But don't try to compare shipping software today to even just 3 years ago.

kimixa | 2 days ago

Yes - AI has completely destroyed the set of "signals" people used to judge the quality of much software. They were never 100% accurate, sure, but they were often pretty good heuristics for "level of care" - what the devs considered important (or didn't consider important) - and the like.

And I mean that as both "end user" software signals, and "library" signals for other devs.

I assume that set of signals will slowly be updated. Whether one of them ends up being "any use of AI at all" is still an open question, depending as much as anything on whether the promised hype actually ends up matching capability.