top | item 46575592


drooby | 1 month ago

It's been about a decade since many thought full self-driving cars were "just a couple of years away".

The reality is that FSD was/is a "few decades away".

Same for programming. We can take our hands off the steering wheel for longer stretches of time, this is true, but if you have production apps with real users who spend real money, then falling asleep at the wheel is far too risky.

Programmers will become the guardians and sentinels of the codebase, and their programming knowledge and debugging skills will still be necessary when the AI corners itself into thorny situations or is unable to properly test the product.

The profession is changing, no doubt about it. But its obsolescence is probably decades away.


savorypiano|1 month ago

The last 1% in FSD is an asymptotic challenge. The last 1% in a CRUD app codebase just requires a few engineers.

hahahahhaah|1 month ago

The last 1% of FSD needs a teenager with 30 hours driving experience. (using the premise of your second sentence applied to the first)

fnoef|1 month ago

Self-driving cars are a bad example, because we are talking about a heavily regulated industry, with fatal consequences for malpractice, and a tool (the car) that is not easily available to the average person. I'm pretty sure that if the cost of a car were comparable to what a software engineer pays for Claude Code, governments would relax the laws, we as a society would accept a few (tens of) thousands of casualties, and self-driving cars would already be here.

You talk about programmers becoming guardians, but I see two issues with this: (1) you don't need ten guardians, you need one or two who know your codebase; and (2) a "guardian" is someone who started as a junior and grew into a senior; if juniors are no longer needed, in X years there will be no guardians to replace the existing ones.

arter45|1 month ago

Yes, it is an extreme example, but if your applications make your company millions of dollars or euros, then even if you are in a business that is not heavily regulated [1], mistakes or unavailability can cost a lot of money. Even if your company is not that big, mistakes in a crucial application everyone uses can cost time and money, and even expose the company to legal trouble. "Self-driving" coding in these situations is not ideal.

[1] Even if your domain is not one traditionally considered heavily regulated (unlike military, banking, ...), there is a surprising amount of "soft law" and "hard law" in everything from privacy to accounting and much more.

gniv|1 month ago

A lot of the software produced in big corps is mission-critical. Self-driving cars are an extreme example but I think the same principle applies to banking, infrastructure, even things like maps, since they are used by billions.

andrei_says_|1 month ago

A few thoughts:

Someone smart recently wrote that the key question is becoming who is responsible for the functionality, not who wrote the code. Who guarantees correctness and takes responsibility when shtf?

I’d add: especially when codebases are becoming unknowable because of the complexity and speed of code generation.

And the code is generated by entities that have no concept of correctness, or of reality.

Essentially emergent genies that we know are blind to the world but very capable of putting together well-sounding sentences.

al_borland|1 month ago

Software in the transportation industry, medical field, and elsewhere is quite literally life and death.