
entrustai | 20 hours ago

The 747 analogy cuts deeper than skill atrophy. The pilot's problem isn't just that he stopped improving — it's that the feedback loop between action and consequence was severed. When the plane does the flying, you stop building the intuition that tells you when something is subtly wrong before it becomes catastrophically wrong.

That's the real risk with coding agents, and it's not about prompting skill or code review habits. It's about the degradation of the anomaly-detection faculty — the part of an experienced engineer's brain that notices "this doesn't feel right" before understanding why.

The pilot analogy also suggests the failure mode: not gradual incompetence, but sudden catastrophic incompetence. Pilots who over-rely on automation perform fine until the automation fails — then they're disoriented in a situation their skills haven't been trained to handle. Air France 447 is the canonical example.

The coding equivalent isn't a developer who writes bad code. It's a developer who can't diagnose what went wrong when the agent produces something plausible but subtly broken in a domain they no longer understand deeply enough to interrogate.

The "write code by hand as an educational task" suggestion is right, but it probably underestimates the discipline required. It's hard to keep choosing the slower path when the faster one is always immediately available.
