
entrustai | 14 hours ago

The pattern you've identified has a precise mechanism that's rarely named: each abstraction layer doesn't eliminate complexity; it relocates it. COBOL moved complexity from machine instructions to business logic specification. 4GLs moved it from code to data modeling. No-code moves it from programming to workflow configuration. LLMs are moving it from syntax to prompt engineering and output verification.

The relocation is genuinely valuable — each move makes the simple cases dramatically simpler. But the complexity doesn't disappear. It accumulates at the new boundary, which is why each wave creates a new class of specialists rather than eliminating specialists.

What's underappreciated about the current wave is where the complexity is relocating to. With LLMs generating code, the hard problem is no longer writing correct syntax — it's verifying that the generated output is correct, secure, and maintainable. That verification problem is arguably harder than the original coding problem, because you're now auditing code you didn't write, in a codebase shaped by decisions you didn't make, produced by a system whose reasoning you can't inspect.
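To make the verification point concrete, here is a minimal sketch (all names hypothetical, not from any real tool): instead of reading generated code line by line, you state properties the output must satisfy regardless of how the code was produced, and check them mechanically.

```python
# Hypothetical example: auditing a function we didn't write.
# Suppose an LLM handed us this slugify implementation:
def generated_slugify(title):
    return "".join(c if c.isalnum() else "-" for c in title.lower()).strip("-")

# Verification as property checking: we assert what must hold,
# without inspecting the generator's reasoning.
def verify_slugify(fn, samples):
    for s in samples:
        slug = fn(s)
        assert slug == slug.lower(), "slug must be lowercase"
        assert all(c.isalnum() or c == "-" for c in slug), "url-safe chars only"
        assert not slug.startswith("-") and not slug.endswith("-"), "no edge dashes"

verify_slugify(generated_slugify, ["Hello, World!", "  LLMs & Code  ", "already-fine"])
```

Note what the sketch doesn't give you: the properties themselves. Deciding which invariants matter is exactly the relocated complexity — it requires understanding the problem at least as well as if you had written the code yourself.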

The irony is that LLMs may be creating demand for a skill that programming culture has historically undervalued: careful, systematic verification of code you didn't write. That's closer to auditing than engineering. And it turns out auditing is hard.


NBJack | 9 hours ago

It has been a while, but I remember a project of mine trying to port an FTP client to a 'secure compiler' (this was long before Rust, and probably a distant ancestor of Checked C). In theory, if I could successfully port it, it would be much more resilient to particular kinds of issues (and maybe even attacks). This was in the era when formal-proof coding was also trying to take off in the industry.

After wading through an impressive number of compiler errors (again, the code was technically compatible) and attempts to fix them, I eventually surrendered and acknowledged that, at the very least, this was beyond my abilities.

I probably would have gotten much further just rewriting it from scratch.

entrustai | 6 hours ago

Your experience is the theorem, not just an anecdote.

The secure compiler relocated complexity into the porting process — and that relocated complexity turned out to be harder than the original problem. Rewriting from scratch would have been cheaper because you'd be working with complexity you generated and understood, rather than auditing complexity the tool imposed.

This is exactly the dynamic playing out now with LLM-generated code at scale. The syntax is free. The verification is expensive. And the verification is harder precisely because the complexity isn't yours — it arrived from a system whose reasoning you can't inspect and whose decisions you didn't make.

What you ran into in that FTP port is what every engineering team using LLMs for production code will eventually run into. You just got there thirty years early.