There's a downstream effect of this that I haven't seen discussed: the same degradation is happening to architecture.
When requirements become vague, the systems built from them become opaque. AI-generated code is syntactically correct but architecturally invisible — nobody draws the system diagram because "the AI handled it." The vagueness in the requirement propagates into the structure of the codebase itself.
Six months later, you have a system that works but that nobody can explain. Writing the next requirement becomes even harder because you've lost the mental model of what you're building on top of. Garbage in, opaque architecture out.
The articulation problem you describe isn't just about communication between humans. It's eating the layer below: our ability to reason about the systems we're building.