
Garbage In, Garbage Out: The Degradation of Human Requirements in the LLM Era

5 points | waylake | 1 day ago

The LLM Paradox: We’re Forgetting How to Speak to Humans

The longer we use LLM services, the more I see a specific kind of "psychosis" spreading in the workplace. LLMs are so good at hallucinating a coherent answer from a vague prompt that people have started to believe their vague prompts were actually coherent.

LLMs Are Not Humans

It sounds obvious, but we are losing our grip on this fact. People are beginning to treat their colleagues like a black-box LLM. They’ve forgotten that human communication requires precision, shared context, and accountability. In the pre-LLM era, "make it pop" was a phrase reserved for clueless clients. Now, it’s becoming standard operating procedure inside engineering teams.

The "Do It Well, You Figure It Out" Fallacy I see managers—even those with engineering backgrounds—who are terrified of being held accountable for their own bad ideas. They hide behind vagueness. They use tools like Claude Code as a shield to bypass technical debt discussions.

When an engineer spends days implementing a half-baked requirement while managing technical constraints, the feedback isn't "Thank you for the due diligence." Instead, it’s: "See? It was possible after all. Why did you push back so hard? An LLM could've done it in seconds." This is gaslighting. They want the output of a senior engineer while providing the input of a garbage prompt.

The Death of Articulation

LLMs accept "garbage in" and provide "plausible out." This has become a drug. People are losing the ability to articulate their own thoughts. They throw a mess of words at you and expect a miracle. If this continues, we aren't just looking at bad software; we’re looking at a breakdown of professional sanity.

I’ve felt the symptoms myself. Lately, I’ve caught myself thinking, "Explaining this to my team is a waste of 'communication cost.' I’d rather just pay for more API tokens and do it myself."

But we must remember: A high-functioning team is not a collection of prompt engineers. True teamwork is exponentially more efficient than a lone developer with an LLM. We cannot afford to lose the art of talking to each other.

5 comments


Nathanf22 | 7 hours ago

There's a downstream effect of this that I haven't seen discussed: the same degradation is happening to architecture. When requirements become vague, the systems built from them become opaque. AI-generated code is syntactically correct but architecturally invisible — nobody draws the system diagram because "the AI handled it." The vagueness in the requirement propagates into the structure of the codebase itself. Six months later, you have a system that works but that nobody can explain. Writing the next requirement becomes even harder because you've lost the mental model of what you're building on top of. Garbage in, opaque architecture out. The articulation problem you describe isn't just about communication between humans. It's eating the layer below: our ability to reason about the systems we're building.

fullstick | 14 hours ago

I'm not sure that's related to LLMs. Corporate speak has been a thing forever. How do you say a lot without really saying anything? You use cross-horizontal collaboration to capitalize on vertical integration capabilities. This allows teams to synergize fully without having to loop in unnecessary resources.

FrankWilhoit | 1 day ago

None of this is new. Developers have always been given ambiguous requirements, and questions about them have always been furiously rejected. Then they guess what was meant, or should have been meant, and that is what is deployed.

didgetmaster | 1 day ago

We have dealt with this in the political realm for decades. Vague word salads are passed off as coherent policy arguments everywhere we turn.

After spewing some vague policy position, a politician's worst fear is that a reporter might ask "What do you mean by that?" or "How would that work in real life?"

K-Tai | 16 hours ago

Fire, so we aren't completely fucked by AI. No, actually heartwarming news tbh.