I think it's meant to be taken more in the abstract. Yes, an LLM can refuse your request, and yes, you can ask it to prepend "have you checked that it already exists?", but it can't directly challenge your long-range assumptions the way another person can at standup: "this unrelated feature already does something similar, so maybe you can modify it to accomplish both the original goal and yours," or "we have a feature coming up that will solve this for you." Without that kind of alignment, you're just churning out duplicate code at a faster rate.