arclyte | 10 years ago
Agile only makes complete sense when you take it in its historical context as a reaction to other methods, not as a standalone methodology with no history that sprang wholesale from the ground. It's not "this sounds good, let's try this", it's "we tried the other way and it did not work, we need to do something different".
Having something that works (an MVP, for instance) is _more important than_ documentation. That doesn't mean you shouldn't have documentation. But documentation will always lag behind the actual code unless someone is updating it full-time, which I have yet to see happen anywhere.
And even if you have documentation - it is an agreed communication of intent. The problem is, not everyone reads it the same way, communicates their intent well, or notices the inconsistencies and hurdles in what they've written. So when you actually sit down to write the software, you find you can't build it the way it was proposed: the proposal is less efficient or less fault tolerant, it conflicts with other parts of the design, or it's ambiguous in a way that leads the developer to code up something other than what was intended.
In a waterfall process, you wouldn't discover those ambiguities until you present the product to the client - at the end of the cycle, when it's quite possibly already too late. In an iterative development cycle you get constant feedback and can react to change far more easily. That's what agile is really all about.
Change is inevitable - agile attempts to recognize this fact and work as quickly as possible to create a product and immediately and incrementally improve upon it, keeping things moving forward, rather than spending huge amounts of time fussing over details that maybe won't make it into the final product while missing huge discrepancies or 'unknown unknowns' that aren't well understood until something is actually written.
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/2012001...