drharris | 12 years ago
1) We inherited this monolithic spaghetti mess of a legacy system with a class hierarchy that does not lend itself to testing without a major rewrite of the codebase.
2) Online tutorials expertly teach you how to test methods like add(x, y) and things associated with the 5-minute blog tutorial they also have, but fail miserably at teaching you how to test code that actually might exist in the real world.
genericsteele | 12 years ago
1) Inheriting someone else's bad code and habits is a huge reason to throw testing out the window. It's really frustrating and always comes with a "We'll write tests in the future."
2) This goes along with another thing I've been finding. It's super easy to show why you should test, but it's much harder to actually show how to test in the real world. These tutorials show only the simplest way to write a test, and that hurts those trying to learn.
drharris | 12 years ago
Specifically, my software deals with hardware devices. Do I simulate those devices in code (and if so, do I need tests to test my device simulator)? Or do I somehow gather many MB of data and keep it stored somewhere for testing? I suspect these are simple questions for a testing veteran, but nobody I work with is one. And getting permission to spend time learning is not easy in a bad economy. :)
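One common answer to the simulation question above is to put the real device behind a thin interface and substitute a hand-rolled fake in tests. A minimal sketch in Python (all names here are hypothetical, not from the poster's codebase); the fake is deliberately kept trivial enough that it doesn't need its own test suite:

```python
import unittest


class TemperatureSensor:
    """Hypothetical interface wrapping the real hardware driver."""

    def read_celsius(self) -> float:
        raise NotImplementedError


class FakeTemperatureSensor(TemperatureSensor):
    """Test double that replays canned readings instead of touching hardware."""

    def __init__(self, readings):
        self._readings = iter(readings)

    def read_celsius(self) -> float:
        # Each call returns the next canned reading.
        return next(self._readings)


def average_temperature(sensor: TemperatureSensor, samples: int) -> float:
    """Production code under test; it only ever sees the interface."""
    return sum(sensor.read_celsius() for _ in range(samples)) / samples


class AverageTemperatureTest(unittest.TestCase):
    def test_average_of_three_samples(self):
        fake = FakeTemperatureSensor([20.0, 22.0, 24.0])
        self.assertAlmostEqual(average_temperature(fake, 3), 22.0)


if __name__ == "__main__":
    unittest.main()
```

The recorded-data approach the poster mentions fits the same seam: a fake that replays captured device traffic instead of hard-coded values, so the choice between simulation and stored data is a detail behind one interface.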