
Elrac | 6 years ago

Almost never.

With the kind of software I mostly write these days, I'm fortunate to be able to incrementally develop my code and test it under real-world conditions or a subset thereof.

So my approach is exploratory coding -- I start with minimum workable implementations, make sure they work as needed, and then add more functionality, with further testing at each step.

The upside is that I don't have to write "strange" code to accommodate testing. The downside is that I'm forced to plan code growth with steps that take me from one testable partial-product to the next. A more serious downside, one I'm very aware of, is that not every project is amenable to this approach.


hackerm0nkey | 6 years ago

> With the kind of software I mostly write these days, I'm fortunate to be able to incrementally develop my code and test it under real-world conditions or a subset thereof.

What kind of software do you write, if you don't mind me asking? And are your "real-world conditions" tests automated?

> The upside is that I don't have to write "strange" code to accommodate testing.

Can you elaborate on what you mean by "strange"?

Elrac | 6 years ago

For the past 2 years, most of my work has been porting some fairly simple legacy message forwarding and conversion programs from C to Java. So on our test servers I can swap out the C programs for drop-in replacements in Java and watch them (via log files) working -- or not. If my programs fail, I can either observe the crashes and stack traces, or the message-receiving programs will crash or loudly object to bad data from me. Usually one day's worth of traffic will exercise enough of my program's logic that failure to fail for a day constitutes a successful end-to-end test.

Yes, this is kid stuff. My current work is about as sophisticated as typical undergrad Computer Science projects. We can't all be doing rocket science!

I used to write automated test setups for my programs, providing streams of pre-canned messages and such. That worked out OK. I suppose it's great to have test suites to avoid regressions and such, but I ended up regretting all the effort I sunk into testing. So far it's been my experience that I would sink a lot of time into creating a test suite that still couldn't exercise my programs as thoroughly as simple exposure to real-world message traffic.
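To make the "streams of pre-canned messages" idea concrete, here's a minimal sketch of that kind of harness. The `convert` method and the `MSG:` wire format are made-up stand-ins for illustration, not the actual programs from the thread:

```java
import java.util.List;

// Hypothetical pre-canned message harness: feed a fixed list of messages
// through the conversion logic and print the results for inspection.
public class CannedMessageHarness {

    // Stand-in converter: strips a "MSG:" header and uppercases the payload.
    static String convert(String raw) {
        if (!raw.startsWith("MSG:")) {
            throw new IllegalArgumentException("bad header: " + raw);
        }
        return raw.substring(4).toUpperCase();
    }

    public static void main(String[] args) {
        // The "pre-canned" part: a fixed stream standing in for real traffic.
        List<String> canned = List.of("MSG:hello", "MSG:world");
        for (String msg : canned) {
            System.out.println(convert(msg));
        }
    }
}
```

The trade-off the author describes is visible even here: the canned list only covers the cases you thought to write down, whereas a day of live traffic covers whatever actually arrives.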

I hope my attempt to be brief didn't come across as derogatory when I wrote "strange." Here's an example: I like to make a lot of my fields and methods private. It's handy that my IDE warns me when fields and methods aren't used, or when final fields aren't initialized. Obviously, for "classic" unit tests I'd have to at least expose my methods at the package level to call them from outside the class. Another example: my apps rely on a fair bit of configuration data and some embarrassingly tight coupling between my classes. A JUnit-friendly program would call for a lot of mocks, as well as a lot more coding to interfaces rather than concrete classes, and probably a lot more reliance on design patterns. My coding style for these projects yields a small number of compact classes but is very hostile to unit testing.
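The visibility trade-off above can be sketched like this. The class and the `checksum` method are invented for illustration; the point is only that a method the author would prefer to keep private gets widened to package-private so test code in the same package can reach it:

```java
// Illustration of the visibility trade-off described above (names are made up).
public class MessageForwarder {

    // Would ideally be private, but is left package-private purely so a
    // test class in the same package can call it without reflection.
    static int checksum(byte[] payload) {
        int sum = 0;
        for (byte b : payload) {
            sum = (sum + (b & 0xFF)) % 256;
        }
        return sum;
    }

    public static void main(String[] args) {
        // A same-package caller (e.g. a JUnit test) can now exercise
        // checksum() directly, at the cost of a wider API surface.
        System.out.println(checksum(new byte[] {1, 2, 3}));
    }
}
```

This is exactly the "strange" code being described: the access modifier no longer documents the design, it documents the test setup.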

To be clear: For many other projects, your mileage may vary dramatically. I've successfully done TDD in other projects where that made a lot more sense.