top | item 30945198


tmstieff | 3 years ago

The article actually argues the opposite. Developers should move their focus to integration / "real-world" tests. The major summary bullet point being:

"Aim for the highest level of integration while maintaining reasonable speed and cost"

My experience mirrors the author's. In any "real" business application, the unit tests end up mocking so many dependencies that changes become a chore, in many cases causing colleagues to skip certain obvious refactors because the thought of updating 300 unit tests is out of the question. I've found much better success testing at the integration level. And to be clear, this means writing tests inside the same project that run against a real database. They should run as part of your build, both locally and in CI. The holy grail is probably writing all your business logic inside pure functions, and then unit testing those, while integration testing the outer layers for happy and error paths. But good luck trying to get your coworkers to think in pure functions.
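A minimal sketch of that "pure core" idea, in Python. All names here (`Order`, `calculate_discount`, the discount rules) are hypothetical, invented purely for illustration:

```python
# Pure business logic: no database, no framework, no mocks needed.
from dataclasses import dataclass

@dataclass
class Order:
    subtotal: float
    is_loyalty_member: bool

def calculate_discount(order: Order) -> float:
    """A pure function: output depends only on its input."""
    if order.subtotal >= 100 and order.is_loyalty_member:
        return order.subtotal * 0.10
    if order.subtotal >= 100:
        return order.subtotal * 0.05
    return 0.0

# Unit tests stay trivial because there is nothing to mock.
assert calculate_discount(Order(120.0, True)) == 12.0
assert calculate_discount(Order(120.0, False)) == 6.0
assert calculate_discount(Order(50.0, True)) == 0.0
```

The outer layer (HTTP handler, repository, etc.) that feeds `Order` values into this function would then be covered by the integration tests that hit the real database.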


lytefm|3 years ago

> The holy grail is probably writing all your business logic inside pure functions, and then unit testing those, while integration testing the outer layers for happy and error paths. But good luck trying to get your coworkers to think in pure functions.

I've come to a similar conclusion. Functions don't necessarily have to be pure in the academic sense, though - but I feel like the more the business logic is decoupled from dependency injection, and the less it relies on some framework, the better.

It makes testing a lot easier, and code reuse too. I've just been writing a one-off migration script where I could simply plug in parts of the core business logic. It would have been very annoying if that had relied on Angular, NestJS or whatever.
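To sketch what this kind of reuse looks like (the function `normalize_email` and the migration data are hypothetical, not from the original comment):

```python
# Core business rule kept free of any framework or DI container,
# so it can be imported by the web app and by one-off scripts alike.
def normalize_email(raw: str) -> str:
    """Canonicalize an email address: trim whitespace, lowercase."""
    return raw.strip().lower()

# A migration script can just loop over legacy rows and apply the
# exact same rule the application uses, with no framework bootstrap.
legacy_rows = ["  Alice@Example.COM ", "BOB@example.com"]
migrated = [normalize_email(r) for r in legacy_rows]
assert migrated == ["alice@example.com", "bob@example.com"]
```

Had the rule lived inside an Angular service or a NestJS provider, the script would have had to spin up that framework just to call it.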

zebraflask|3 years ago

I've had the same experience. Suboptimal code isn't refactored because of the test code overhead, or, much worse, the tests on that same subpar code somehow morph into a perceived "gold standard" for how that code should work.

I avoid tests (aside from hands-on end user testing) as much as possible, actually, since they rarely seem to tell you anything you didn't already know.

rectang|3 years ago

> in many cases causing colleagues to skip certain obvious refactors because the thought of updating 300 unit tests is out of the question.

Good! They shouldn't do the refactor.

Because "obvious" refactors often introduce bugs (e.g. copy/paste errors), and if developers can't be bothered to write tests to catch them, they're going to screw over the other team members and users who will be forced to deal with their bugs in production.

> The holy grail is probably writing all your business logic inside pure functions, and then unit testing those, while integration testing the outer layers for happy and error paths.

So settle for half a loaf.

Write all the easy unit tests first. The coverage will be very incomplete, but something is better than nothing.

Write all the easy integration tests next.

Never write the hard tests if you can help it.

mb7733|3 years ago

> Good! They shouldn't do the refactor.

> Because "obvious" refactors often introduce bugs (e.g. copy/paste errors), and if developers can't be bothered to write tests to catch them, they're going to screw over the other team members and users who will be forced to deal with their bugs in production.

In my opinion, useful tests should be able to survive a refactor. That is the only sane way I've ever done refactoring.

If I'm doing a large refactor on a project and there are no tests, or if the tests will not pass after the refactor, the first thing I do is write tests at a level that will pass both before and after refactoring.
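One way to sketch such a refactor-proof test: pin it to observable behavior (input in, output out) rather than to internals. Everything here (`slugify`, its implementation) is hypothetical:

```python
import re

def slugify(title: str) -> str:
    # "Before" implementation: regex-based. A refactor could swap this
    # for a character-by-character version, and the test below would
    # still pass, because it never inspects how the result is computed.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def test_slugify_behavior():
    # Asserts only what callers rely on, not implementation details.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces  ") == "spaces"

test_slugify_behavior()
```

A test that instead asserted "the regex `[^a-z0-9]+` is used" would be exactly the kind that has to be rewritten during the refactor, and so protects against nothing.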

Rewriting tests during refactoring doesn't protect against regressions, in my experience.

pydry|3 years ago

> Good! They shouldn't do the refactor. Because "obvious" refactors often introduce bugs.

And if your tests aren't catching those bugs and require extra maintenance to go green again, you are doing them wrong.

lucasyvas|3 years ago

I understand what the article is arguing. I agree with it, but think it's idealistic. If swaths of your code are a mess, integration testing is super painful. You can't easily add it until you clean up the mess, so the other forms of testing are more practical more often in my experience. If you get to a point where your code isn't a mess, I'd agree that you should start introducing meaningful integration tests.

I think this is just one of those cases where there is a context-sensitive strategy to testing. It depends completely on the cleanliness of your code and experience working with it.

oxff|3 years ago

Trying to write your code such that it can work with arbitrary data, and arbitrary amounts of it, is a step towards the holy grail, I think.

Then you get to use a fuzzer & Arbitrary for what is basically a coverage-guided property-based test.
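The comment presumably refers to pairing a coverage-guided fuzzer with an Arbitrary-style generator (as in Rust's cargo-fuzz with the `arbitrary` crate). As a minimal, un-guided stand-in for the same idea, here is a hand-rolled property-based test over random inputs; the code under test (a run-length codec) and the round-trip property are invented for illustration:

```python
import itertools
import random

def run_length_encode(data: bytes) -> list:
    """Collapse runs of equal bytes into (byte, count) pairs."""
    return [(byte, len(list(group)))
            for byte, group in itertools.groupby(data)]

def run_length_decode(pairs: list) -> bytes:
    """Expand (byte, count) pairs back into the original bytes."""
    return bytes(b for byte, count in pairs for b in [byte] * count)

# Property: decode(encode(x)) == x for arbitrary inputs.
# A real fuzzer would generate these inputs guided by code coverage;
# here we just sample them from a seeded RNG.
rng = random.Random(0)
for _ in range(200):
    data = bytes(rng.randrange(4) for _ in range(rng.randrange(32)))
    assert run_length_decode(run_length_encode(data)) == data
```

The payoff is the same in either setup: you state one invariant and let generated inputs hunt for the counterexample, instead of hand-picking cases.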

But it's hard to maintain that idea at all times when you are writing.