Most solutions on their own don't improve things a whole lot, yet in a system of supporting practices they can be very powerful. The primary thing is that you need a system, not just the individual parts. Testing without changing the design of your code is a horrible experience. Applying techniques like dependency inversion/injection isolates behaviour, which makes testing easier. Making code more deterministic makes tests easier. Pushing side-effects out of your core logic makes testing easier. All of those things add up to more than the sum of their parts, which is an indication that you're dealing with a system.
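A minimal sketch of what "pushing side-effects out of your core logic" can look like in practice — all names here are hypothetical, not from the thread:

```python
from dataclasses import dataclass

@dataclass
class Order:
    total: float
    is_first_order: bool

# Pure core: deterministic, no IO -- trivial to test with plain asserts.
def discount_for(order: Order) -> float:
    if order.is_first_order:
        return order.total * 0.10
    return 0.0

# Thin imperative shell: the side-effect (notification) lives at the edge,
# injected as a callable, so the core never needs to be mocked.
def apply_discount(order: Order, notify) -> float:
    discount = discount_for(order)
    if discount > 0:
        notify(f"You saved {discount:.2f}!")
    return order.total - discount
```

The core function needs no test doubles at all; the shell only needs a stub for `notify`.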
qsort|2 years ago
My main gripe with it is the second-order concern that it encourages testing practices that are frankly not very intelligent.
How you should approach testing depends on what kind of function you are testing. Pure functions of their inputs should be tested with property-based tests. If you have a pure function, the whole rigmarole of "write a test that fails, then write the least amount of code that makes it pass" leads you to enumerate example after example, when what you really want is to state the property the function must satisfy. This obviously works less well when you have to deal with the real world, but even in that case TDD leaves you with a patchy and inflexible approach.
drewcoo|2 years ago
https://en.wikipedia.org/wiki/Equivalence_partitioning
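A hand-rolled sketch of the two ideas raised above — equivalence partitioning (one representative input per class) and a property-style check that states an invariant instead of enumerating examples. The `parcel_cost` function and its tiers are hypothetical:

```python
import random

def parcel_cost(weight_kg: float) -> float:
    """Hypothetical tiered pricing; the equivalence classes are the tiers."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 3.0
    if weight_kg <= 10:
        return 6.0
    return 12.0

# Equivalence partitioning: one representative per class is enough.
assert parcel_cost(0.5) == 3.0    # class: light
assert parcel_cost(5.0) == 6.0    # class: medium
assert parcel_cost(25.0) == 12.0  # class: heavy

# Property-style check over random inputs: cost never decreases with weight.
rng = random.Random(0)
weights = sorted(rng.uniform(0.01, 50.0) for _ in range(100))
costs = [parcel_cost(w) for w in weights]
assert all(a <= b for a, b in zip(costs, costs[1:]))
```

A real property-based library (e.g. Hypothesis) would also shrink failing inputs; this is just the bare idea with the standard library.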
MoreQARespect|2 years ago
In those cases dependency injection just increases the SLOC, with the payoff that you are able to write a bunch of trivial unit tests that'll probably never catch a bug.
Integration tests as a default have the best ROI in those cases.
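A sketch of the kind of test being criticized here — the dependency is injected only so it can be mocked, and the test ends up restating the mock (all names hypothetical):

```python
from unittest.mock import Mock

class UserService:
    def __init__(self, repo):
        self.repo = repo  # injected purely so the test can swap it out

    def get_name(self, user_id: int) -> str:
        return self.repo.find(user_id)["name"]

# The "unit test": it asserts that the mock returns what we told it to
# return, so it cannot catch a real query or schema bug.
repo = Mock()
repo.find.return_value = {"name": "Ada"}
assert UserService(repo).get_name(1) == "Ada"
repo.find.assert_called_once_with(1)
```

An integration test running the same path against a real (or containerized) database would exercise the actual query, which is where the bugs tend to live.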
frankdejonge|2 years ago
Another angle is the encapsulation of storage. Using in-memory storage for tests makes CI pipelines very quick, and the production storage might evolve over time to accommodate scaling requirements (sharding and such).
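One way to sketch that encapsulation — the `Storage` protocol and both implementations below are illustrative, not from the comment:

```python
from typing import Optional, Protocol

class Storage(Protocol):
    def put(self, key: str, value: bytes) -> None: ...
    def get(self, key: str) -> Optional[bytes]: ...

class InMemoryStorage:
    """Test double: no network, no disk, keeps CI pipelines quick."""
    def __init__(self) -> None:
        self._data: dict = {}

    def put(self, key: str, value: bytes) -> None:
        self._data[key] = value

    def get(self, key: str) -> Optional[bytes]:
        return self._data.get(key)

# Production could swap in S3, a sharded database, etc.; as long as it
# satisfies the same protocol, callers and tests don't change.
def archive(storage: Storage, key: str, payload: bytes) -> None:
    storage.put(key, payload)
```

Because callers depend only on the protocol, the production backend can be re-sharded or replaced without touching the test suite.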
Tainnor|2 years ago
I like dependency injection for things which have state and/or do IO and/or are expensive to construct.
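For the stateful/IO case, injecting something as small as a clock is often enough to make the code deterministic under test — a sketch with a hypothetical `RateLimiter`:

```python
import time
from typing import Callable, Optional

class RateLimiter:
    """Allows one call per `interval` seconds. The clock is injected so
    tests don't have to sleep or depend on wall time."""
    def __init__(self, interval: float,
                 clock: Callable[[], float] = time.monotonic):
        self.interval = interval
        self.clock = clock
        self._last: Optional[float] = None

    def allow(self) -> bool:
        now = self.clock()
        if self._last is None or now - self._last >= self.interval:
            self._last = now
            return True
        return False

# Under test, the "clock" is just a number we control.
fake_time = [0.0]
limiter = RateLimiter(interval=10.0, clock=lambda: fake_time[0])
assert limiter.allow() is True
assert limiter.allow() is False   # no time has passed
fake_time[0] = 11.0
assert limiter.allow() is True    # interval elapsed
```

In production the default `time.monotonic` is used and nothing else changes.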
taberiand|2 years ago