top | item 45473126

It's not a hack to satisfy known requirements

52 points| michalc | 4 months ago |charemza.name

58 comments


jfengel|4 months ago

I find that strong typing often obviates the need for unit tests.

Software breaks when data transforms in a way that typing can't solve. When data goes across a wire, or into a database, it leaves your space. Anything you do to your code risks breaking it. Integration tests solve that, but at a very high cost.
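The "leaves your space" point can be made concrete: type annotations only hold inside the process, so anything arriving over the wire has to be re-validated by hand. A minimal Python sketch (the `User` shape and `parse_user` helper are invented for illustration):

```python
import json
from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str

def parse_user(raw: str) -> User:
    """Validate at the boundary: once data has crossed the wire,
    the type checker can no longer vouch for its shape."""
    data = json.loads(raw)
    if not isinstance(data.get("id"), int) or not isinstance(data.get("name"), str):
        raise ValueError(f"unexpected payload: {data!r}")
    return User(id=data["id"], name=data["name"])
```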

I don't have a great solution for that. It just comes down to experience: how do things change over time? You take guesses. You try to be flexible, but not so flexible that you aren't solving the problem at hand. (It doesn't do you any good to hand the user a C compiler and say "this is flexible enough to handle all of your future needs.")

Experience is, unfortunately, the worst teacher. It gives the lesson after it gives the test.

crazygringo|4 months ago

> I find that strong typing often obviates the need for unit tests.

Can you expand? Because my experience is they are totally orthogonal.

For me, unit testing is to ensure the function's algorithm is correct. You verify add(2, 3) == 5 and add(1, 2, 3) == 6 and add(2, Null) == Null.

But you don't generally write unit tests that test how a function behaves when you pass an unexpected type. Nobody in my experience is testing add("a", FooObject) for a function only meant to take ints or floats, to make sure it only takes ints or floats.

So they solve entirely different problems: strong typing ensures a caller provides compatible data, while unit tests ensure a callee produces correct results (not just correctly typed results) from that data. You want both, ideally.
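The division of labour above can be sketched in a few lines of Python (a hypothetical `add`, not from the thread's actual codebase):

```python
def add(a: float, b: float) -> float:
    # The annotations are the "caller provides compatible data" half;
    # they say nothing about whether the arithmetic inside is right.
    return a + b

# The unit test covers the half the type checker can't see:
# that the result is correct, not merely correctly typed.
assert add(2, 3) == 5
assert add(-1, 1) == 0
```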

the_af|4 months ago

Agreed about strong typing being a valuable tool, especially static typing.

We've come full circle with coworkers telling me that "the best thing" about LLMs is that they can tell you when you have a typo in your function invocation or you forgot a mandatory parameter. This always leaves me dumbfounded, mouth open. If only we had a system to prevent this, from before LLMs!

prerok|4 months ago

Strong typing does not obviate the need for unit tests. It just obviates the need for the simplest ones, about passing incorrect types. These are now being obviated anyway due to typechecking static analysis being added to the most commonly used untyped languages (Python, and TypeScript over JavaScript, for example).
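For example, annotations plus a checker such as mypy catch the wrong-type calls before runtime, which is exactly the class of trivial unit test they make redundant (a minimal sketch; the `add` function is invented):

```python
def add(a: int, b: int) -> int:
    return a + b

# A checker such as mypy rejects the call below statically
# ("incompatible type"); plain Python only fails at runtime.
try:
    add("a", object())
except TypeError as exc:
    print(f"runtime failure the checker would have caught: {exc}")
```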

fn-mote|4 months ago

> strong typing often obviates the need for unit tests

Do languages like Java have strong typing?

I thought so, but I can’t reconcile that with the belief that unit tests in Java would be unnecessary.

RHSeeger|4 months ago

> I find that strong typing often obviates the need for unit tests.

There are many ways that software can fail, and unit tests cover some of them. They don't remove the need for unit tests at all, but they do reduce the number of them needed (because you no longer need to test the things that strong typing handles).

eptcyka|4 months ago

I usually test behavior, not interfaces or implementation details.

inerte|4 months ago

I have an engineer on my team who's always asking "what if this or that happens in the future?" to which I've started to reply "what if it does NOT?"

I know, I know... wow. Not very insightful. But for some reason with this particular engineer this is the starting point to talk about actual requirements. This question in particular triggers the conversation of going back to product to figure out what they truly know they want right now, not in a maybe future. What are the actual hard, known requirements, instead of wishful thinking and "if everything goes well we will need this" type of mentality of ~hopeful~ optimistic PMs.

palata|4 months ago

I find that these discussions happen in teams that lack experience.

It's common for junior engineers to want to over-engineer stuff: they want to pull in this cool library, they want to try this nice pattern, and above all they want to do a good job, and a complex architecture sounds like they put more effort into it than a two-liner. That's why junior engineers are not the team lead.

As the lead, many times it's difficult to prove why it's over-engineering. You can often only say "hmm what you suggest is pretty complicated, requires a lot of effort, and in this case I don't think it's worth it".

What makes your take more valuable than the junior engineer's take? Experience.

Now don't get me wrong: it does not mean AT ALL that juniors don't bring anything valuable. They often bring great ideas. But their lack of experience means that sometimes it's harder for them to understand why it's "too much". The lead should listen to them, understand what they say (and by that I mean that they should prove to the junior that they understand), and then refuse the idea.

If a junior feels like the lead is incompetent (i.e. does not understand their ideas and selects inferior solutions instead), then the team is in trouble. And in a way, it is the lead's responsibility.

Otek|4 months ago

The question “what if it happens” is important but useless without “how likely is that to happen” and “if it will happen how much time we need to cover for it”

add-sub-mul-div|4 months ago

It's very insightful to realize that the space of things that will go wrong or will be important does not overlap very well with the contemporary zeitgeist of warnings coming from "best practices" and industry "thought leadership." It took me a while to get there. It's just so easy to point to some blog post that's become popular and use it to support your point when the person who wrote it knows nothing about your situation.

lwhi|4 months ago

The engineer is wise to ask this.

Architectural decisions sometimes close doors and make future changes very difficult.

The only thing we know for certain is that change will happen.

wredcoll|4 months ago

It's all pretty solid except for the part about OO.

Inheritance almost never works in "the real world" but I find being able to tie functions to the data they're expected to work on to be pretty helpful.

It's sort of like typing, really: functionX can only take FooBar variables vs making methodX on class FooBar.

Like everything else you can "do it wrong" and you shouldn't be a slave to any particular software ideology.

parpfish|4 months ago

i've been beating the drum for a long time that we teach OO programming wrong.

we always start with inheritance (Car is subtype of Vehicle; Cat is subtype of Animal).

we need to teach encapsulation as the primary use for OO.

ime, the most effective way of using "OO" in practice is that you define data classes for different entities and then affix a few fancy constructors that let you build entities out of other entities. inheritance rarely gets used.
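That style can be sketched with Python dataclasses (the `Order`/`Invoice` names are invented for illustration): plain data classes, plus a classmethod "fancy constructor" that builds one entity out of others, with no inheritance in sight.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Order:
    item: str
    quantity: int
    unit_price: float

@dataclass(frozen=True)
class Invoice:
    total: float

    @classmethod
    def from_orders(cls, orders: list[Order]) -> "Invoice":
        # the "fancy constructor": build one entity out of others
        return cls(total=sum(o.quantity * o.unit_price for o in orders))
```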

makeitdouble|4 months ago

But then the function x data mapping isn't 1 to 1 in most cases, which is often why inheritance is used.

IMHO separating data formats and functions works decently enough, and interface/protocol/duck typing is more elegant than OO classes.

As a real world image, a barcode scanner could be applied to anything that has a barcode, regardless of what that thing is. And I'd wager 99% of what we're trying to do fits that mold. When authenticating a user, the things that matter will be whether it's a legitimate call, and whether the user is valid. Forcing that logic into classes or filtering by user type quickly becomes noise IMHO.
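The barcode-scanner image maps directly onto structural typing; here's a sketch with Python's `typing.Protocol` (the class names are invented):

```python
from dataclasses import dataclass
from typing import Protocol

class HasBarcode(Protocol):
    barcode: str

def scan(item: HasBarcode) -> str:
    # applies to *anything* carrying a barcode, whatever its class
    return item.barcode

@dataclass
class Book:
    title: str
    barcode: str

@dataclass
class ParcelOfUnknownContents:
    barcode: str
```

Neither `Book` nor `ParcelOfUnknownContents` declares any relationship to `HasBarcode`; having the attribute is enough.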

1dom|4 months ago

I came to the comments to try to find a similar sentiment. I agree wholeheartedly with the author on everything apart from the bit about OO, where I feel the same as you.

What's the deal with this? I'm not an OO evangelist at all, but I often find myself using objects like you describe: as a mechanism to group related functions and data.

I feel there are people who see OO like a philosophy on how to architect stuff, and from that perspective, the idea of a "purely OO system" is perhaps a little unwieldy.

But from the perspective of OO as a low level tool to help group stuff in programming, as part of some other non-pure-OO system - it works really well and makes a lot of sense for me. I've often done this in environments around people who are outspoken anti-OO who either haven't noticed or haven't complained.

Am I a bad person, are you like me, are we idiots somehow?

palata|4 months ago

I feel like people who say "OO is never right" don't understand how to use it properly. Applies to other concepts as well, of course.

There are tools, and we as professionals are expected to use them when they make sense. That's all. If you use a tool badly, don't blame the tool.

RHSeeger|4 months ago

> Inheritance almost never works in "the real world"

Inheritance works just fine in the real world. It's just not the only tool in the box, and many times other tools work better. But, especially when limited to shallow hierarchies, it's very useful.

AndrewKemendo|4 months ago

> We're not here to write code, but to solve problems.

In my opinion having had multiple technical job roles (car stereo/alarm installer, website builder, military officer, CEO, CTO etc…) this is always the job.

The job is *always* to make the organization more effective and efficient full stop. Your role in that is what you choose and negotiate with your team throughout your life; boundaries change pre/during/post employment.

When you join a company you usually (not always) have a niche role, to fill in a gap that is preventing effective organizational execution.

If you’re mentally flexible enough to understand that your narrow focus is not the actual output, but a temporary means to an output, then you transform how you view the concept of work and relationships.

wouldbecouldbe|4 months ago

In most companies your job is to solve tickets. And the only creative freedom a developer has is how that ticket is solved. Quick fix, properly done, rework etc. Maybe that’s why certain smart developers over-engineer: because that’s the only place they can create ownership.

palata|4 months ago

I am amazed by the number of articles like this, that essentially say "you should not write bad code, you should write good code", while somehow implying "listen to me, I know better" (otherwise I wouldn't write the article...).

The truth is that writing good code takes experience. Those who live by the rule "thou shalt not over-engineer" risk writing bad code. Those who live by the rule "thou shalt know all the patterns and use them" risk writing bad code.

You should strive to write code that others can understand and maintain, period. If you need to justify your lack of "something" ("It's not a hack because..." or "I don't use OOP because..." or "I duplicated this code because..."), then it feels like it says something about your opinion of your own code, IMHO.

motorest|4 months ago

> If you need to justify your lack of "something" ("It's not a hack because..." or "I don't use OOP because..." or "I duplicated this code because..."), then it feels like it says something about your opinion of your own code, IMHO.

I feel this sort of opinion is simplistic. "Explaining" is a need that is sparked by both sides. Just because someone is having doubts or questioning your work, that doesn't mean they are automatically right and you are automatically bound to introduce changes. Sometimes you get questions from people who don't even have context on the problem domain and why you are taking path A instead of path B.

Also, sometimes your choices can be questioned by opinionated peers who feel compelled to bikeshed over vague and subjective style instead of objective technical issues. Is this something that should cause churn in your PRs? To give an example, I once had the displeasure of working with an opinionated junior developer who felt compelled to flag literal whitespace as a critical problem in a PR because, instead of onboarding a source code formatter, they had written a personal markdown file with their opinions on style and were trying to force it as a reference. Is this sort of demand for justifications something you think should be accommodated?

prerok|4 months ago

Could not agree more.

It's perfectly ok to not use a software pattern if it's not useful. It's ok to duplicate code if you know it will likely diverge in the future. Small and simple is the way.

OutOfHere|4 months ago

Yes. One either develops systems that work well and scale reasonably, or brittle ones without basic foresight, that keep failing and keep bad engineers employed. Especially if one is putting out open source software, one should take the time to engineer them well, not under-engineer them.

crazygringo|4 months ago

> One of the worst pieces of advice that I ever received was that every function should be unit tested.

Obviously not every function should be -- many are so obvious and straightforward that there's nothing to test -- but every function that does anything vaguely "algorithmic" should be. Unit testing is really important for catching logic errors.

> Instead, write higher level tests close to the client/user facing behaviour that actually give you protection against breaking things unintentionally

Yes, these are good. But they're a different kind of test. There are tests for correctness, and tests that the program runs. You need both.

In fact, sometimes you even need to split up functions smaller than they otherwise would be, just so you can test an inner logic portion independently.
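That splitting-out move might look like this (hypothetical Python; `median` is the extracted "algorithmic" core that gets its own unit tests, `report_median` the thin shell around it):

```python
def median(values: list[float]) -> float:
    """The inner logic portion, split out purely so it can be
    unit tested with no I/O around it."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

def report_median(path: str) -> None:
    # the hard-to-unit-test shell: file I/O plus printing
    with open(path) as f:
        values = [float(line) for line in f]
    print(f"median: {median(values)}")
```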

RHSeeger|4 months ago

> Yes, these are good. But they're a different kind of test

I've had this exact same discussion with people before. The same people that say "unit tests are worthless because the implementation could change, then the test gets thrown away". Honestly, it drives me bonkers because that entire argument makes no sense to me.

1dom|4 months ago

> Yes, these are good. But they're a different kind of test. There are tests for correctness, and tests that the program runs. You need both.

Why do you need both?

Some software is so small and simple that it's possible to write a high level running/integration test that covers all the practical correctness tests that application might need too.

You can say "yeah, but they'd be better if they had unit test" but that's the point being made: eventually you reach a place where more tests, even those recommended as best practice, don't actually deliver any more _real world_ value, and even make the code harder and slower to maintain.

parpfish|4 months ago

This piece of advice:

> Remember you can still add in that complication tomorrow

is directly undermined later with:

> When should you create stuff just in case?
> ...
> 1. There is a reasonable chance it will be useful later
> 2. It will be difficult to add in later
> 3. It won't meaningfully slow down the meeting of more likely requirements

whenever i've pushed to overengineer it's because i've developed a strong hunch that points 1 and 2 are true and i'm being defensive about my time and effort next week.

and if you're not allowed to push back because of 1 and 2 it's a sign of some sort of organizational problems where product folks sit at the top of the hierarchy and hand down dictates to builders without consulting with builders as equal partners.

daxfohl|4 months ago

It's easy to mix up "what seems easier to work with" vs "what's actually easier to work with". Pulling things out to config, splitting into microservices, additional layers of abstraction, rules engines, etc.: they seem like they'll be easier to work with at a high level because it gets logic out of the core, but then when you're actually working with them, or worse, when someone else is working with them and doesn't have your context, now there are five places to go look for logic and decide which piece needs changing, instead of just one obvious place.

philippta|4 months ago

> Remember that code that clearly solves just those problems is in no way a hack.

I think you can’t stress this point enough. In my experience, anything that is not implemented according to some norm, or not "clean", or done in an unusual way, is considered a hack. Even if it perfectly solves the problem with the least amount of cruft. That makes me sad.

Waterluvian|4 months ago

I wish more engineers I worked with had a stronger personal belief that design and planning is a favour they do for themselves. Defining clear requirements and resolving unknowns (or at least identifying them) is the foundation that if you don’t build, you’ll be building your project many times.

mouse_|4 months ago

I think I needed this right now, thank you.

michalc|4 months ago

You’re very welcome!

Have to admit I am curious: what’s the context / how has it helped you more specifically?

raincole|4 months ago

> Avoid Object-Oriented Programming

Yeah, no. Every time I saw code written by someone who attempted to avoid OOP it ended up with passing a huge 'context' parameter to most functions, effectively reinventing Python's OOP but worse.

Use pure functions as the starting point, but when you find yourself start passing complex structure around (any abstract word in parameter names, like 'context', 'data', 'fields' is a sign for that) just use OOP.
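The "reinvented OOP but worse" pattern is easy to sketch side by side (hypothetical Python; the names are invented):

```python
# The "huge context parameter" style: every function threads the
# same bundle of state through by hand.
def connect(ctx: dict) -> None:
    ctx["connection"] = f"connected to {ctx['host']}:{ctx['port']}"

def send(ctx: dict, message: str) -> str:
    return f"{ctx['connection']} <- {message}"

# The same thing as a class: the bundle of state becomes `self`,
# and callers can no longer forget to call connect() first.
class Client:
    def __init__(self, host: str, port: int):
        self.connection = f"connected to {host}:{port}"

    def send(self, message: str) -> str:
        return f"{self.connection} <- {message}"
```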

chamomeal|4 months ago

I think of context parameters being a replacement for dependency injection. What parts of OOP are replaced by context params? State?

mattv8|4 months ago

This is a great article, and I largely agree, but I feel like we're giving ourselves an excuse to be lazy. I've absolutely seen this principle swing in the opposite direction, where someone writes slop code without ever having had a real conversation with the end user(s), and consequently the software goes out the door without having considered the top 5 most common edge cases that would have been so obvious if a little more effort had been put in.

add-sub-mul-div|4 months ago

It's easier to fix underengineering than overengineering so I still err on the side of the former.

michalc|4 months ago

Have to admit the lazy thing threw me, but I can see how the “doing less” I’m arguing for could be taken that way. The “less” is not about avoiding handling edge cases that are possible now, but about avoiding putting in layers of code to handle cases possible only in some future versions of the code (with some limited exceptions that I mention at the bottom of the post)

In fact, it’s crossing my mind that people might not want to be accused of being lazy, and that is a motivation to over-engineer solutions.