
Are software engineering “best practices” just developer preferences?

317 points| floverfelt | 4 years ago |floverfelt.org

354 comments

[+] cwp|4 years ago|reply
Eh, kinda. Calling something a "best practice" is basically an appeal to authority. It means, "this is the right way to do things, for reasons I don't have time to explain." There are times when that's appropriate.

But really, "best" and "right" are highly situational. Any rule of thumb, even the most basic and uncontroversial, has a situation where it doesn't apply. I was part of a discussion on a mailing list years ago, where somebody got flamed harshly when he mentioned using a global variable. He then explained that he was working on an embedded control system for cars and all the variables in that system were global. The team was well aware of the pitfalls of global state and used a combination of documentation, static analysis, custom tooling and elaborate testing to mitigate them. It was a considered design choice that made it possible to develop high-performance, hard-realtime software that could run on very limited hardware.

[+] theptip|4 years ago|reply
One thought to add here - when you appeal to an authority, which one is it?

In the OP example, it seems the senior dev is saying “on my authority”. And sometimes that is enough, especially if the senior dev can give examples of when not following this practice bit them.

But sometimes there is a higher authority, such as “it’s what is recommended in Google’s SRE book”, which is probably good advice if you are building a large SRE team. (Though as you say, not in all situations.)

I think in the worst case, “best practice” can indeed be used to shut down discussions of a leader’s preferences. But you can smell that out by asking for concrete examples, and asking how widely the practice is recommended.

A good “best practice” should be justifiable and explainable.

All that said, sometimes as the senior engineer you need to go with gut feel; “this design smells like it will give us trouble within a year” is the sort of thing I sometimes say. But I think it’s important to be honest that it’s a hunch, not a certainty in these cases, and discuss/weight accordingly.

[+] SkipperCat|4 years ago|reply
Sometimes "best practice" is not because we need to worship a standard, but because we need to have a standard. If everyone does everything in a different manner, you wind up with a Tower of Babel and support becomes impossible.

I'm of the mindset that an organization needs to agree on specific principles that everyone adheres to. Not dogmatically, but pragmatically.

Having that shared "best practice" makes it easier to support other people's code/systems, allows people to get up to speed faster and, as a bonus, allows you to write a Medium post where you can call yourself a thought leader.

[+] Jorengarenar|4 years ago|reply
> somebody got flamed harshly when he mentioned using a global variable.

Harsh bashing on global variables is such a dumb thing.

Yes, they can be dangerous. Yes, many had problems due to using them. Yes, we should tell beginners to avoid globals.

But there is no reason to ban them altogether. Experienced programmers should utilize them whenever it makes sense (instead of passing down a value of a local one to almost every function [sic]).

[+] atoav|4 years ago|reply
It is best practise to use RCCBs, because it turned out faulty wiring can kill people. But in server rooms, where you might not want to switch off the whole rack without warning when one device is faulty, you can use a device that monitors the residual current (an RCM), which issues a warning first and only switches off when the residual current rises above the acceptable level. Different scenario, different best practise. (This is also the reason medical equipment is expensive.)

I think a professional should be aware of why a best practise exists and how to deal with a situation where, for some reason, it cannot be applied, as you showed with the embedded example.

Don't forget, however, that many devs are against best practises out of laziness or because they don't understand the reasons why they are best practises.

[+] asdff|4 years ago|reply
Identifying a best practice is actually not as hard as people realize imo (although it may take some time), and only becomes hard when you let emotion and sources of emotion come into what should be a rational decision making process (such as preference for a certain tooling for reasons like familiarity or popularity in the field today rather than outright advantages vs other tooling).

To identify the best practice for anything, you start by doing a review of all the available practices in the field for the problem you are working on. Once you've reviewed the literature, you can work out the pros, cons, and caveats of each of these tools, and how those considerations affect your particular use case and your expected results. After doing that, the best practice out of the available options will be readily apparent, or at the very least strongly justified - not by an appeal to authority or popularity or familiarity, but by looking at what the underlying technology actually does and how it's relevant (or not) to your particular task at hand. In the end you will find the very best hammer available, out of all the hammers people have made in this field, for your particular unique nail.

[+] dragonwriter|4 years ago|reply
> Calling something a "best practice" is basically an appeal to authority.

If presented on its own, but then, any conclusion presented on its own without supporting context and analysis is the same.

> But really, "best" and "right" are highly situational

A description of a best practice that doesn't provide a sufficiently precise description of the situation to which it applies as a best practice is generally inappropriate, unless it is the conclusion of an analysis of applicable best practices to a certain situation, in which case the scope is specified in framing the analysis.

It is true that lots of things described as best practices for particular situations with supporting rationale end up getting detached from their logic and context and becoming cargo-cult best practices.

[+] swixmix|4 years ago|reply
Reminds me of US Generally Accepted Accounting Principles (GAAP). I don't need to be perfect, just consistently good enough.
[+] poulsbohemian|4 years ago|reply
The biggest issue I saw with "best practices" in my career is the failure to take into account who is claiming it to be a best practice, and in what context. I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand. Or, alternatively, believe some vendor carte blanche when they tell you their product somehow follows a best practice.

The overarching problem is that yes, there is software engineering going on in the world, but most organizations are not willing to do engineering. I don't blame the technical staff - they often have good intentions - but rather the typical business is not willing to pay the cost in time or money to do long-lasting engineering practices. This is one of the things not enough of us think about in our career choices - am I going to a place that practices fire drills or engineering?

[+] Lich|4 years ago|reply
Spot on about reading a random blog or article. I think many engineers at work are under pressure to deliver and are looking for quick solutions. They do a search and see a Medium post related to their problem written by someone who says they are an “<platform> developer at <company>”, and for some reason most readers see these authors as an authority in their domain (because why else would they be writing about it? /s) and just accept the blog post’s practices or conclusions. I once read an article, a long time ago, about Android’s AsyncTask, and some blog post claimed it should only be used for operations lasting less than one second. I looked at the official documentation, and while it did mention that it should be used for short operations, nowhere did it mention one second specifically (unless I missed it). I saw the same advice repeated by many other devs who referenced that same article.
[+] oreally|4 years ago|reply
To add on to this, the currently most preached-about "best practices" come largely from a de-risking, never-ever-fail point of view, i.e. 'safety'. Unfortunately with such standards also comes a concept known as 'accountability', i.e. 'ass-covering' practices that provide little practical value.

This results in programmers no longer being able to iterate fast and having to rely on some third-party whose tradeoffs they don't understand, resulting in slow, bloated software and dissatisfied programmers.

[+] cptaj|4 years ago|reply
Managers excited about a new tech idea are probably the most destructive thing in the industry.
[+] yunohn|4 years ago|reply
> I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand.

I’ve seen the other way around too - senior developers rejecting or pushing changes that they find convenient. IME it’s not to do with experience, rather stubbornness.

[+] dmalik|4 years ago|reply
> How can Software Engineers call themselves engineers when there’s no rules, governing bodies, or anything to stipulate what true Software Engineering is?

We call ourselves software developers in Canada.

According to Canadian engineering[1]: The "practice of engineering" means any act of planning, designing, composing, evaluating, advising, reporting, directing or supervising, or managing any of the foregoing, that requires the application of engineering principles, and that concerns the safeguarding of life, health, property, economic interests, the public welfare, or the environment.

To be considered a work of engineering, then, a piece of software (or a software-intensive system) must meet two conditions:

1. The development of the software has required “the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.”

2. There is a reasonable expectation that failure or inappropriate functioning of the system would result in harm to life, health, property, economic interests, the public welfare, or the environment.

[1] - https://engineerscanada.ca/news-and-events/news/when-softwar...

[+] hyperman1|4 years ago|reply
There is value in predictability, even if it stems from someone else's preferences.

Current home wiring requires you put wires in the wall at (I think) between 10 and 20 cm from the border, and a small number of cm inside the wall. This means you only have to check that zone, and can use detectors, to find the wires.

My home is from the 1950's. Some wires go diagonally from top left to bottom right at the other side of the wall. We had great fun finding out where they were hiding.

Even if the new wires waste a lot more wiring and PVC tube, I vastly prefer them when redecorating or drilling holes.

[+] floverfelt|4 years ago|reply
100% that's true! I'm not saying developers having preferences and everybody abiding by them is a bad thing, more just that a lot of what constitutes "best practices" are often preferences and we should call them that.
[+] tannhaeuser|4 years ago|reply
From the 1950s onwards they used ribbon cables close to surfaces for household wiring. It was best practice, just not today's.

As a freelancer jumping projects, I can't help but see parallels, in that you've really got to study code bases wrt the time of authorship to understand their particular idiosyncrasies.

In the 2010's, people believed in "REST" without considering the context in which these concepts were introduced (eg thin browser UIs, or generalizations thereof as a baseline). Customers, even highly capable devs, flaunt their REST best practices, yet see HATEOAS as optional and pretentious, failing to see the entire point of loose coupling and discovery, and engaging in zealotry about squeezing parameters into URLs instead. Or pretend to, to stop pointless discussions with mediocre, bureaucratic peers.

[+] AdrianB1|4 years ago|reply
In many countries this is coded (mandated by standards), it is not a best practice.
[+] charles_f|4 years ago|reply
I hate the expression "best practice", it's so, so often used by someone to justify applying cargo cult without actually understanding why.

"Hey why are you having a try-catch there, it seems like it's just gonna break our stack trace and we don't even gracefully recover from it" - it's best practice

"Hey why are you using model/view/controller folders?" - it's best practice

"Why are you building microservices?" - best practice.
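The stack-trace complaint in the first example is concrete enough to sketch. A minimal hypothetical (class and method names invented here): rethrowing without attaching the caught exception as the cause throws away the frames you actually need when debugging.

```java
public class StackTraceDemo {
    // Hypothetical example of the kind of try-catch being questioned above.
    static int parsePort(String raw) {
        try {
            return Integer.parseInt(raw);
        } catch (NumberFormatException e) {
            // The stack trace now points here, not at the failing parse,
            // because `e` is not attached as the cause.
            throw new RuntimeException("bad port: " + raw);
            // Better: throw new RuntimeException("bad port: " + raw, e);
        }
    }
}
```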

I got out of a code base with stylecop's settings set to max and treated as errors. Trailing white spaces, mandatory comments on everything, etc. The only exceptions are file size and method size. Files are often in the thousands of lines, and methods can reach that as well. 10 levels of nested if/else. So we're in an unmanageable code base, but at least we don't have trailing white spaces.

Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do it". If it's a tradition and most people agree to it, fine; it might help with readability. If it's a pattern, fine; tell me why it's applicable and helpful in our context. If you can't actually explain why you're doing something, maybe rethink it and educate yourself on the topic.

[+] Sohcahtoa82|4 years ago|reply
> Java is infamous for its verbosity. [...]

This paragraph highlights something I've been saying for ages.

Most criticism of Java needs to be directed towards Java programmers and not the language itself.

The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.

Getters and Setters? You probably don't need them. Classes can have public member variables.

Java programmers seem to have the hardest time understanding YAGNI.
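As a sketch of that contrast (all names invented), the ceremony version next to the plain one:

```java
// Invented example: the same value modeled two ways.

// Ceremony: an interface plus an implementing class plus accessors,
// none of which this code actually needs yet.
interface Named { String getName(); }

class CeremonyUser implements Named {
    private String name;
    CeremonyUser(String name) { this.name = name; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

// YAGNI: a plain class with a public field does the same job here.
class PlainUser {
    public final String name;
    PlainUser(String name) { this.name = name; }
}
```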

[+] shagie|4 years ago|reply
> Getters and Setters? You probably don't need them. Classes can have public member variables.

The getter / setter debate and public fields is about exposing the internal implementation of the object (and thus making it unchangeable without breaking other code) and being able to reason about where the internals are used (if you do need to change them).

For example, if I've got a java.util.Date exposed as a public field and someone uses it, I can't change that later to a java.time.LocalDateTime unless I change all of the things using it. If this is a library that others are using, it may mean a cascade of unknown changes.

If, on the other hand, the Date was exposed as public Date getDate() { return (Date) date.clone(); }, then I don't have to worry about it. When the internals are changed to LocalDateTime, it becomes Date getDate() { return Date.from(local.atZone(ZoneId.systemDefault()).toInstant()); } and everything still works.

I don't have an issue with the infrequently used "default package private" level of field visibility where only classes in the same package can see that field as that limits the range of the changes to the classes within single package (in a single project).
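A runnable sketch of the migration described above (the Event class name is invented): callers of getDate() keep working even though the internal field changed type.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

class Event {
    // Was: private Date when; the internal representation changed,
    // but the public contract below did not.
    private final LocalDateTime when = LocalDateTime.now();

    public Date getDate() {
        // Adapt the new internal type back to the old exposed type.
        return Date.from(when.atZone(ZoneId.systemDefault()).toInstant());
    }
}
```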

[+] selfhoster11|4 years ago|reply
Thank you. I have programmed Java for years, but it's been frustrating. The clean language I learnt at university, that I love to bits, seems to be used approximately nowhere.

Instead, it's always some monstrosity held together by inheritance, XML/(awful) Gradle and liberal sprinklings of magical annotations that always (always!) bite you in the backside at runtime rather than at compile time, because why would you prefer to have good tooling that takes advantage of static typing to tell you what might go wrong.

I am appalled by what Spring mentality did to my language.

[+] bitwize|4 years ago|reply
> The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.

That's the 'D' in SOLID -- dependency inversion. An object's dependencies should be defined in terms of abstract interfaces, not concrete classes. The verbosity is just boilerplate, and it would appear in any statically typed language if you're following SOLID principles.
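A minimal sketch of that pattern (invented names): the consumer depends only on an interface, which is both what makes it swappable in tests and where the extra boilerplate comes from.

```java
// Dependency inversion in miniature: the consumer depends on an
// abstraction, not on any concrete class.
interface MessageSink { void send(String msg); }

// One concrete implementation among possibly many.
class ConsoleSink implements MessageSink {
    public void send(String msg) { System.out.println(msg); }
}

// Greeter never names ConsoleSink; any MessageSink can be injected,
// which is what makes it testable - and what adds the boilerplate.
class Greeter {
    private final MessageSink sink;
    Greeter(MessageSink sink) { this.sink = sink; }
    void greet(String who) { sink.send("hello, " + who); }
}
```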

> Getters and Setters? You probably don't need them. Classes can have public member variables.

Getters and setters are part of the JavaBeans spec for being able to load and configure arbitrary beans. If you're writing JavaBeans, as many EJB or Spring application developers are, you'll use getters and setters.

[+] Too|4 years ago|reply
Java was for a very long time missing anonymous functions, so for anything involving callbacks you had to make a gazillion Listener or Factory classes instead, which further forced you to create an interface for each of them, leading to all these infamously verbose design patterns.

It was also for a very long time missing some fundamental Stream, List and String processing features. Just getting the contents of a file, converting a string to bytes, or even just initializing an array often required 2-3 lines of verbose OutputStream(StreamBuilder().fromArray(Arrays.asList(1,2))).readBytes().add(3), wrapped in a try-catch for all the checked exceptions. For things that in C# would just be a one-liner of new List<int> {1, 2, 3} or in Python [1, 2, 3].

This is mostly solved in current versions of the language, but the reputation and the "best practice" design-patterns remains.
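A rough before/after of what's described here, as it looks in modern Java (the listener name is invented, and register() simply invokes the callback to stand in for a real event source):

```java
import java.util.List;

public class OldVsNew {
    // Pre-Java-8 style: a single callback forces an interface plus an
    // anonymous class at every call site.
    interface ClickListener { void onClick(); }

    // Stand-in for a real event source: invokes the listener immediately.
    static void register(ClickListener l) { l.onClick(); }

    public static void main(String[] args) {
        // Old: anonymous-class boilerplate.
        register(new ClickListener() {
            @Override public void onClick() { System.out.println("clicked"); }
        });

        // New (Java 8+): a lambda does the same job.
        register(() -> System.out.println("clicked"));

        // Collection initialization is also a one-liner now (Java 9+).
        List<Integer> xs = List.of(1, 2, 3);
        System.out.println(xs); // prints [1, 2, 3]
    }
}
```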

[+] avgcorrection|4 years ago|reply
> Getters and Setters? You probably don't need them. Classes can have public member variables.

We do need them if the objects interact with a lot of the libraries that we use. Libraries that we didn’t make. They expect get/set.

I agree that getters/setters are very overrated. I tend to make public-final fields when I can. Most of the time I can’t though.

[+] rajacombinator|4 years ago|reply
Very true but Java devs tend to have limited perspective and flexibility. If you get rid of the cruft it’s a fine language.
[+] taeric|4 years ago|reply
Amusingly, J2EE did require you to make an interface and an implementation. I seem to recall you also had to have a "stub" class.

Which is to say, early ecosystems in Java certainly needed this criticism. They did adjust, though.

[+] floverfelt|4 years ago|reply
Yeah, I'm actually a pretty big fan of Java programming. It gets a lot of hate on HN for things that have been mostly solved or can be solved if you implement it a certain way.
[+] fitzn|4 years ago|reply
Steven Sinofsky gave a talk and said something to the effect of, we've been building roads, bridges and edifices for thousands of years. So, best practices and solved problems abound---and even then we still get it wrong sometimes. Whereas, software engineering is maybe 70 years old (generously)? So, there is much to learn and a lot of "baseline" knowledge that has yet to be established. I think it's a good way to think about things.
[+] habitue|4 years ago|reply
Eh, I mean we've been building computer chips for approximately the same amount of time as computer software, and it's pretty clear chip engineering is more like civil engineering than software engineering. I would guess many of the best practices in bridge building in the modern day were developed in the last 70 years.

I think it's that engineers of physical things have many more hard constraints they have to wrestle with, and software engineers largely don't. Your code doesn't need to obey the rules of gravity and chemistry and materials science, it just needs to somehow accomplish the task.

And you see those best practices in the places of software engineering where there are hard constraints: cryptography. high performance code. realtime systems.

It's not just a senior engineer's opinion whether you should use ruby or C if you're writing the firmware for your race car. If you use md5 to hash user passwords on a major site, you'll be hung from the rafters.

[+] sidlls|4 years ago|reply
Civil (and other engineering) got better because there was motivation to improve that came from multiple directions: literal lives at stake, the pride of good craftsmanship, iterative or even grand steps forward in knowledge, etc.

Software engineering as a discipline is dominated by appeals to authority ("Clean Code", "Google does it this way", "Dijkstra said so", etc.) without any (or at least not much) attempt to ask why or whether. I think we'll automate away much of software engineering (likely with very poor, inefficient, and buggy implementations) before it matures enough as an industry to be actual engineering. Engineering (and the science behind it, for that matter) advances from curiosity and a healthy skepticism, not the rampant ego-driven self-promotion that runs through SE.

[+] lou1306|4 years ago|reply
Furthermore, there are far fewer physical constraints in software, so the range of possible designs is dramatically wider.

(Actually, I dare say that software itself has no physical constraints at all: software artifacts and software executions do.)

[+] afarrell|4 years ago|reply
Also, there isn't that much in the way of scientific grounding.

Mechanical engineering has physics as a foundation.

Chemical engineering has chemistry as a foundation.

What is the scientific foundation of software engineering?

I suspect it is a mix of cognitive science, linguistics, and anthropology.

[+] MarkLowenstein|4 years ago|reply
Programming changes practice quickly and often because it's cheap to do so compared to physical engineering which is slowed down by execution time, high materials cost, and sunk costs.

The interesting question is this: would other engineering pursuits (say civil) have just as much chaos and lack of authoritative practices, if changing practices would be equally fast and cheap for them?

[+] gambler|4 years ago|reply
Best practices in civil engineering are connected to outcomes. You know that something is a good practice if not doing it causes things to collapse or catch on fire.

"Best practices" in software engineering are usually about internal development processes and don't have any verifiable connection to outcomes.

In other words, we have two entirely different things labeled with the same name. People who commonly use the phrase "best practices" for software are literally trying to confuse you and generally are not worth listening to.

That said, some things in software are worth analyzing to have a better process, but those things need to be examined within a specific context. If someone claims that you need to, say, create an interface for every class, they should be able to explain why and how it is relevant to your work. If people make claims they cannot explain by connecting them to meaningful outcomes, those people are, again, not worth listening to. They might be mindlessly parroting something they have heard without having any clue as to the meaning of the practice. Unfortunately, our field is full of "professionals" that do exactly that.

Software is a pretty messed up domain that is frequently a self-licking ice-cream cone. (You write code to solve problems created by other code.) Because of that, it's often socially mediated, just like non-engineering fields. To establish anything for real in this self-referential environment you need to be able to have conversations about costs, tradeoffs and outcomes - within a specific context.

[+] elboru|4 years ago|reply
Right, but have you ever worked in a place where no practices at all are followed? Where there are hacks upon hacks, giant classes with giant methods, no interfaces, static methods, bad names everywhere. You need to find a bug? Good luck. You want to write a test to avoid a regression? OK, that will take you five times as long.

I agree, following a “best practice” without understanding the “bad practice” that it’s trying to prevent is silly. But just discarding best practices because no one understands the reasons behind is also silly.

[+] Zababa|4 years ago|reply
I feel like at this point, comparisons between software engineers and civil engineers should fall under the "does not gratify intellectual curiosity" category. I haven't seen anything interesting said about this topic since the article series "are we really engineers?" https://www.hillelwayne.com/post/are-we-really-engineers/.

> At the end of the day, you wind up in a lot of fairly pointless arguments about tech stack and coding conventions that 99.9% of the time don’t make a bit of difference to the final product.

That's a great conclusion.

[+] KronisLV|4 years ago|reply
> You can’t do it “correctly” if “correct” is “whatever the guy who’s been here longer wants.”

To be honest, "correctly" is whatever actually works. But since we as an industry haven't been around for a long time and don't collectively know what actually works, it is simply the case that we defer to those who have comparatively more experience, so that we may get decent solutions now, as opposed to excellent ones later.

It took the ancient civilizations hundreds if not thousands of years to perfect architecture to a reasonable degree, where it's possible to be fairly certain about the viability of most typical designs, as well as the tradeoffs that nonstandard ones might require.

Personally, I think that we'll only be able to talk about what truly works and what doesn't once the industry cools down - when there are new JS frameworks once a month or once a year as opposed to every day (or the same for any technology, really). When there are very few breaking framework changes, because the frameworks would finally be stable. When the primitives around doing web development, interfacing with devices and everything else have been distilled to their most usable forms. When a developer doesn't need 10 languages or 20 frameworks to do their job, but can instead rely upon a few, after most others have died off in a sort of singularity of languages.

That's when useful certifications would be possible. That's when estimates wouldn't be guesswork. That's when the education systems could actually prepare people fully for working in the industry. That's when the development of projects wouldn't be such Wild West, but instead would be a process with far higher quality, a true engineering discipline even at the expense of projects taking longer to actually develop, much like you don't construct buildings in an ad hoc fashion.

Now, whether that will happen in 100 years, 1,000 years, or never is another question entirely.

[+] strulovich|4 years ago|reply
All the problems stated require more context:

- extract an interface or not -> an interface can speed up compilation (under the right conditions)

- pass more or less arguments -> how many call sites exist? What’s the performance difference? Would it complicate testing?

- testing -> depends on what you’re building, when do you need it by, and how problematic a bug may be (and more)

The reason it looks like some random preference is because over time we naturally gravitate (or been told to do) what works, and we keep choosing it heuristically since a thorough analysis of the problem is too costly or impossible.

Civil engineering has a bunch of hard rules, like "have X slack here", because people will die from such mistakes. But I suspect best practices are used for the less high-stakes stuff (do we use this outlet configuration or that one?). (Actual civil engineers are welcome to correct me.)

[+] 3pt14159|4 years ago|reply
Did structural engineering for a bit before coming back to software.

More or less right about hard rules. They're not truly hard, and it's different in different countries. Some countries have performance-based rules, others have descriptive rules (must have 35mm of concrete cover over reinforcement bars), but even in those situations you can usually get some sort of person to overrule the rule. It's just costly in both time and money, but sometimes it's worth it.

Take ultra high performance concrete for example. It's almost a different material compared to the normal stuff. But for run of the mill construction everyone kinda likes sticking to the rules because it's like outsourcing handling all the edge-cases. Kinda like using Postgres over writing your own persistence layer.

As for hard rules in software, that would be tough. It's much more multidimensional and maneuverable than civil engineering is and we have less history dealing with it. Also, there are way more security concerns than civil engineering, which is mostly about safety concerns, where there are no arms races against you.

We could still do it abstractly though. Favouring maxims and penalties over nitty gritty rules.

[+] web007|4 years ago|reply
Short answer: No, they're not.

Longer answer: I'm confused by the examples given. Nothing there is a best practice IMO, they are only opinions.

This writer is confusing "use git" or more basically "use version control" with the how's and why's underneath. Everything beyond "use it" is an opinion, with multiple "best" options depending on your desired workflow or requirements.

Same for Java - the senior explained why this version of "best" is best for them, but the real best practice of "define your interface as a contract" is hidden behind these opinions.

Contrast this with the civil engineer used as an example, who might say "use a safety factor of 3x rated weight for all fasteners" as their best practice. Using fasteners rated at 2x is a violation of that, but using screws vs bolts vs adhesive are all opinions and depend on the particulars and preferences of whoever is doing the work.

[+] StatsAreFun|4 years ago|reply
The two aren't mutually exclusive. What I mean is, you can develop a preference for doing a software engineering task according to a best practice. Getting into the habit of using one or more time-tested practices will only benefit you over time.

But, more fundamentally, what the author could have responded to their friend is: "Yes, we do have a code of ethics and professional practice standards[0]. We do have well-researched best practices and a rich standardized body of knowledge we can use[2]. We do have formally adopted standards through IEEE, ISO and IEC[1]."

I do get the sentiment behind the question, though, and personally wish we were more formally trained and held more rigorously to a professional set of standards. It does diminish the term "Software Engineer" as an engineering field when we use the title so loosely but (on some level) expect the same level of respect, pay and status as other engineers.

[0] https://ethics.acm.org/code-of-ethics/software-engineering-c... [1] https://en.wikipedia.org/wiki/ISO/IEC_JTC_1/SC_7 [2] https://www.computer.org/education/bodies-of-knowledge/softw...

[+] catern|4 years ago|reply
After reading this article, the theory that comes to mind is that "best practices" are not just developer preferences - rather, "best practices" are cargo culting by mediocre programmers trying to copy techniques that a really skilled programmer once employed to do amazing things.

This is the part of the article that made me think this:

>As I type this, I’m in a discussion about whether it’s better to pass a few unnecessary parameters to simplify a bash script’s internal logic or pass fewer parameters and make the bash script more complex.

This sounds like they're on the verge of parameterized, object-capability design - passing in the paths to operate on, rather than hardcoding some internal logic to determine what to do. If you do this in the right places and at the right time, you can do amazing stuff, reuse scripts in totally novel environments and for novel purposes. At other times, it's not necessary. But I'm guessing neither side really knows how to use that to do amazing stuff.

Some programmers don't have the skill to recognize when a specific technique is justified or unjustified. So, unmoored from technical reality, they just argue about cultural norms - blindly copying what someone else did once, because it brought someone else success - cargo culting.

That all sounds very mean, but if this is what's actually going on, I'm not sure what the solution is. Maybe more focus on really impressive and amazing concrete projects, rather than just individual practices in isolation?

[+] PedroBatista|4 years ago|reply
Yes and no.

Given the context of the authors (and this also includes employment context, socioeconomic and personality factors for everyone involved), a best practice is something that has been "proved" to work well.

So no, it's not just a preference, but in practical terms it's often a preference that some people agree worked (in their context).

And let's not even talk about the "best practices" and technologies created just to sell you something. I think by now it's no secret that most "developer evangelists" and even "famous" developers are just people doing sales and marketing; some of them don't even realize it, such is their ego trip.

[+] agentultra|4 years ago|reply
In some cases, perhaps, but the _state of the art_ is always evolving and a lot of it is informed by how people practice software engineering in the wild.

The Professional Engineer is a modern concept; in the United States, licensure began in 1907. Sure, we have been building irrigation systems for thousands of years, but engineering wasn't regulated by governments until recent history.

Until then it was a practice guarded by guilds and seen as a craft.

Software isn't without its own history. The mathematicians and logicians had been working on it long before the first computing machine was built. We've known how to compute values for thousands of years.

Are we still in the craft/guild phase of the discipline? I don't think so. There are professional engineering organizations around the world that are certifying software engineers. If that is all that is required you can apply today.

The missing piece is that governments around the world don't require any professional affiliation or certification to practice programming professionally. Some argue this is a good thing, as it keeps the playing field level. Others argue it's bad, as it enables profit-motivated companies to cut corners in ways that harm businesses and users.

But as far as best practices go, as long as there are enough people practising them across the industry, then it's not simply a preference. A good guide on this is the SWEBOK [0] published by the IEEE which attempts to catalogue these practices in a living document that gets updated as the state of the art develops.

[0] https://en.wikipedia.org/wiki/Software_Engineering_Body_of_K...

[+] andy_ppp|4 years ago|reply
Make the simplest thing you possibly can, write lots of software tests and keep in mind that your code is a piece of communication to the next developer, like a story. Everything else is just vanity or preferences.
[+] glanzwulf|4 years ago|reply
I get your friend, and there's another point he can make: Imagine a world with self taught Civil Engineers.
[+] strken|4 years ago|reply
Aren't best practices things like "don't roll your own crypto if you can avoid it", "store your code in version control", or "write down migrations as well as up migrations"? These are all choices where the right option has been absorbed into the craft.

Civil engineering best practices are written in blood, but I think software engineering best practices can be stretched to include sweat and tears too.
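The "down migrations" point can be sketched as paired files (filenames and schema here are hypothetical): every forward change ships alongside the statement that reverses it, so a bad deploy can be rolled back mechanically instead of improvised under pressure.

```shell
#!/bin/sh
# Hypothetical migration pair: the "up" file applies a change, the
# "down" file reverses it. Migration tooling applies the "up" files
# in order, and the "down" files in reverse order to roll back.
dir=$(mktemp -d)

cat > "$dir/0042_add_email_index.up.sql" <<'SQL'
CREATE INDEX idx_users_email ON users (email);
SQL

cat > "$dir/0042_add_email_index.down.sql" <<'SQL'
DROP INDEX idx_users_email;
SQL

ls "$dir"
```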