WAT is classic and very funny, but I recommend that all software engineers watch all of Gary Bernhardt's talks on his website: https://www.destroyallsoftware.com/talks
His talk "Boundaries" taught me one of the most illuminating concepts I've learned in my career, about separation of concerns, functional core & imperative shell, unit testing, etc., and radically changed the way I write code. The talk uses Ruby as its example, but it applies to any language under the sun.
In fact, watch all five of them; they all reach the same outstanding level of being educational, interesting, and funny at the same time.
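For anyone who hasn't seen "Boundaries" yet, the functional core / imperative shell idea can be sketched in a few lines. This is a hypothetical word-count example of my own (in JavaScript rather than the talk's Ruby), not code from the talk:

```javascript
// Functional core: a pure function over plain values.
// No I/O, no mutation of its input -- trivially unit-testable.
function wordCounts(text) {
  const counts = {};
  for (const word of text.toLowerCase().split(/\s+/).filter(Boolean)) {
    counts[word] = (counts[word] || 0) + 1;
  }
  return counts;
}

// Imperative shell: all side effects live here, and the shell
// stays thin -- it only shuttles values in and out of the core.
function report(readFile, log, path) {
  const text = readFile(path);     // side effect: input
  const counts = wordCounts(text); // pure decision-making
  log(JSON.stringify(counts));     // side effect: output
}
```

The payoff is that the core can be tested with plain values and no mocks, while the shell is so simple it barely needs tests at all.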
> I recommend that all software engineers watch all of Gary Bernhardt's talks
They are all great talks, but please don't use "The Birth & Death of JavaScript" as an instruction manual!
In addition to the talks on his website, I also recommend watching "The Unix Chainsaw"[1]. It's a great introduction to using the full power of the Unix shell as an interactive programming language.
[1] https://www.youtube.com/watch?v=ZQnyApKysg4
I'd argue for making this mandatory in your onboarding. It can help so much more than random training sessions or seniors mentioning it in code review. It's a simple shift in thinking that most people should just get.
The core concept of immutable values as the boundaries between components now seems self-evident, but I can recall not appreciating it earlier in my career. Further, I don't think one can fully appreciate its importance until they've had to develop, maintain, and operate code both before and after the principle is applied. Here's my story of being educated in the importance of "Boundaries".
I started my career at a relatively small company that grew rather quickly, and in developing more robust and reliable systems there I experienced the importance of this principle first hand. Commonly, a system that started as a small, low-value prototype could grow into a large and important engineering system. Initial functionality was often implemented in brittle, mutating code, which quickly became a pain point, particularly in testing.
I can recall several cases where a seemingly innocent Java static method that mutated its inputs or static fields became a recurring source of production errors. Adding test cases for these situations grew increasingly complicated and time-consuming, with more and more mocks, stubs, and inspection of mutated state. Further, changing or extending the code itself became a tribulation of mentally working through the complexity of the code and its existing tests.
And what do you know, our recent changes reintroduced an old bug despite all of the tests passing. It turned out our existing tests for that bug only covered one specific manifestation of the fault and missed the others. So of course, let's add some more test cases, each with its own menagerie of mocks, stubs, and inspection of mutated state.
Eventually, an experienced engineer would guide me through refactoring this functionality into isolated components. A single static method would be replaced by several Java classes: some to hold immutable state and others to perform state transformations. We might even introduce an interface so that different behavior could be provided through polymorphism. Tests became simpler and more robust, and fewer faults were discovered in production.
From the outside this may appear to be the classic Java Architecture Astronauts menace with a single static method replaced by a collection of classes and interfaces. We may have even had some XFactoryProvider interfaces. Yet the end result was easier to reason about and test, with the tangible benefit of fewer errors in production.
And I tell this story only so that I can say that I now appreciate this talk even more after living through the application of the “Boundaries” principle.
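For illustration, the before/after shape of a refactor like the one in this story might look something like the following. This is a sketch of my own with hypothetical names, written in JavaScript rather than Java for brevity; the real code was of course far more involved:

```javascript
// Before: a seemingly innocent helper that mutates its input,
// in the style the story describes (hypothetical example).
function applyDiscountMutating(order) {
  order.total = order.total * (1 - order.discount); // edits the caller's object
  return order;
}

// After: an immutable value crosses the boundary, and the
// transformation returns a new value instead of editing state.
function applyDiscount(order) {
  return Object.freeze({
    ...order,
    total: order.total * (1 - order.discount),
  });
}

// Testing the immutable version needs no mocks or stubs:
// pass in a plain value, assert on the value that comes back.
const before = Object.freeze({ total: 80, discount: 0.5 });
const after = applyDiscount(before);
// after.total is 40, and `before` is untouched
```

Note that `Object.freeze` is shallow, so nested objects would need the same treatment; in Java the equivalent would be final fields on a small value class.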
The WAT talk was rhetorically effective because it managed to joke about weird corners of the language without going into ranty programmer mode, opting instead for a gentler style of bemused boggling.
It also kind of cemented for me that weird edge cases don't actually matter in practice (in my experience), and that a language can have lots to make fun of while still being lovely to use.
>It also kind of cemented for me that weird edge cases don't actually matter in practice
I had the exact opposite impression. Indeed, this stuff is the cause of so much that gets attributed to something else (e.g. JavaScript framework churn being attributed to culture).
The problems caused by these issues compound as you work your way up the stack.
If you don't keep all the quirks in mind, you create bugs and waste man-hours. If you do keep them all in mind, you reduce your cognitive capacity and waste man-hours. Languages with quirks are categorically worse than languages without quirks.
> It also kind of cemented for me that weird edge cases don't actually matter in practice
Just look at the Linux kernel. The tremendous amount of complexity and code needed to support all the weird edge cases of hardware that does pretty much the same thing is mind-boggling.
If you can ignore the edge cases, then sure, they don't matter. But to a language implementer these things matter, and for the user they are always present: whenever they are used, by mistake or by choice, they will matter.
So yeah, weird edge cases matter a lot! In fact, trying to minimize them is very important for keeping the complexity of systems down.
I had the opposite reaction. I thought the "wat" talk was entertaining of course, but I've found that it has generated a significant amount of hate towards JavaScript among laypeople who don't really understand that every language has weird caveats and edge cases like this.
It's a lot like those order-of-operations quizzes that make the rounds on social media, with people who don't quite understand the order of operations getting the wrong answer.
The obvious answer, which every engineer understands: as soon as the order of operations gets complicated to work out, you make it explicit by adding parentheses everywhere. Same thing with JS: once there's a chance for some of this weird behavior to show up, you make sure the types make sense.
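As a concrete illustration of "making the types make sense", here is one of the coercions the talk pokes fun at next to its explicit version (a small sketch of my own, not an example from the talk):

```javascript
// Implicit: `+` with a string operand silently concatenates,
// which bites when form or URL values arrive as strings.
const fromForm = '2';
const implicit = fromForm + 1;          // '21', not 3

// Explicit: convert first -- the type-level equivalent of
// adding parentheses to make the order of operations obvious.
const explicit = Number(fromForm) + 1;  // 3
```

The same discipline (explicit `Number()`, `String()`, `Boolean()` conversions, or `===` instead of `==`) defuses most of the WAT examples before they can show up in real code.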
it shows that at the fringes nobody in the audience correctly predicts what will happen...
anyways, here's the abductee-programming-language-conundrum:
consider Language X:
there is a nuclear power plant, and you or your loved ones live within 10 miles of it. it is your responsibility to select the language for implementing the emergency shutdown procedures. would you use language X?
For something like that I would be more concerned about the tests than the language. In other words, I'd trust my own code not at all, in any language.
jwilk|4 years ago
Discussed back then: https://news.ycombinator.com/item?id=3515845 (98 comments)
ksec|4 years ago
Missing 2012 from the title.
I think that, with COVID deep in our subconscious, we are releasing all the emotional attachment we have to our old programming tools.
jwilk|4 years ago
As per HN Guidelines <https://news.ycombinator.com/newsguidelines.html>, please don't editorialize titles.