ninkendo | 19 days ago
> These languages are like the little Dutch boy sticking his fingers in the dike. Every time there’s a new kind of bug, we add a language feature to prevent that kind of bug. And so these languages accumulate more and more fingers in holes in dikes. The problem is, eventually you run out of fingers and toes.
I'm going to try my best to hide my rage at just how awful this whole article is, and focus my criticism. I can imagine that reasonable people can disagree as to whether `try` should be required to call a function that throws, or whether classes should be sealed by default.
But good god man, the null reference problem is so obvious, it's plainly a bug in the type system of every language that has it. There's basically no room for disagreement here: if a function accepts a String, and you can pass null to it, that's a hole in the type system. Because null isn't a String. It doesn't adhere to String's contract. If you try to call .length() on it (or whatever), your program crashes.
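To make the contrast concrete, here's a minimal sketch in Rust, a language whose references simply cannot be null. The function names (`shout`, `shout_maybe`) are made up for illustration; the point is that "maybe absent" has to be spelled out in the type, and the compiler forces the caller to deal with it:

```rust
// A &str parameter is always a valid string: there is no null to pass.
fn shout(s: &str) -> String {
    s.to_uppercase() // safe: s can never be null here
}

// "Optional string" is a different type, and the absent case must be handled.
fn shout_maybe(s: Option<&str>) -> String {
    match s {
        Some(v) => v.to_uppercase(),
        None => String::from("(nothing)"),
    }
}

fn main() {
    println!("{}", shout("hello"));    // HELLO
    println!("{}", shout_maybe(None)); // (nothing)
    // shout(None) does not compile: None is not a &str.
}
```

The crash at `.length()` in a null-unsafe language becomes a compile error here: the hole in the type system is closed at the declaration site, not papered over at every call site.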
The only excuse we've had in the past is that expressing "optional" values is hard to do in a language that doesn't have sum types and generics. And although we could've always special-cased the concept of "Optional value of type T" in languages via special syntax (like Kotlin or Swift do, although they do have sum types and generics), no language seems to have done this... the only languages that seem to support Optionals are languages that do have sum types and generics. So I get it, it's "hard to do" for a language. And some languages value simplicity so much that it's not worth it to them.
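To show why sum types and generics make this nearly free for a language, here's what an Optional amounts to once you have them (a sketch in Rust; `MyOption` and `first_char` are hypothetical names for illustration, but the standard library's `Option<T>` is defined essentially the same way):

```rust
// With sum types and generics, "optional T" is a two-variant enum.
enum MyOption<T> {
    MySome(T), // a value is present
    MyNone,    // no value
}

// A function that may not produce a result says so in its return type.
fn first_char(s: &str) -> MyOption<char> {
    match s.chars().next() {
        Some(c) => MyOption::MySome(c),
        None => MyOption::MyNone,
    }
}
```

Without sum types and generics, a language has to bake this in as special syntax (Kotlin's `String?`, Swift's `Optional`), which is more design work but clearly doable.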
But nowadays (and even in 2017) there's simply no excuse anymore. If you can pass `null` to a function that expects a valid reference, that language is broken. Fixing this is not something you lump in with "adding a language feature for every class of bug"; it's simply the correct behavior a language should have for references.