Very cool article! I think it does a good job of diving into the tradeoffs of a dynamic type system versus a static one.
jakobnissen|1 year ago
I guess the conclusion depends on what priorities you come to the table with. If your starting point is that 1) code must be optimally fast (so no unions are acceptable), and that 2) programmers cannot be trusted to keep track of the types they are using, and so must always explicitly opt in to union-like types (as e.g. Rust does), then yes, you will naturally conclude that implicit unions are a bad design.
However, if your starting point is the observation that scientists and engineers overwhelmingly prefer dynamic languages for data analysis, because of all the boilerplate caused by forced, explicit type handling, then you naturally conclude that anything _other_ than implicit union types is a complete non-starter for a language for technical computing.
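To make "implicit unions" concrete: in Julia the compiler infers a `Union` return type entirely on its own, without the programmer ever writing one. A minimal sketch (the function name is made up for illustration):

```julia
# No Union annotation anywhere in sight: the two branches return
# Int and Nothing, and inference produces the Union by itself.
halve(x::Int) = iseven(x) ? x ÷ 2 : nothing

# Ask the compiler what return type it inferred:
println(Base.return_types(halve, (Int,)))  # e.g. Any[Union{Nothing, Int64}]
```

This is exactly the behavior the article criticizes and this comment defends: the union appears without any opt-in from the programmer.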
One could write an equally legitimate and valid blog post about the 'tragedy' of Rust's enum types, highlighting how they inevitably lead to boilerplate and extensibility issues.
I would also argue against the author's claim that it's unrealistically hard to reach type safety (really: avoiding unions) in Julia. If you write a Julia function with good performance, it will continue to have good performance in the future - union types will not show up spontaneously.
Sure, performance can regress if you refactor - just like in any other programming language. So you need to benchmark - like in every other language.
This feels a bit disingenuous.
All the languages brought up as examples need some sort of handling of the `not found` case. In C++ and Go you need to check against null pointers (or skip the check and run into segfaults); in Haskell and Rust you are forced to unwrap the value. C also has to check against the error code, or incur errors down the line, or worse, subtler logic errors.
FacelessJim|1 year ago
Missing these kinds of checks is also a source of errors in dynamic languages: adding `1+None` in Python, or `1+nothing` in Julia, will raise an error if not handled properly.
If you are absolutely sure your element will be in the array, you have to encode that assumption. For example

    x = something(findfirst(==(val), [1, 2, 3]))  # the return value is an Int now, or an error

or even

    x = findfirst(==(val), [1, 2, 3])
    x === nothing && error("oh no")

are enough for the compiler to correctly infer that x will just be an Int, and remove the Union.
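As a self-contained sketch of that narrowing (function names here are illustrative, not from the article), you can ask the compiler directly what it inferred with and without the guard:

```julia
# Without a guard: the inferred return type carries the Union.
unchecked(v, val) = findfirst(==(val), v)

# With a guard: after the `nothing` check, inference narrows x to Int.
function checked(v, val)
    x = findfirst(==(val), v)
    x === nothing && error("oh no")
    return x
end

println(Base.return_types(unchecked, (Vector{Int}, Int)))  # includes Nothing in a Union
println(Base.return_types(checked, (Vector{Int}, Int)))    # just the integer type
```

The guard costs one line and the error branch never allocates on the happy path, which is why this pattern is considered cheap insurance in Julia code.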
Also, the claim that the only way to check for unfavourable Union types is to visually scan each function is just plainly false. There are many tools for this sort of check; to name a few: `@code_warntype`, Cthulhu, DispatchDoctor.
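For instance, `@code_warntype` ships with the InteractiveUtils standard library and prints the typed code, flagging inferred `Union`s; a minimal sketch (the function name is illustrative):

```julia
using InteractiveUtils  # standard library that provides @code_warntype

lookup(v, val) = findfirst(==(val), v)

# Prints the typed IR; the `Body::Union{Nothing, Int64}` line is what
# you would look for (highlighted in red in an interactive session).
@code_warntype lookup([1, 2, 3], 2)
```

Cthulhu and DispatchDoctor are third-party packages, offering (roughly) interactive descent into inference results and automated instability checks, respectively.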
I do agree, though, that Julia's users come mainly from academia, and the language is therefore less polished on the purer software-engineering aspects and tooling. But the disclaimer at the end feels like the author is dismissing the whole language on false premises, out of frustration with the lack of support/ecosystem for his specific use case.
cjalmeida|1 year ago
Suggesting OOP as a solution is probably the most laughable “solution” though. What does this have to do with error handling and union type inference?