linschn | 9 months ago
On the other hand, I can't stop myself from thinking about "Greenspun's tenth rule":
> Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
This doesn't apply directly here, as the features are intentional and it seems they are not bug-ridden at all. But I get a nagging feeling of wanting to shout 'just use lisp!' when reading this.
upghost|9 months ago
Having written one of these[1] a decade ago and inflicting it (with the best of intentions) upon production code in anger, I can tell you this often leads to completely unmaintainable code. It is impossible to predict the effect of changing a method, tracing a method, or debugging a method (where do I put the breakpoint??).
The code reads beautifully though. Pray you never have to change it.
The reason I say "just use haskell" instead of lisp is because Lisp generics suffer from this same problem.
Btw if anyone has a solution to this "generic/multidispatch maintainability in a dynamically typed language" problem I would love to hear it.
[1]: https://github.com/jjtolton/naga/blob/master/naga/tools.py
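The breakpoint problem can be made concrete with a toy type-based dispatch registry in Python. This is a hypothetical sketch, not naga's actual implementation: the point is that which body runs for a call is decided at runtime from the argument types, so "the function" has no single place to put a breakpoint.

```python
from functools import wraps

# Toy multimethod registry (hypothetical, for illustration only).
# Maps a function name to {argument-type tuple -> implementation}.
_registry = {}

def multimethod(*types):
    """Register an implementation of a function for specific argument types."""
    def register(fn):
        _registry.setdefault(fn.__name__, {})[types] = fn

        @wraps(fn)
        def dispatcher(*args):
            impls = _registry[fn.__name__]
            key = tuple(type(a) for a in args)
            try:
                return impls[key](*args)
            except KeyError:
                raise TypeError(f"no {fn.__name__} implementation for {key}")
        return dispatcher
    return register

@multimethod(int, int)
def combine(a, b):
    return a + b

@multimethod(str, str)
def combine(a, b):
    return a + " " + b

# Two separate bodies both answer to the name `combine`; which one
# fires for combine(x, y) is only knowable from the runtime types.
print(combine(1, 2))
print(combine("hello", "world"))
```

Tracing or stepping into `combine` lands in the dispatcher, not in either implementation, which is exactly the maintainability problem described above.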
pansa2|9 months ago
Julia definitely made the right choice to implement operators in terms of double-dispatch - it’s straightforward to know what happens when you write `a + b`. Whereas in Python, the addition is turned into a complex set of rules to determine whether to call `a.__add__(b)` or `b.__radd__(a)` - and it can still get it wrong in some fairly simple cases, e.g. when `type(a)` and `type(b)` are sibling classes.
I wonder whether Python would have been better off implementing double-dispatch natively (especially for operators) - could it get most of the elegance of Julia without the complexity of full multiple-dispatch?
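Python's rules can be sketched directly: for `a + b`, the interpreter calls `a.__add__(b)` first and falls back to `b.__radd__(a)` only if that returns `NotImplemented` (with an exception: if `type(b)` is a proper subclass of `type(a)` and overrides `__radd__`, the reflected method goes first). The sibling-class case below uses hypothetical `Meters`/`Feet` types for illustration:

```python
class Meters:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Meters):
            return Meters(self.value + other.value)
        # Returning NotImplemented tells Python to try the
        # reflected method on the other operand.
        return NotImplemented

class Feet:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Feet):
            return Feet(self.value + other.value)
        if isinstance(other, Meters):
            # Mixed case works only because Feet happens to know
            # about Meters -- the language does not mediate.
            return Feet(self.value + other.value * 3.28084)
        return NotImplemented

# Feet and Meters are siblings (neither subclasses the other), so
# Meters.__add__ is tried first, returns NotImplemented, and Feet
# defines no __radd__: the whole expression raises TypeError.
try:
    Meters(1) + Feet(3)
except TypeError:
    print("Meters + Feet: unsupported")

# The other order succeeds, so `+` is not even symmetric here.
total = Feet(3) + Meters(1)
print(total.value)
```

In Julia, by contrast, a single `+(::Feet, ::Meters)` method definition would cover the operation regardless of operand order conventions, because dispatch considers both argument types at once.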