You observe this with types. Dynamic types feel faster at the REPL, when you're coding at 1 Hz, because you're not factoring in the unseen costs: the future bugs, the refactors you won't do because static types aren't there to give you the confidence to attempt them, the legacy software that can't be replaced because nobody can understand it, the Python server doing four requests per second while you pay AWS five figures every month.
It's funny, in a few dynamically typed projects I've noticed people start treating some code as append-only. The fear of getting bitten means reinventing the wheel is a more palatable prospect than diving into the inexplicable horrors that are around the corner.
Of course, that 'a g i l i t y' of dynamic typing is impossible to give up. The devs must churn out code ASAP, maintenance be damned -- mostly because they probably won't be maintaining it.
Except this is BS, because a modern JIT outperforms compiled C and C++ code in some cases. Dynamic types mean the constraints are known later; they don't mean the constraints are never known. And once they are known, you can partially reduce.
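The "partially reduce" point can be sketched in plain Python (a toy partial evaluator with made-up names, not how any real JIT is implemented): once a constraint such as the exponent is finally known at runtime, you can generate straight-line code for it, which is roughly what a specializing JIT does with observed types.

```python
def power(base, exp):
    """Fully general version: nothing is known until call time."""
    result = 1
    for _ in range(exp):
        result *= base
    return result

def specialize_power(exp):
    """Partial reduction: exp is now known, so emit code with the
    loop unrolled into straight-line multiplications."""
    body = " * ".join(["base"] * exp) or "1"
    src = f"def power_{exp}(base):\n    return {body}\n"
    ns = {}
    exec(src, ns)  # "compile" the specialized function
    return ns[f"power_{exp}"]

cube = specialize_power(3)   # def power_3(base): return base * base * base
print(cube(2), power(2, 3))  # 8 8
```

The generic version pays for the loop and range machinery on every call; the specialized one is just multiplications, because the late-arriving constraint was folded into the code.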
I really hope just using static types alone doesn't give people confidence to refactor. It of course depends on what "refactor" really means: are we talking about a function, a class, a module/library, an entire application? Maybe in the function case static types give more confidence, but in the other cases the way to de-risk is to have good tests. Static types only help catch one class of errors, while refactoring can introduce several, depending on the specific case.
In other words, if static types alone are giving confidence to refactor, it is a false confidence except in the most trivial of cases.
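A toy illustration of that point (hypothetical function name): a refactor that swaps two same-typed arguments satisfies any static type checker, yet silently changes behaviour — only a test notices.

```python
def shipping_cost(weight_kg: float, distance_km: float) -> float:
    return 0.5 * weight_kg + 0.1 * distance_km

# After a careless "refactor", a call site swaps the arguments.
# Every type checker is satisfied -- both parameters are float --
# but the result is wrong. Only a test catches this error class:
correct = shipping_cost(2.0, 100.0)   # 11.0
swapped = shipping_cost(100.0, 2.0)   # 50.2 -- type-checks, still wrong
print(correct, swapped)
```

This is the class of refactoring error that lives entirely outside the type system; types narrow the space of mistakes, tests probe the behaviour that remains.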
Hardly pragmatic; this page seems to be little more than rah-rah Haskell evangelism. (And I like Haskell!)
Making it hard to YOLO your I/O does not seem to be paying off very well for Haskell; the cost in adoption often outweighs the gain in safety. Yes, Django codebases occasionally have bugs due to I/O happening at the wrong time, but that's pretty far down the list of causes of bugs in Django codebases.
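For what it's worth, the bug class being referred to looks roughly like this — a hypothetical plain-Python sketch rather than real Django code:

```python
import time

# Nothing in this signature says it performs I/O; in Haskell the
# return type would have to mention IO, and callers couldn't pretend
# otherwise.
def load_config():
    time.sleep(0)            # stand-in for a real disk or network read
    return {"debug": False}

# The classic wrong-time I/O bug: the "constant" runs at import time,
# so every process pays for the read whether or not it uses the value,
# and the value is frozen before a test gets any chance to stub it out.
DEFAULT_CONFIG = load_config()
```

Haskell's type system makes this shape of mistake hard to write; the commenter's point is that, empirically, it is a rare shape in the first place.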
Likewise the argument against macros seems to be driven by nothing more than personal taste. (And again, I dislike macros! But you have to actually make the argument why they're bad. Certainly my experience is that checking generated code into source control is worse, not better)
About macros: the author did argue why they're bad. It was actually the final point leading to the conclusion:
salience bias: measuring what is seen and not what is unseen.
And he did not say all macros are bad, giving the example of Common Lisp macros being good. Why? Because you can easily expand them (the language itself gives you the capability, not just an IDE... but of course with SLIME it's one shortcut away) and see the actual code you will get running.
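Python can't expand macros, but the same "look at what you generated" workflow can be approximated with its `ast` module — a toy transform, where `checked_mul` is a made-up helper name:

```python
import ast

source = "total = price * quantity"
tree = ast.parse(source)

class ExpandMul(ast.NodeTransformer):
    """A tiny 'macro': rewrite every multiplication into a call to a
    (hypothetical) checked_mul helper."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Mult):
            return ast.Call(
                func=ast.Name(id="checked_mul", ctx=ast.Load()),
                args=[node.left, node.right],
                keywords=[],
            )
        return node

expanded = ast.fix_missing_locations(ExpandMul().visit(tree))
# The Common Lisp payoff, approximated: inspect the generated code.
print(ast.unparse(expanded))  # total = checked_mul(price, quantity)
```

The `ast.unparse` call is the moral equivalent of `macroexpand`: you see the actual code that will run, instead of trusting the transform blindly.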
Generally in agreement with the article. This bit was confusing to me, though.
> Moving with the arrow keys, erasing text by holding backspace for an eternity feels slow, but it is predictable and reliable. Vim or Emacs-style editing, where you fly through the buffer with keybindings, feels faster, but a single wrong keypress puts your editor in some unpredictable state it can be hard to recover from.
I've been using neovim for about a year now. I don't think I've ever entered a state from a wrong keypress that is hard to recover from yet. In the worst cases, recovering takes about as long as the edit would have with a mouse.
I agree with everything, apart from metaprogramming. I'd say the issues with metaprogramming are issues with tooling (that you cannot easily inspect / debug the generated code).
This is why I love Go. Go feels like a language that is boring on purpose: few magic shortcuts, and few ways of misusing some special language feature. Just a lot of boring, regular code that is easy to refactor and maintain at scale (IMO).
It seems to me that the fundamental definition of a function/method should be a compile-time-executed function that always generates a syntax tree (assuming the compiler dog-foods itself, or can at least parse and interpret its own language). Much like Python's metaclasses or Rust's procedural macros, but assumed by default.
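Python's metaclasses, mentioned above, already give a taste of this: code that runs at class-definition time (the closest thing Python has to "compile time") and rewrites what gets created. A minimal sketch with made-up names:

```python
# A metaclass runs when the class statement executes -- effectively
# "compile time" for the class body -- and can rewrite the result.
class AutoRepr(type):
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        fields = [k for k in ns if not k.startswith("_")]

        def __repr__(self):
            vals = ", ".join(f"{f}={getattr(self, f)!r}" for f in fields)
            return f"{name}({vals})"

        cls.__repr__ = __repr__  # generated method, injected at definition
        return cls

class Point(metaclass=AutoRepr):
    x = 0
    y = 0

print(Point())  # Point(x=0, y=0)
```

The difference from the comment's proposal is that the metaclass manipulates live objects rather than a syntax tree; a procedural-macro-style design would hand it the unevaluated class body as an AST instead.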
It's kind of unfortunate that a language with managed effects & capabilities hasn't gone mainstream. Maybe it doesn't have the right ergonomics yet.
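One way to picture the capability half of that idea, as a plain-Python sketch (hypothetical names, not any real language's design): effects are only reachable through values a function is explicitly handed, so the parameter list doubles as an effect signature.

```python
import time

class Clock:
    """A capability: holding one is the only way to read the time."""
    def now(self) -> float:
        return time.time()

def make_report(data, clock: Clock) -> dict:
    # Without being handed a Clock, this function could not observe
    # the time -- its effects are visible in its signature.
    return {"items": len(data), "generated_at": clock.now()}

print(make_report(["a", "b"], Clock())["items"])  # 2
```

Python can't enforce the discipline (any function can `import time` behind your back), which is exactly the ergonomics-plus-enforcement gap a mainstream capability language would need to close.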