> My background is in OO programming, mostly using C#. C# being the versatile language it is, I have had the perception that whatever you do in other programming languages, you can with a little more code and hassle achieve in C# as well. If need be, I can program C# using a functional paradigm. And, of course I use recursion all the time. I know all there is to know about recursion.
IME there are two kinds of programmers: ones who learned a single language deeply and have almost religious confidence that it's the best thing ever, even if they try to approach other languages with an open mind (i don't blame them, i was like that a couple decades ago, too), and ones who understand that tools are just that - tools, which you should pick according to the problem you want to solve, because the other way around ends up in a tail-wags-the-dog situation. the blog post sounds like the author has realized that and started the journey from the first type to the second, which is to be commended. good job.
I see what you are saying, but I have also seen plenty of Python code written by what are clearly Java developers. And framework collectors who have learned the very basics of Django before they moved on to something new - and hence written a load of overcomplicated crap that could have been done a lot cleaner if they had learned the framework in more depth. Learning some things in depth definitely has benefits.
All that recursion, lazy evaluation, monads, lambdas, virtual methods and coroutines. In the end, they are all stack manipulation and jump instructions. High level languages are just a more convenient way of writing assembly.
I like to take that approach when comparing languages: look at what a program might do in terms of machine instructions, then see how I can make another language output similar instructions. The one that can do it the most naturally wins.
> i don't blame them, i was like that a couple decades ago, too
This seems like a fairly good indication there probably aren't really two kinds of programmers. Unless you experienced that particular change, from those particular initial conditions in some strikingly discrete fashion and also believe those are both universal.
So if you consider some stack to be inferior (maybe because it's highly inconsistent in its design, or it only runs on closed and locked-down platforms, or whatever), should you still just dismiss it as 'oh, it's just a tool' instead of accepting that it's shitty and makes you miserable when working with it? Why wouldn't I want to work with the best thing ever if it helps me keep my sanity every day?
Do you want to use something that makes you miserable? Is any language, including those designed by a 3-year-old, 'just a tool' that is to be used despite obviously being shitty? Never understood that argument.
This was such a nice read! I've only ever written Java & JavaScript really, and Haskell is always at the top of the list of things I want to learn!
When I got to the 7 line snippet at the end I was blown away by the elegance, and the fact that I couldn't really understand exactly what each thing did. So I decided to spend a few hours going through it character by character and documented my process here [1] in case it was useful to anyone else.
I'll continue by reading a book on Haskell, but I'm open to any advice from anyone on good ways to get into it!
[1] https://aurbano.eu/note/2019-12-18-journey-into-haskell/
using System.Collections.Generic;
using System.Linq;

IEnumerable<int> filter(int p, IEnumerable<int> xs) {
    foreach (var j in xs)
        if (j % p > 0) yield return j;   // keep j if p does not divide it
}

IEnumerable<int> sieve(IEnumerable<int> s) {
    var p = s.First();                   // head of the stream is prime
    yield return p;
    // nested yield: each prime adds another iterator layer
    foreach (var e in sieve(filter(p, s.Skip(1))))
        yield return e;
}

// the 11th prime
var n = sieve(Enumerable.Range(2, 10000)).Skip(10).First();
nth :: Integral a => Int -> Maybe a
nth n | n < 1     = Nothing
      | otherwise = Just (primes !! (n - 1))

primes :: Integral a => [a]
primes = sieve [2..]
  where sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]
> It is rather an attempt to get my head around functional programming, and to me Haskell doesn’t seem to have any practical application beyond that.
Cardano/ADA's core Ouroboros protocol was entirely written, with formal proofs, in Haskell. It is by far the most serious attempt at proof of stake in the crypto industry.
I think what you're really saying is that you won't find many jobs out there with Haskell as a requirement... but I wouldn't go as far as saying Haskell has no purpose past teaching FP. It certainly is the poster child of FP; however, it is not without practical/industrial use. You just have to look harder to see where it's being used.
>You just have to look harder to see where it's being used.
This is where advocacy slips over the line into a kind of blind faith evangelism -- with an added pinch of pedantry peculiar to our field.
I will state without proof that every language ever invented is currently being used for something practical somewhere. E.g. someone has a useful shell utility they wrote in Brainfuck that they run every day and that they love. This does not mean that BF has 'practical application' in the usual sense of the phrase. In the same way, just because there is a real program written in Haskell somewhere does not mean it has practical application.
IMHO a language (or a tool in general) should have a community of practitioners who regularly think in and create with the tool, and a good signal that that is happening is the number of open-source releases in that language. I have never in my life installed a Haskell program, or known of anyone installing one. (The exception being one Haskell fan at JPL who loved talking about the language and the lambda calculus, but who, to my knowledge, never wrote anything real in it - he only installed learning toys, not utilities.)
All that said, there are some good examples of unpopular tools that enjoyed success later in life. Quaternions lost to vectors historically, but graphics programmers rediscovered their benefits and resurrected them. Maybe Haskell will have its time, but that time is not now.
The crypto industry at its current stage is far from any practical purpose. It's just unfortunate that "Haskell in Practice" is associated with crypto BS.
One of the most beautiful introductions to recursion is the dragon stories in Chapter 8 of `Common Lisp: A Gentle Introduction to Symbolic Computation` by David S. Touretzky. [1] is the link to the free PDF.
[1] https://www.cs.cmu.edu/~dst/LispBook/book.pdf
(and yes, the nested yield is ABSOLUTELY an anti-pattern. They did promise to add a bulk yield at some point; I haven't used C# in a few years, so I don't know if I'm out of date!)
Depending on the particular language and platform, it can be quite dangerous to use recursion in production software due to the risk of a stack overflow. To be safe you have to first determine analytically that this can never happen. For algorithms that manipulate tree data structures it's often safer to avoid real recursion and instead sort of simulate recursion using a list or stack data structure allocated on the heap. At least that gives you a better opportunity to fail gracefully if the input data is too large to process within your resource constraints.
Tbf most code doesn't encounter trees deep enough to generate a stack overflow.
It's more of a concern with things like mutual recursion (multiplying your frame depth at each step, instead of just +1) or if your recursion depth is based on unbounded user input (eg iterating over an AST, or this code snippet)
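The heap-allocated-stack approach described above can be sketched as follows. This is a minimal illustration, not code from the thread: the `Node` class and `sum_tree` function are hypothetical names, shown in Python for brevity.

```python
# Iterative tree traversal using an explicit stack allocated on the heap,
# avoiding call-stack overflow on pathologically deep trees.
class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def sum_tree(root, max_nodes=1_000_000):
    total = 0
    stack = [root]            # the "simulated" recursion stack lives on the heap
    visited = 0
    while stack:
        node = stack.pop()
        visited += 1
        if visited > max_nodes:
            # fail gracefully instead of crashing with a stack overflow
            raise RuntimeError("input too large to process")
        total += node.value
        stack.extend(node.children)
    return total

# A chain 10,000 nodes deep: a naive recursive sum would exceed
# Python's default recursion limit (roughly 1,000 frames).
deep = Node(1)
tip = deep
for _ in range(10_000):
    tip.children.append(Node(1))
    tip = tip.children[0]
```

The `max_nodes` cap is the "fail gracefully within your resource constraints" part: the caller gets a catchable error rather than a process-killing overflow.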
I was thinking about this recently: is there ever a reason to put recursion in an everyday “workman” code base?
Seems like it would be so out of place in a real industry code base, like an infinite loop waiting to happen. There are always better, more readable and maintainable ways to accomplish the same thing.
Sure there is: it lets you express loops immutably.
Rather than `state := null; while condition do: mutate-state-and-recompute-condition`, you can do `let loop(state) = if shouldContinue(condition) then loop(newState) else resultOfTheLoop`. Rely on the tail-call optimiser to compile this to a genuine imperative loop.
This looks very odd the first few times you see it, but it's much harder to get wrong.
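As a concrete sketch of the pattern above (hypothetical function names; note that Python itself does not perform tail-call optimisation, so this only shows the shape - in a language with TCO the recursive version compiles to a genuine imperative loop):

```python
# The same loop written both ways: mutable state vs. state threaded
# through the arguments of a tail-recursive helper.

def sum_to_imperative(n):
    state, i = 0, 1
    while i <= n:                 # mutate-state-and-recompute-condition
        state += i
        i += 1
    return state

def sum_to_recursive(n, i=1, state=0):
    if i > n:
        return state              # resultOfTheLoop
    return sum_to_recursive(n, i + 1, state + i)   # loop(newState): a tail call
```

Nothing in the recursive version is ever reassigned, which is what makes it harder to get wrong.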
There are contexts where it's banned (MISRA, and other embedded scenarios). But it's very hard to work with tree structures without recursion - if you're not careful you just end up maintaining an explicit stack rather than using the call stack.
In situations where you think auto-vectorisation might help, you definitely want to do it as iteration.
This is partly why I like the system of transformers that LINQ is built out of; you can specify your query in a natural nice functional manner, and then let the optimizer convert it into a fast query.
Loops are also infinite loops waiting to happen, yet we still use them :)
The reason one might want to use recursion is when you're working on a data structure that is recursive in shape, like trees. And trees are very common, the Hacker News comments are one such example.
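For example, with a hypothetical nested-comment structure (plain dicts, made up for illustration):

```python
# A comment thread is a tree, so the code that walks it is naturally
# recursive: count a comment plus everything beneath it.
def count_comments(comment):
    return 1 + sum(count_comments(reply) for reply in comment["replies"])

thread = {
    "text": "parent",
    "replies": [
        {"text": "child 1", "replies": []},
        {"text": "child 2", "replies": [
            {"text": "grandchild", "replies": []},
        ]},
    ],
}
```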
Good developers can solve problems and be productive with any programming language/tools. Bad developers can't solve anything unless they use the one specific tool they know.
Clearly much worse than the original solution. Basically, a nested sequence of filters is built, one to filter out the multiples of each prime number. One way in which the complexity of this is bad: if you want the prime numbers < N, you only need to filter out primes smaller than sqrt(N). It is telling that the first solution contains a sqrt but the second one does not...
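The sqrt(N) observation can be sketched like this (a hypothetical `is_prime` helper, not code from the thread): a composite n always has a prime factor no larger than sqrt(n), so trial division can stop there.

```python
from math import isqrt

def is_prime(n):
    if n < 2:
        return False
    for d in range(2, isqrt(n) + 1):   # testing beyond sqrt(n) is redundant
        if n % d == 0:
            return False
    return True

primes_below_100 = [n for n in range(2, 100) if is_prime(n)]
```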
These kinds of functional solutions are very cute mathematically, but...
I once wrote this way of generating an infinite list of primes in unlambda. That was kind of interesting to do.
I'd say O(n) for both. The generated list only contains primes, so that's straightforwardly linear. Then there is the recursive sieving, which adds an additional pass over the sieve list for every prime we've found so far; we need to keep the computation of each pass in memory, so that's some additional memory that's also linear.
The time complexity is less obvious: each prime adds an additional pass to evaluate, so that seems linear, but each new pass only applies to a list that has already had all the previous passes filtered out, so it's not technically linear - I find it hard to determine how much it actually is.
cousin_it|6 years ago
https://www.cs.hmc.edu/~oneill/papers/Sieve-JFP.pdf
Melissa O'Neill shows that the typical FP sieve has worse big-O complexity than imperative.
https://wiki.haskell.org/Prime_numbers#Sieve_of_Eratosthenes
The Haskell wiki gives several sieve implementations and can't figure out the big-O complexity. (The imperative sieve is O(n log log n).)
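To make the difference concrete, here is a rough, hand-rolled operation count (an illustration under stated assumptions, not a proof; both function names are made up) comparing trial division - which is what the chain of nested filters effectively performs - with the true array-based sieve:

```python
def trial_division_ops(limit):
    # Count divisibility tests made by trial division against the primes
    # found so far (what the nested filters do: each candidate is tested
    # until some prime divides it, and primes survive every filter).
    ops, primes = 0, []
    for n in range(2, limit + 1):
        n_is_prime = True
        for p in primes:
            ops += 1
            if n % p == 0:
                n_is_prime = False
                break
        if n_is_prime:
            primes.append(n)
    return ops

def eratosthenes_ops(limit):
    # Count crossings-off made by the genuine sieve of Eratosthenes,
    # which only ever touches actual multiples of each prime.
    ops = 0
    composite = [False] * (limit + 1)
    for p in range(2, limit + 1):
        if not composite[p]:
            for m in range(p * p, limit + 1, p):
                ops += 1
                composite[m] = True
    return ops
```

Already at limit = 1000 the trial-division count is roughly an order of magnitude larger, which matches O'Neill's point that the filter-chain version has worse asymptotics than the imperative sieve.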
mettamage|6 years ago
Quite casual, just what I needed. Thanks for writing it up!
One blogging tip (note: the rest was perfect): if you'd chopped the following into 2 or 3 lines, the casual style would be perfect.
> primes n = take n $ sieve [2..] where sieve (p:xs) = p : sieve [x | x <- xs, x `rem` p > 0] sieve [] = []
I feel that it'd be really hard to do though.
butterisgood|6 years ago
Here is a python version.
https://www.google.com/amp/s/www.geeksforgeeks.org/python-pr...
martin1975|6 years ago
The second rule of recursion is we do not talk about recursion.
The third rule of recursion is without a base case, you have no recursion.
The fourth rule of recursion is it breaks you in two or more pieces.
meigwilym|6 years ago
The unknown unknowns become known.