
The Y Combinator (2008)

98 points | pimeys | 8 years ago | mvanier.livejournal.com

12 comments

[+] grdvnl|8 years ago|reply
That is a nice description of how to arrive at the Y-combinator.

There was a talk using similar examples which I think is very educational.

https://www.infoq.com/presentations/Y-Combinator

I wrote down the examples used in the video (given the poor quality of the video) to help myself learn about the Y combinator:

https://gist.github.com/gdevanla/9171085

Recently, I presented this topic to my team and wrote down a Python equivalent of this example.

https://gist.github.com/gdevanla/07a08d99e183f494d036c6d6fe6...
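Since the gist link above is truncated, here is a minimal sketch of what a Python version of this derivation typically ends up as (my own reconstruction, not the gist's actual contents). Because Python is strict, this is the Z combinator, the applicative-order variant of Y:

```python
# The Z combinator: the applicative-order (strict) variant of Y.
# The eta-expansion (lambda v: x(x)(v)) delays the self-application
# until the recursive function is actually called with an argument.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# A non-recursive "step" function: given (a stand-in for) factorial,
# it produces one more layer of factorial.
fact_step = lambda fact: lambda n: 1 if n == 0 else n * fact(n - 1)

factorial = Z(fact_step)
print(factorial(5))  # 120
```

Note that `fact_step` never refers to itself; all the recursion is manufactured by `Z`.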

[+] pimeys|8 years ago|reply
Do you always need some lazy evaluation to arrive at the Y combinator? It seems that even in the strict example, a lambda was used to keep the recursive call from evaluating too early.

Well, anyway, these are very interesting ideas to study, and they opened up some of the reasoning behind Haskell's laziness for me.
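To make the strictness point concrete, here is a sketch in Python (an eagerly evaluated language, so this is my illustration rather than anything from the article): the classic Y diverges because `x(x)` is reduced before `f` is ever applied, while eta-expanding that self-application into a lambda defers it, which is exactly the strict-friendly Z combinator.

```python
step = lambda fact: lambda n: 1 if n == 0 else n * fact(n - 1)

# Classic Y = λf.(λx. f (x x)) (λx. f (x x)): under eager evaluation,
# x(x) keeps expanding before f is ever applied, so this never returns.
Y = lambda f: (lambda x: f(x(x)))(lambda x: f(x(x)))
try:
    Y(step)
except RecursionError:
    print("classic Y diverges under strict evaluation")

# Z wraps x(x) in a lambda, so the self-application only happens once
# an argument is supplied -- a little laziness on demand.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))
print(Z(step)(5))  # 120
```

In a lazy language like Haskell, the un-expanded form works directly, which is one way its laziness shows up.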

[+] tromp|8 years ago|reply
The Y combinator may be pictured as

    ────┬────
    ┬─┬ ┼─┬─┬
    └─┤ │ ├─┘
      │ ├─┘  
      └─┘
according to the graphical notation for lambda terms at http://tromp.github.io/cl/diagrams.html
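For readers unfamiliar with the diagram notation, the term being pictured is the standard (Curry) fixed-point combinator:

```
Y = λf. (λx. f (x x)) (λx. f (x x))
```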
[+] westoncb|8 years ago|reply
Also, check out this video of the lambda terms involved in computing the factorial of three evolving through beta reductions, set to jazz music! It feels oddly fitting somehow: https://www.youtube.com/watch?v=Bt11BsAZMq8

I'd be curious to hear anyone's thoughts on the graphical notation here. I'd never seen it (or anything similar), and it's intriguing to me, but I wonder whether folks have gained some specific, concrete benefit from it (e.g. being able to look at one of these diagrams and figure out what's happening much more quickly).

[+] no_identd|8 years ago|reply
Speaking of "Recursion Without Really Recursing" and deep magic, check out this Scala-related paper, "Towards Strong Normalization for Dependent Object Types (DOT)":

http://drops.dagstuhl.de/opus/volltexte/2017/7276/pdf/LIPIcs...

Here are some choice quotes from it:

"We further showed that certain variants of DOT’s recursive self types can be integrated successfully while keeping the calculus strongly normalizing. This result is surprising, as traditional recursive types are known to make a language Turing-complete."

"What is the minimum required change to achieve Turing-completeness? Consistent with our expectations from traditional models of recursive types, we demonstrate that recursive type values are enough to encode diverging computation."

"But surprisingly, with only non-recursive type values via rule (TTYP), we can still add recursive self types to the calculus and maintain strong normalization."

"Can we still do anything useful with recursive self types if the creation of proper recursive type values is prohibited? Even in this setting, recursive self types enable a certain degree of F-bounded quantification, ..."

I wish someone would try to implement Robinson arithmetic (see https://en.wikipedia.org/wiki/Robinson_arithmetic and https://ncatlab.org/nlab/show/Robinson+arithmetic ) in this. I don't think it would work, for reasons I suspect are fairly obvious, but it would be interesting to see exactly where the attempt fails.

[+] catnaroek|8 years ago|reply
> Strong typing simply means that every value in the language has one and only one type, whereas weak typing means that some values can have multiple types.

Aha! Now, any system with subtyping is “weakly typed”! For example, System F-sub!
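A quick illustration of the objection (my own sketch, using Python's runtime subtyping rather than a static system like F-sub): under subtyping, a single value inhabits more than one type, so the article's definition would classify any such system as "weakly typed".

```python
# With subtyping, one value belongs to several types at once:
# every Dog is also an Animal. Under the article's definition
# ("every value has one and only one type"), that alone would
# make a subtyped language "weakly typed".
class Animal:
    pass

class Dog(Animal):
    pass

rex = Dog()
print(isinstance(rex, Dog), isinstance(rex, Animal))  # True True
```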

[+] wgjordan|8 years ago|reply
> Even though we often refer to Y as "the" Y combinator, in actual fact there are an infinite number of Y combinators.

Drop the "the." Just "Y combinator." It's cleaner.

[+] inteleng|8 years ago|reply
Can you provide a reasonable justification for this?