jmj|5 days ago
This can be proved by induction, or you can cite Craig's theorem (the lesser-known one) for it. See [1]
Honestly, I don't see the endgame here.
[1] https://math.stackexchange.com/questions/839926/is-there-a-p...
ezwoodland|5 days ago
jmj|4 days ago
I see the endgame now, thanks guys.
wasabi991011|4 days ago
This is somewhat covered in the first link of the background, a Modern Introduction to Combinators:
> So how might this work for S combinator expressions? Basically any sophisticated computation has to live on top of an infinite combinator growth process. Or, put another way, the computation has to exist as some kind of “transient” of potentially unbounded length, that in effect “modulates” the infinite growth “carrier”.
> One would set up a program by picking an appropriate combinator expression from the infinite collection that lead to infinite growth. Then the evolution of the combinator expression would “run” the program. And one would use some computationally bounded process (perhaps a bounded version of a tree automaton) to identify when the result of the computation is ready—and one would “read it out” by using some computationally bounded “decoder”.
I understand it as "you don't need the S combinator application sequence to halt for it to compute something".
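As a minimal illustration of why growth is the default (my own sketch, not code from the linked article): the single rewrite rule S x y z → x z (y z) consumes one S leaf but duplicates the third argument z, so the expression's size changes by |z| − 1 per step and grows whenever z is compound.

```python
# Toy sketch (not from the article) of one-step leftmost S-reduction.
# Terms: the string "S" is the combinator; a 2-tuple (f, a) is the application f a.
# The redex S x y z, i.e. ((("S", x), y), z), rewrites to ((x, z), (y, z)).

def step(t):
    """Return (reduced_term, True) after one leftmost S-step, or (t, False)."""
    if isinstance(t, tuple):
        f, z = t
        if isinstance(f, tuple) and isinstance(f[0], tuple) and f[0][0] == "S":
            (_, x), y = f
            return ((x, z), (y, z)), True   # duplicate z, drop the S
        f2, changed = step(f)               # try the operator first (leftmost)
        if changed:
            return (f2, z), True
        z2, changed = step(z)               # then the operand
        return ((f, z2) if changed else t), changed
    return t, False

def size(t):
    """Number of leaves in the expression tree."""
    return size(t[0]) + size(t[1]) if isinstance(t, tuple) else 1

# S a b (c d): the compound third argument (c d) is duplicated, so size 5 -> 6.
t = ((("S", "a"), "b"), ("c", "d"))
t2, _ = step(t)
print(size(t), "->", size(t2))   # 5 -> 6
print(t2)                        # (('a', ('c', 'd')), ('b', ('c', 'd')))
```

On this reading, a "program" is a choice of growing expression, and the output is a transient pattern you decode out of the ever-larger term rather than a normal form you wait for.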
v64|5 days ago
cvoss|5 days ago
Consider a computational model that, rather than working by successively rewriting an expression in a way that honors some equivalence relation over expressions, works by explicitly building the sequence of such expressions. In that kind of system, every computational state properly contains the previous state. Things grow and grow and never get "deleted". Yet such a system can clearly be universal.
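A tiny sketch of that idea (my illustration, using a toy rewrite rule that is not from the thread): the machine's state is the whole append-only trace, each state properly contains every earlier one, and the answer is read off the latest entry.

```python
# Sketch of an append-only model of computation: instead of rewriting an
# expression in place, the machine's state is the entire history of
# expressions. Each step appends a new expression derived from the last one;
# nothing is ever mutated or deleted.

def inc(bits):
    """Toy rewrite rule: binary increment on a bit string, e.g. '11' -> '100'."""
    b = list(bits)
    i = len(b) - 1
    while i >= 0 and b[i] == "1":   # flip trailing 1s to 0
        b[i] = "0"
        i -= 1
    if i < 0:
        return "1" + "".join(b)     # carry out of the top bit
    b[i] = "1"
    return "".join(b)

def run(rule, init, steps):
    """Build the append-only trace [init, rule(init), rule(rule(init)), ...]."""
    trace = [init]
    for _ in range(steps):
        trace.append(rule(trace[-1]))
    return trace

trace = run(inc, "0", 5)
print(trace)   # ['0', '1', '10', '11', '100', '101']
# Every prefix trace[:k] is a complete earlier state of the machine, yet the
# result of the computation (here, counting to five) is read from trace[-1].
```

Swap `inc` for a universal rewrite rule and the same append-only architecture computes anything, which is the point: monotone growth is no obstacle to universality.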
tromp|5 days ago
unknown|5 days ago
[deleted]