I've said it before and I'll say it again: go and learn Brainfuck. Just write a program that outputs your name or does FizzBuzz or something trivial, that's enough to get a grasp of it.
Brainfuck is to Turing machines what Lisp is to Lambda calculus. Minimum viable amount of syntactic sugar (and a bit of "semantic sugar" as well, it's nicer to work in bytes than bits) to make it usable as a programming language.
Lambda calculus is just a bit more practical formalism than Turing machines, that's why Lisp is genuinely useful as a programming language, unlike Brainfuck. That doesn't make learning about Turing machines any less useful as a brain teaser exercise.
BrainFuck is quite far removed from a Turing Machine.
In BrainFuck, the "data" (the memory cells being mutated by the program) are completely separate from the code. As a consequence, BrainFuck code cannot self-modify and it has to maintain two pointers (instruction pointer and memory pointer). BrainFuck also allows the instruction pointer to jump arbitrarily far (i.e. from a "]" back to the corresponding "[").
In contrast, a Turing Machine makes no separation between code and data; it just has a tape (the machine's transition table is more like a language's interpreter than a program). Not only can a program modify itself; for many tasks it must. Since the tape head only moves one cell at a time, moving between two arbitrary cells requires visiting all those in between, and whenever we traverse a cell we must also modify it in preparation for any subsequent traversal.
Whilst BrainFuck is certainly a nice idea, I don't think it's the best example of anything in particular.
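The two-pointer structure described above is easy to see in code. Here is a minimal Brainfuck interpreter sketch in Python (illustrative only, not any particular implementation; input `,` and bounds checks are omitted): the instruction pointer `ip` walks the immutable program while the data pointer `dp` walks the mutable tape, and a precomputed jump table lets `]` leap arbitrarily far back to its `[`.

```python
def run(code, tape_len=30000):
    # Pre-match brackets so "]" can jump arbitrarily far back to its "[".
    jumps, stack = {}, []
    for i, c in enumerate(code):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i

    tape = [0] * tape_len            # mutable data, separate from the code
    out = []
    ip = dp = 0                      # instruction pointer, data pointer
    while ip < len(code):
        c = code[ip]
        if c == ">":
            dp += 1
        elif c == "<":
            dp -= 1
        elif c == "+":
            tape[dp] = (tape[dp] + 1) % 256   # byte cells, not bits
        elif c == "-":
            tape[dp] = (tape[dp] - 1) % 256
        elif c == ".":
            out.append(chr(tape[dp]))
        elif c == "[" and tape[dp] == 0:
            ip = jumps[ip]           # skip the loop body
        elif c == "]" and tape[dp] != 0:
            ip = jumps[ip]           # jump back to the matching "["
        ip += 1
    return "".join(out)
```

Even this toy version has to carry both pointers and the jump table, machinery a Turing machine does not need.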
I've always thought the Turing LISP was Forth. It's a tiny stack machine with few ops plus some LISP capabilities. Also implemented in efficient hardware.
One of my favourite examples is the blank C program that won the "Worst Abuse of the Rules" award in the 1994 Obfuscated C Code contest (last entry here):
Can someone provide examples of programming forms, practices, or languages intentionally born esoteric that eventually turned mainstream? Nothing on this page jumped out at me. It seems like sometimes an idea is so odd that it gets widespread attention just due to its oddness.
There's at least one on that list that's (at least in my experience) gone from relatively mainstream to esoteric - self-modification.
Back in the day when both memory and clock cycles were very precious, it wasn't unknown to use self-modifying code as a performance optimisation trick. I did it at least once in the late 80s, when I was working on comms software that had to be as fast as possible in order to avoid missing incoming data.
There was a check that needed to be done on every byte - I think it was whether I was now processing graphics characters or not - but the check was taking valuable time, and the value didn't change very often.
So the most efficient way I found to do it was to wait until I got a "switch to/from graphics" byte in the input stream and then update the instruction at a given location to either be "unconditional jump to graphics routine" or a "no operation (NOP)", which passed straight through to the routine for normal characters.
It was a horrible hack, but it worked.
Thankfully, I've not felt the need to even consider this approach for the past 20 years.
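The original was machine-code patching, but the idea translates to a rough Python analogue (all names here are hypothetical, for illustration): instead of testing a mode flag on every byte, swap the handler once when the rare mode-switch byte arrives, so the hot path carries no check at all.

```python
SWITCH = 0x1B  # hypothetical "switch to/from graphics" byte

def handle_normal(b, out):
    out.append(("text", b))

def handle_graphics(b, out):
    out.append(("gfx", b))

def process(stream):
    out = []
    handler = handle_normal          # the "instruction" we patch
    for b in stream:
        if b == SWITCH:
            # Rare case: rewrite the dispatch target once, like patching
            # the jump/NOP instruction in the original trick.
            handler = (handle_graphics if handler is handle_normal
                       else handle_normal)
        else:
            handler(b, out)          # hot path: no per-byte mode check
    return out
```

The difference is that here we mutate a function reference rather than the instruction stream itself, which is the respectable modern cousin of the same optimisation.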
Object Oriented Programming? I'm only half-kidding/-serious, btw. But it seems to me that there are a lot of things that become mainstream after an initial bump of esoterica .. React? Lua? Isn't this sort of the norm for new technologies ..
Duff's Device comes close. It was born out of a real engineering problem, but spread as esoterica and I'm sure I'm far from the only one to learn it as esoterica and later use it in production.
UNIX was born as what you could consider a relatively esoteric response to the more popular MULTICS project. Same with B/C and BCPL. But I think that considering C and UNIX esoteric is a bit of a stretch, taking into account that they were developed at Bell Labs and almost everything related to computers could be considered esoteric back then.
Oh, I forgot the most prominent one as of late: Urbit, which started as a functional esoteric language with a Martian aesthetic... written by a guy famous for... other things... and... somehow hopped on the blockchain gravy train? last I heard it was... auctioning address space? for surprisingly large amounts of money?
INTERCAL is famous for having a COMEFROM instruction, as an inverse of GOTO.
With GOTO, we can read a section of code and if it contains "GOTO N" then we know the execution will jump to location "N", so we can look up that location and keep reading. We have no idea if any other code will jump into the code we're currently reading, unless we search for "GOTO <location we're reading>".
COMEFROM is the opposite: when we're reading a section of code and we see "COMEFROM N", we know that the code at location "N" will jump to this section. We have no idea if the code we're reading will jump to somewhere else, unless we search for "COMEFROM <location we're reading>".
Despite being invented as a joke, this is very popular in mainstream programming, under the name "exception handling" ;)
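The analogy can be made concrete in a few lines of Python: the raising code, like a GOTO site read in isolation, says nothing about where control will land; only the handler declares which code it intercepts.

```python
def parse_digit(s):
    # May raise ValueError. Nothing here names a destination: reading this
    # function alone, we cannot tell where control goes on failure.
    return int(s)

def read_value(s):
    try:
        return parse_digit(s)
    except ValueError:
        # Effectively "COMEFROM anywhere inside the try block": this is the
        # only place that declares it will receive control from that code.
        return None
```

As with COMEFROM, you can only discover who intercepts a given raise by searching outward for the handlers that enclose it.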
There was some unix utility (`yes`?) that was implemented on many systems as an empty shell script. I read about in in an HN comment, can't remember which.
Oh, and I wrote small web apps in Camping.rb. Working with _why's code was a great pleasure, especially in those rare cases when the documentation wasn't good enough and I had to dig through the code.
> I'd like to see one where the entire code is executed, or one done in Brainf*ck, as well.
I might be wrong, but this seems a really bad example for palindromes.
Aren't all palindrome strings made of `+-.,<>` valid brainfuck programs? I know that this subset is not turing-complete, but I also think that any valid brainfuck program with `[]` can't be a palindrome, which again would make this language a bad example.
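That intuition checks out: in a palindrome, each character mirrors onto the same character, so the subsequence of brackets is itself a palindrome, while a nonempty balanced bracket sequence must start with `[` and end with `]`, which a palindrome cannot. A small exhaustive check in Python (sketch, illustrative names):

```python
from itertools import product

def balanced(s):
    # Valid Brainfuck requires the brackets to form a balanced sequence.
    depth = 0
    for c in s:
        if c == "[":
            depth += 1
        elif c == "]":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

def counterexamples(max_len=8, alphabet="+[]"):
    # Search for a palindrome that contains a bracket yet is still a
    # balanced (i.e. valid) Brainfuck program. Per the argument above,
    # none should exist.
    found = []
    for n in range(1, max_len + 1):
        for t in product(alphabet, repeat=n):
            s = "".join(t)
            if ("[" in s or "]" in s) and s == s[::-1] and balanced(s):
                found.append(s)
    return found
```

Running this over all strings up to length 8 turns up nothing, consistent with the claim that only the bracket-free subset yields palindromic programs.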
Not to excuse his occasional behavior, but he has his good moments. Some of his writings and interviews are pretty ingenious. Not to mention his projects, which I sometimes enjoy studying.
You gotta give him some benefit of the doubt. He has a genuine, diagnosed mental illness. And even if he did say racist, misogynist and other derogatory things, some of it was years ago and he might have changed as a person (hopefully for the better). The memory of the Internet is sometimes too long.
I gotta agree with him about one thing: computers are best when they have 16 colors (I spend most of my time in xterm with my favorite sixteen colors).
Ignoring the guy himself, I think the unique thing about it is why he did it. The motivation to achieve spiritual goals through programming is one that doesn't get tapped a lot, despite its potential to bring in contributions. There are definitely software developers making products to serve religious markets. I'm talking more about tapping into that desire to make a difference and do positive things with one's time, with an extra layer of benefit from religious belief.
The moderators or leaders need above-average talent for such places, though. ;)
that guy is insane - it's a constant reminder to never get too overzealous about the value of your work. But I do have to admit, gotta love his intensity.... but he's a bigot so fuck him lol.
Funny how non-imperative programming languages, FP for example, are in the "esoteric" group. Looking at the industry today, that no longer seems to be true.
tromp|9 years ago
It's more like what Binary Lambda Calculus is to lambda calculus. See my IOCCC entry
http://www.ioccc.org/2012/tromp/tromp.c
http://www.ioccc.org/2012/tromp/hint.html
which happens to include a Brainfuck interpreter...
YeGoblynQueenne|9 years ago
Indeed, the lambda calculus was never meant as a real programming language.
chriswarbo|9 years ago
I wrote up some thoughts on this a few years ago http://chriswarbo.net/blog/2014-12-22-minimal_languages.html
MaulingMonkey|9 years ago
http://maulingmonkey.com/brainfuck/
Ahh... it's a work in progress. I should probably add an option to disable my optimization passes, for starters...
yoha|9 years ago
However, it seems like the page linked in the post is not available anymore.
robert_tweed|9 years ago
http://web.archive.org/web/20020615054101/http://www.nyx.net...
It's a Quine, a Palindrome and a Null Program all in one!
jnordwick|9 years ago
https://en.wikipedia.org/wiki/K_%28programming_language%29
https://en.wikipedia.org/wiki/J_%28programming_language%29
qznc|9 years ago
It is mainstream now that it is integrated into Java, C#, etc.
It was considered esoteric 20 years ago. Haskell initially had the slogan "avoid success at all costs".
SQL, HTML, CSS were probably esoteric at some point.
nl|9 years ago
Five or so years ago it was an esoteric language for statisticians, which had been around forever (especially if you count its predecessor, the S language).
Now the IEEE ranks it the 5th(!) most important (or something?) language[1], and it's the fastest growing language on StackOverflow[2].
That's a pretty amazing change for a language which has few redeeming factors (ha!) as a language, apart from lots of very useful libraries.
[1] http://blog.revolutionanalytics.com/2016/07/r-moves-up-to-5t...
[2] http://jkunst.com/r/what-do-we-ask-in-stackoverflow/
defen|9 years ago
https://en.wikipedia.org/wiki/FRACTRAN