
The Nature of Lisp

202 points | llambda | 13 years ago | defmacro.org

77 comments

[+] akkartik|13 years ago|reply
Part of the problem is that lisp evangelism sets itself up to fail. An instantaneous blinding moment of enlightenment, would you like fries with that? Haven't they heard that you shouldn't start a joke with "This is the most hilarious thing ever"?

I've been doing lisp for several years now. I've built several interpreters. I've never had the enlightenment he describes. The minor epiphanies have been on par with oo design and unit tests. I've travelled far over the months, but it's closer to grok than zen.

[+] ChuckMcM|13 years ago|reply
Enlightenment epiphanies result in proselytizing. Can't be helped; it's like how tapping your knee with a rubber mallet makes your leg kick out.

The 'secret', the thing that most people don't get early on when programming, is that code is data and data is code. A binary tree is data carefully surrounded by the semantics of the data's relationship with its peers: reading the structure reads out the data in sorted order. Lisp just makes it painfully clear that there is no distinction between state and semantics as far as computers are concerned, and it allows you to move the 'computation' between data structure and algorithm at any point.
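
The binary-tree point can be made concrete in a few lines; this is a toy sketch (names and code are illustrative, not from the thread) where the shape of the data *is* the sorting algorithm:

```python
# A binary search tree stored as nested tuples: (left, value, right).
# The structure encodes the semantics: an in-order walk reads the
# values back out in sorted order, with no separate "sort" step.
def insert(node, value):
    if node is None:
        return (None, value, None)
    left, v, right = node
    if value < v:
        return (insert(left, value), v, right)
    return (left, v, insert(right, value))

def in_order(node):
    if node is None:
        return []
    left, v, right = node
    return in_order(left) + [v] + in_order(right)

tree = None
for x in [5, 2, 8, 1, 9]:
    tree = insert(tree, x)
print(in_order(tree))  # [1, 2, 5, 8, 9]
```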

A grad student at USC explained it well when he described it like learning your third or fourth spoken language: suddenly your brain "flips" from having three or four different ways of naming 'milk' into a single concept of milk with an infinite number of ways to identify it. The relationship between the root concept and the expression of that concept changes precedence in your thought process.

Once you have made that switch you can write code in any computer language.

[+] aerique|13 years ago|reply
Nice to hear I'm not the only one. I've been using Lisp (mainly Common Lisp) for quite a few years now but never had that flash of enlightenment either. Discovering Lisp always seemed more like coming home: "Ah, this is how I always thought programming was supposed to be!"

No fighting with the compiler or being limited by what the PL designer thought you should do, just a pretty direct path from thought to code.

[+] gliese1337|13 years ago|reply
Reading the related article on writing a Lisp interpreter in Haskell (http://news.ycombinator.com/item?id=4764088) reminded me of my second blinding moment of enlightenment: understanding vau expressions. Things that can't be implemented as functions are typically things that require controlling the evaluation of arguments (conditionals, assignment, short-circuiting boolean operators, etc.), and additional language features (built-in special forms, or macros for writing your own) are included to handle those. But if you have something that allows you to control the evaluation of arguments, simply choosing to evaluate all your arguments gives the equivalent of a function. Implement that thing, and your compiler/interpreter no longer needs to know about the difference between functions and macros and built-in forms; they're all the same thing!

There's not a lot of practical use for that kind of thing that I am aware of (implementing run-time macros is one, being able to pass short-circuiting boolean operators to map, reduce, etc. is another), but I strongly suspect that's just because we don't have 30 years of collective experience figuring out all of the great things about vau expressions like we have with Lisp and anonymous functions. The only language (discounting toy projects) I know of that actually implements them is Kernel (http://web.cs.wpi.edu/~jshutt/kernel.html).
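
The idea sketches out in a few lines of Python (a toy illustration in the spirit of Kernel's operatives; all names here are invented, and real Kernel is far richer):

```python
# A tiny evaluator where the ONLY kind of combiner is an "operative",
# which receives its operands unevaluated plus the environment.
# An ordinary function is just an operative wrapped so that it
# evaluates its operands first -- so the evaluator itself never
# distinguishes functions from special forms.

def evaluate(expr, env):
    if isinstance(expr, str):          # symbol: look it up
        return env[expr]
    if not isinstance(expr, list):     # literal
        return expr
    combiner = evaluate(expr[0], env)
    return combiner(expr[1:], env)     # operands passed UNevaluated

def wrap(fn):
    """Turn a plain Python function into an applicative: an operative
    that happens to evaluate all of its operands."""
    return lambda operands, env: fn(*[evaluate(o, env) for o in operands])

def op_if(operands, env):
    # $if is an ordinary operative: it decides which branch to
    # evaluate, so it needs no special case in `evaluate`.
    test, then, alt = operands
    return evaluate(then if evaluate(test, env) else alt, env)

env = {
    "$if": op_if,
    "+": wrap(lambda a, b: a + b),
    "<": wrap(lambda a, b: a < b),
}
# The unbound symbol "boom" is never evaluated, because $if controls
# evaluation of its operands:
print(evaluate(["$if", ["<", 1, 2], ["+", 10, 1], "boom"], env))  # 11
```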

[+] Peaker|13 years ago|reply
A language that is lazy by default lets you control the evaluation of arguments. Ordinarily, they're not evaluated, and if you force them they are.

However, macros are not just about whether to evaluate -- but about exposing the internal syntactic structure of the arguments.

In Haskell, using laziness you can implement control flow, short-circuiting, etc. If you want functions that work with the syntactic structure of their arguments, you need heavier machinery:

* Thick DSLs: Define explicit AST types and have a DSL that explicitly constructs those AST's.

* Template Haskell (the arguments' syntax has to be in ordinary Haskell)

* Quasiquotes (Need to parse strings)

I think the need for exposed syntax is relatively rare (e.g: a function that evaluates and shows a trace of the evaluation). In those cases, I think explicit AST types work pretty well, as Haskell has extremely light-weight syntax for constructing user-defined data types.
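
Laziness-as-control-flow can even be faked in a strict language by passing thunks explicitly, which makes the point concrete (a sketch of mine, not from the comment; Haskell does this implicitly and without the syntactic overhead):

```python
# Passing zero-argument lambdas (thunks) instead of values lets the
# callee decide what gets evaluated -- the strict-language analogue of
# defining control flow with ordinary lazy functions.

def my_if(cond, then_thunk, else_thunk):
    return then_thunk() if cond else else_thunk()

def safe_div(a, b):
    # When b == 0, the division thunk is never forced,
    # so no ZeroDivisionError can occur.
    return my_if(b == 0, lambda: None, lambda: a / b)

print(safe_div(10, 2))  # 5.0
print(safe_div(10, 0))  # None
```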

[+] sjmulder|13 years ago|reply
It would seem to me that some variety of Lisp would be the ideal candidate as a sort of runs-everywhere language, a thin portable base language that runs on top of different runtimes, offering easy integration with whichever it is running on.

Basically, something like a minimalist Clojure but not just for Java. It would be able to run atop the CLR, JavaScript or the Objective-C runtime as well. The interface with the host platform may be different, as long as the core language works everywhere. Ideally the core would be tiny.

[+] ChuckMcM|13 years ago|reply
Forth.

Constructing the basic machine is trivial, then the rest just comes along with it.
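
How trivial the basic machine is can be seen from a toy sketch (mine, not ChuckMcM's; a real Forth also has a return stack, a compiler, and user-defined words):

```python
# A minimal Forth-style machine: a data stack plus a dictionary of
# words. Everything else in a Forth system is defined on top of this.

def run(program, words=None):
    stack = []
    words = words or {
        "+":   lambda s: s.append(s.pop() + s.pop()),
        "*":   lambda s: s.append(s.pop() * s.pop()),
        "dup": lambda s: s.append(s[-1]),
        ".":   lambda s: print(s.pop()),
    }
    for token in program.split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))  # anything else is a literal
    return stack

print(run("3 dup * 4 +"))  # [13], i.e. 3*3 + 4
```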

[+] danenania|13 years ago|reply
Clojurescript compiles seamlessly to javascript, which has become almost universal. It provides a beautiful, battle-ready lisp.
[+] bitops|13 years ago|reply
Very good article, though I doubt it'll convince the usual mass of unbelievers. (I love Lisp, for the record, though my primary exposure has been through Emacs Lisp - so shoot me).

A really great book that helps you appreciate the concepts in Lisp, without really talking about Lisp directly too much, is "Patterns of Software" by Peter Gabriel. http://amzn.to/TxDKGG

I found it to be a very enlightening read. Definitely a book you have to sink into with plenty of time and quiet.

[+] blue1|13 years ago|reply
Richard Gabriel. Though Peter Gabriel would be good too :)
[+] qiemem|13 years ago|reply
For whatever it's worth, this article actually convinced me to take the plunge when I first encountered it.
[+] Ingaz|13 years ago|reply
>>the concepts in Lisp, without really talking about Lisp directly too much

A book with the same quality is the "ANTLR definitive guide".

When I read it I was like: "Ha! It sounds like LISP!", "Ha! It sounds like FORTH!", "Ha! It sounds like Prolog!"

And Terence Parr mentioned none of them.

Amazing!

[+] jacques_chester|13 years ago|reply
Please don't use link shorteners, especially ones which don't allow me to edit your link to remove the kickback if that's what I want to do.
[+] nnq|13 years ago|reply
...a bit offtopic, but I was wondering while reading the example of using C itself as the C preprocessor language: why don't languages provide the ability to do this kind of thing automagically, I mean marking some code to be executed at compile time and act as a code generation feature? (I know, it's easy enough to write a small preprocessor that does it, and it's just primitive string based macros, but having a standard way to do it baked into the building tools or the interpreter for an interpreted language seems ...neat ...even cool if some more "magic sauce" would be added to it to make these "macros" hygienic :) ).
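The "small preprocessor" the comment mentions really is a few lines; here is a toy string-based version (the "#!" marker and all names are invented for illustration, and a real system would need hygiene and proper parsing):

```python
# Treat any line starting with "#!" as code to run at "compile time",
# splicing whatever it prints into the output -- the primitive
# string-based macro scheme the comment describes.

import io, contextlib

def preprocess(source):
    out_lines = []
    for line in source.splitlines():
        if line.startswith("#!"):
            buf = io.StringIO()
            with contextlib.redirect_stdout(buf):
                exec(line[2:])            # run the generator code now
            out_lines.append(buf.getvalue().rstrip("\n"))
        else:
            out_lines.append(line)
    return "\n".join(out_lines)

src = "x = [\n#!print(',\\n'.join(str(i * i) for i in range(3)))\n]"
print(preprocess(src))
```

Running it splices the generated squares between the brackets, producing `x = [`, then `0,`, `1,`, `4`, then `]`.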
[+] merijnv|13 years ago|reply
> why don't languages provide the ability to do this kind of thing automagically, I mean marking some code to be executed at compile time and act as a code generation feature?

There are certainly already languages that do this type of thing. Haskell has Template Haskell, which lets you execute Haskell code at compile time to generate code. I'm pretty sure multiple MLs also have similar meta-programming features.

It works rather nicely, actually.

[+] jon6|13 years ago|reply
To understand Lisp is to understand interpreters. With that understanding you can create domain specific languages which is extremely powerful.

But I wouldn't recommend using Lisp itself; macros in particular are unhygienic.

[+] p4bl0|13 years ago|reply
Common Lisp != Lisp. You mean Common Lisp; Lisp is the family of languages (which also includes the Scheme sub-family, Racket, Clojure, Arc, Kernel…).
[+] eblume|13 years ago|reply
I am admittedly still a Lisp (et al.) rookie, but isn't the entire point of Scheme that it introduces hygienic macros? Or are you referring to some other (perhaps sarcastic) notion of macro hygiene?
[+] projectileboy|13 years ago|reply
You can do cool stuff with unhygienic macros, however, like anaphoric macros. Interested readers should check out On Lisp by Paul Graham, as well as Let Over Lambda by Doug Hoyte.
[+] myoffe|13 years ago|reply
Also, this is a very interesting article on why lisp is unsuccessful in the "real world": http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

I like the original article a lot, but what it failed to do for me is convince me why someone like me, a typical programmer, would want to choose Lisp over Python/Ruby/etc to solve a real world problem. Both Ruby and Python have powerful meta-programming abilities built into them. Lisp should be compared with these, not with C.

I still think that functional programming is extremely interesting (I'm in the long process of learning Haskell myself) and is useful in certain real-world cases, but I was not convinced by this article. All the problems there are easily solved in modern dynamic languages.

[+] secure|13 years ago|reply
I found that a rather good introduction to code as data, but I am not sure whether I am supposed to have been hit by the enlightenment he describes… :-)
[+] S4M|13 years ago|reply
Yes, the first time I read about the Lisp syntax I was thinking:

"oh cool, it makes (+ 2 2) exactly equivalent to the syntax tree

       +
      / \
     2   2
"

But I don't find it particularly enlightening and I still don't see what cool stuff you can do with macros that you can't do elsewhere.
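
What the (+ 2 2) example buys you shows up once the tree is something your program can manipulate; here is a toy sketch in Python (all of it invented for illustration) where a "macro" is just an ordinary function that rewrites the tree before evaluation:

```python
# The program IS the tree: here (+ 2 2) is a plain Python list, and
# "running" it is a short walk over that data.

import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(tree):
    if not isinstance(tree, list):
        return tree
    op, *args = tree
    return OPS[op](*[evaluate(a) for a in args])

# A trivial "macro": rewrite (square x) into (* x x) before evaluating.
def expand(tree):
    if isinstance(tree, list) and tree and tree[0] == "square":
        return ["*", expand(tree[1]), expand(tree[1])]
    if isinstance(tree, list):
        return [tree[0]] + [expand(a) for a in tree[1:]]
    return tree

print(evaluate(["+", 2, 2]))                       # 4
print(evaluate(expand(["square", ["+", 1, 2]])))   # 9
```

Because code and data share one representation, the rewrite step needs no separate parser or template language; that is the part that is genuinely awkward to replicate elsewhere.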

[+] agumonkey|13 years ago|reply
Seems like Slava Akhmechet's worldwide celebration.
[+] ekm2|13 years ago|reply
Total ignoramus here: Does this "profound enlightenment" actually lead to profound execution?
[+] aerique|13 years ago|reply
As I've written in another comment, I have never experienced the profound enlightenment, but what do you mean by "profound execution"?

I can tell you that Common Lisp gets me the quickest results going from idea to prototype; it is a very practical language and doesn't get in the way. However, a large part of this is experience. It was the most fun language for me to learn and apply to projects, though.

[+] klibertp|13 years ago|reply
This is my third time reading this article; this time I stopped reading after a few paragraphs, but still skimmed it to refresh some things in memory. This is a very good article, and one I would recommend to anyone, were it not for its length - these days I guess half of the responses would be "tl;dr", sadly.

It's one of the articles that convinced me to take a look at Lisp a few years back, which caused me to learn Scheme rather than Common Lisp or Emacs Lisp (I think Clojure was not around yet). I invested half a year in learning PLT Scheme/Racket and felt enlightened quite a few times along the way. First-class continuations were the most mind-blowing thing and I spent a few weeks trying to understand them. To prove to myself that I knew what call/cc (or rather, its delimited brethren) was all about, I wrote Python-style generators using them, and this was one of the most rewarding experiences in programming for me.

Then I moved on, to Erlang IIRC, which was much easier to understand and use after being exposed to Scheme. In the following years I learned many more languages, all the while aiming for "purity" of concepts and knowing full well that I wouldn't be able to use any of them in the real world. Many programmers would call Smalltalk a toy language - at best - but I had a great time learning it and expanding my views on OOP, for example. I thought that the compromises widely used languages make cause them to represent only a piece of what is possible, even if they are called "multi-paradigm", and I wanted to explore more.

All this time I was writing commercial software in Python; I can't say if the other languages I learned made me a better programmer - from the business perspective - but some really helped me expand my understanding of what I do. Forth and Lisp and Smalltalk did this, and I was perfectly happy to stop using any of them after gaining some "enlightenment". They were not practical, not made for the real world; they were there just to prove and demonstrate some kind of point, some perspective.

This past week I couldn't work due to health problems and suddenly, after a few years of almost continuous work, I found myself bored. I thought, hell, why not? and went to reimplement a tiny bit of what I was working on earlier. I did this using Racket, my first "esoteric" language, so I had quite some things to relearn (a good thing, too, because the language evolved in the meantime), but I finally did it (8 hours or so, in one go... they tell me it's not healthy to do this when you're ill, but it was fun).

And it worked. And looked great. It was much shorter, more elegant and more performant than the Python version. Certainly, half (or more) of this improvement came from implementing the same thing a second time; but still, I was amazed at how easy and fun it was.

So the next day I decided to write another piece of code in Racket, this time a fresh one, whose output would go straight into the larger system at work. It's good I had a task at hand that could be broken into pieces that small. And again, it worked; I did it in about the same time it would have taken me in Python, despite the lack of "concurrent.futures" or even a thread-safe queue in Racket. I didn't use continuations or any other obscure features; just higher-order functions, a few macros here and there to simplify error handling and such, and some conveniences over pairs and lists.

I'm not sure what I should think about this situation. It's not a "proof of suitability" for the real world, of course - I'd need to write much more code to even begin to claim that Racket is ok to use at work. But on the other hand I felt bad for ignoring a really good language and environment for such a long time. I should have been trying to use it more often, and I didn't because I thought it wasn't meant for that.

But above all, it was fun. Not because I was learning new stuff, like the first time, but because the language made it fun. And, what's almost as important, it worked - I have code that does what it should be doing.

Well, I plan to try using Racket much more often from now on... Maybe someone else will give some Lisp a chance after reading this :)

[+] hasenj|13 years ago|reply
I passionately hate XML so this could not possibly resonate with me.

I never had the enlightenment he talks about. Actually I think that learning Lisp/Scheme might have made me a bit of a worse programmer in a way. It made me "dread" repetitive code so much that I almost could not do anything with any language that's not highly dynamic.

Anyways.

I had 2 epiphanies with lisp.

1. Macros. A very powerful concept, but in practice difficult to use properly in your code. It's too difficult to reason about what's going on if, say, you're maintaining or modifying a set of macros. I think they're useful not as a construct you would often use in your own code, but as one that's very useful for writing libraries.

2. Continuations. This is not really related to Lisp itself, and can be done in other languages, like JavaScript[0]. Understanding a continuation as an even higher-level construct than closures, and the fact that Scheme had it built in, was very mind-blowing for me.

It makes sense, though, that a Lisp language must have it built in. It's a concept that's very fundamental to the theory of computation, but in most programming languages it's not explicit at all.

Before continuations, I thought no Lisp language could ever have equivalents of "break", "return", or "continue". After understanding continuations, I see that these constructs can be built using continuations as a basic building block.

So this suggests to me that the concept of a "continuation" is a very basic and fundamental one that all students of Computer Science should be familiar with. Unfortunately I was never taught about it at University.

[0]: https://github.com/laverdet/node-fibers
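
The break/return-from-continuations idea can be approximated even in Python, which has no call/cc: a one-shot, upward-only escape continuation behaves like an exception (a sketch of mine; real call/cc is far more general, allowing re-entry):

```python
# An "escape continuation" simulated with an exception: calling the
# continuation aborts the rest of the computation and returns a value,
# which is exactly what "break" and early "return" do.

class _Escape(Exception):
    def __init__(self, value):
        self.value = value

def with_escape(body):
    """Call body(k); calling k(v) anywhere inside aborts body
    and makes with_escape return v."""
    def k(value):
        raise _Escape(value)
    try:
        return body(k)
    except _Escape as e:
        return e.value

# Early "return" from the middle of a loop, built from the escape:
def find_first_even(xs):
    def body(escape):
        for x in xs:
            if x % 2 == 0:
                escape(x)      # acts like `return x`
        return None
    return with_escape(body)

print(find_first_even([3, 7, 8, 5]))  # 8
```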

[+] pg|13 years ago|reply
That "in practice" makes it sound like macros are so hard to understand that they're not worth using in real applications, which is definitely not true. Between the facts that (a) one uses them in deliberately restricted ways, (b) one gets increasingly familiar with them, and (c) they are, token for token, way more powerful than ordinary code, macros end up being used a lot.
[+] qznc|13 years ago|reply
1. Macros. The Common Lisp version has these problems, but the Racket guys have hygienic macros, which are much better. I have probably not fully understood them, but the most important thing is imho to use lexical binding even inside macros.

2. Continuations. Higher level than closures? The real understanding of continuations comes from the implementation, imho. You implement your stack frames as a garbage-collected tree data structure, instead of the C way of a memory blob.

[+] dschiptsov|13 years ago|reply
Replacing XML with YAML would make it much clearer and much shorter.

The concept of bindings (of symbols to values) and lexical scope (frames of the environment) must be described.

DSLs must be introduced to show how list structure and uniform function-application syntax glue everything together.

The much better advice - read SICP, for Christ's sake.) The people who wrote it spent much more time thinking about which ideas to illustrate, in which order, and why.

Then watch the Lectures, to feel the bliss.)

True peace of mind comes after you finish reading On Lisp and then the contents of arc3.tar

Before that, it is still just like being blinded and puzzled by a sudden flash of premature enlightenment.)

[+] diminish|13 years ago|reply
Yes, XML brings back the terrible XSL experience; YAML would be pythonesque.

1. To beginners, we need to explain why code/data unity opens up broad possibilities and why separation is simplistic. Otherwise some people claim that the best Lisp DSL you'll write will end up separating your data from your code, and tell you the virtues of the von Neumann architecture.

2. Also, the separation of macro-expansion time and run time in non-interpreted Lisp seems to be a restriction: if all macros are to be defined at design time and expanded at macro-expansion time, the advantages of macros seem limited compared to languages without macros. Namely, macros seem to be a way to modularize code by generalizing, and simpler languages may do that with text editing and module/source-code organization features.
