Tangential question, but what Lisp is the most promising one to invest time on right now?
Common Lisp still has some activity, mainly coming from SBCL, but is a bit stagnant. It has a fantastic body of literature and many mature implementations.
Scheme is great, but a bit too fragmented. Racket seems to be gaining some momentum. The merge with Chez may be the tipping point to attract a critical mass of developers.
Clojure has many interesting modern ideas, but I feel that being so tied to Java and the JVM has hurt it a bit in the long run.
I miss innovation in the typed Lisp area, e.g. Qi or Shen, which never took off. Carp [1] looks nice.
[1] https://github.com/carp-lang/Carp
We use Common Lisp to build compilers and simulators for quantum computers at Rigetti [0,1].
I wouldn’t consider the development stagnant. SBCL makes regular releases, including huge improvements (e.g., a port to RISC-V) and bug fixes. It’s easy to find commercial support for Common Lisp as well.
I love Scheme and Common Lisp both, but CL is quite a workhorse when it comes to difficult, professional work on teams.
[0] https://github.com/rigetti/quilc
[1] https://github.com/rigetti/qvm
> "Clojure has many interesting modern ideas, but I feel that being so tied to Java and the JVM has hurt it a bit in the long run."
Would you expand on this? From my perspective, it actually opens up Clojure to a lot of opportunities where introducing, say, Common Lisp or Racket, would be a much harder lift. Clojure’s just a library, and immediately has available the entire Java ecosystem. Granted, it’s based only on my experience, but I’ve much more frequently seen first-class (provided by a vendor or otherwise officially sanctioned) Java libraries for services and other tooling than Common Lisp or Racket. Being able to take something like that off the shelf for integration is a big plus in my experience.
LispWorks and Allegro CL have been commercially available Lisps for 30+ years, running on various platforms and still supported. The LispWorks 7.1 update from 2017 brought native ARM64 support (Linux, iOS), remote debugging of LispWorks on Android and iOS, and more.
You aren't tied to the JVM with Clojure; you can target JS too. Outside browsers, people are using ClojureScript for mobile apps, web app backends (e.g. with serverless), and local scripting.
There's also the CLR version that some people are using with Unity for game dev.
But I think the JVM has really good prospects, even if the trend currently favours JS. Straddling both the JS and JVM platforms is a pretty good position for Clojure.
There are great benefits to being able to leverage JVM and NPM libraries. A lot of it happens indirectly: many libs are available that provide integrations with JVM/JS libs under the hood but present you with a Clojurey interface.
Julia? It depends on what you're looking for, but IMO apart from the non-lispy surface syntax (which Dylan also had), it's very much a Lisp with a focus on multiple dispatch.
There’s some interesting work going on in the Guile ecosystem: the new JIT compiler is just on the horizon, nicely timed with Guix maturing as a platform. It’s lacking some real basic tooling (including an officially blessed package manager), but that gives its small community a chance to learn from the experience of earlier Lisps (including the first few years of Guile) and build something worthy of being the GNU project’s language of choice.
> Common Lisp still has some activity, mainly coming from SBCL, but is a bit stagnant. It has a fantastic literature and many mature implementations.
There are a number of improvements I’d like to see to the core language, but in general I think Common Lisp is more ‘complete’ than it is ‘stagnant.’ I.e., it enables one to write software which is performant, dynamic, able to handle errors as they occur &c.
I recommend it over Scheme in large part because it is so complete, over Racket because it has multiple implementations and over Clojure because I like that it’s multi-paradigm. But tastes differ, and that’s okay.
I like that Common Lisp isn’t afflicted with flavour-of-the-month syndrome. There are libraries for Lisp which have been around, essentially untouched, for years and which still work correctly. Done right, the first time — what a concept!
Odds are high that any of them will be a fun investment. Common Lisp is interesting for just how much it already has. In particular, I'm finding many of the things that supposedly aren't great about it, like LOOP and FORMAT, to be quite productive if you embrace them.
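A quick sketch of what makes those two pleasant, for anyone who hasn't tried them (standard CL, should run in any conforming implementation):

```lisp
;; LOOP is a small iteration DSL: collect the squares of the
;; even numbers below 10.
(loop for i from 0 below 10
      when (evenp i)
        collect (* i i))
;; => (0 4 16 36 64)

;; FORMAT's directive language handles iteration (~{ ~}) and
;; separators (~^) inside a single control string.
(format nil "~{~A~^, ~}" '(sbcl ccl ecl))
;; => "SBCL, CCL, ECL"
```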
I've never used Common Lisp; I know some Clojure, have used Emacs Lisp, and have tried Racket and Fennel. I love Lisps. But I've always wondered: what happened to Common Lisp? Some of the opinions I've heard:
- CL is too big (compared to Racket and Clojure)
- Lisp-1 vs Lisp-2
- recursion discouraged (is that even true?)
- There is a thing called the "Lisp Curse"
And then, seemingly, the language's popularity decreased. Setting aside the fact that I really like Lisps: are there any compelling reasons for an average Joe programmer like me to start learning Common Lisp in 2019? Honest question.
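As an aside on the Lisp-1 vs Lisp-2 point in the list above: Scheme, Racket, and Clojure give variables and functions one shared namespace (Lisp-1), while CL gives them separate ones (Lisp-2). A minimal sketch of what that means in practice:

```lisp
;; CL is a Lisp-2: the same symbol can simultaneously name a
;; variable and a function, so the two never collide.
(defun doubler (list)
  ;; In operator position LIST is the standard function;
  ;; in argument position it is the local variable.
  (list list list))

(doubler '(1 2))
;; => ((1 2) (1 2))

;; The price: passing a function around needs #' and FUNCALL.
(funcall #'doubler '(1 2))
;; => ((1 2) (1 2))
```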
You pretty much answered your own question. Common Lisp is the big one. Scheme has more of a niche as an extension language, but Racket is a completely decent development environment and language family.
Clojure is a JVM language with some neat persistent data structures. Personally I find it to be one of the least interesting Lisps, but that's like saying it's not my favorite pizza. It's still quite tasty. Furthermore, I can easily get paid to write it, which is a little harder for the others.
It depends heavily on what you want to do.
All of them (SBCL, Racket, and Clojure) are platform-independent, and they have different strengths.
I'd choose based on the type of application.
I enjoyed the slides, and wish a video of the talk were available.
Now that I am retired, most of my side projects use SBCL, with a little Haskell and other languages. I have great respect for the commercial Franz and LispWorks products but for my projects SBCL works great.
I really enjoyed the history covered in his slides. I have lived through this history since 1982, but unfortunately almost 90% of my paid work has been in other languages. In our present time, with great resources like Quicklisp, the CL Cookbook, etc., I think that the CL ecosystem and the number of deployed projects should be even greater.
General side note: incremental slides should be "normalized" in a print version. Not sure if any software supports this; it would probably require some manual tagging or grouping.
The slides are a bit tricky to read without the accompanying talk. I couldn’t find a video with a brief search.
One thing the talk touches on is that writing a Lisp (cross-)compiler in Lisp is hard. The slides suggest a few reasons and here are some more.
In CL, much of the compilation process requires evaluating CL code, so writing the compiler in CL seems fortunate: you can just take the code and eval it. But this doesn’t work, because (to stay portable and leave room for optimising compilers) many objects in CL can’t be inspected. For example, the following is valid code:
(funcall #.(lambda (x) (1+ x)) 5)
In the code above, one constructs a closure at read time and puts it into the AST; the compiler must take this call to an opaque host-platform closure (which evaluates to itself) and somehow marshal it into something for the target platform.
So to write a CL cross-compiler one must first implement a CL interpreter and this interpreter must have objects with enough useful state that one can correctly marshal closures from the interpreter into compiled objects in the target system, even when those closures close over shared values.
One way to reduce this pain is to restrict the compiler to only compiling a subset of the language, and then requiring that the compiler is written in that subset. One may only compile more complicated programs when the host and target are the same instance of Lisp.
However, the “easily compiled portable subset” is probably too small, and you will end up with either a long bootstrapping process of increasingly useful compilers or a lot of emulation of your target platform. An example in the slides is that the result of (byte ...) is implementation-defined, so you can’t just use the value from the host implementation. Another example is floats. In CL there are 4 float types which are allowed to be equal to one another in certain ways (typical modern implementations have two 32-bit and two 64-bit types; others might have some being decimal or packed into 63 bits), so one cannot rely on the host’s floats behaving a certain way and instead has to emulate the target platform’s float implementation to get reliably portable results. Then again, maybe it is possible (if a bit painful) to write the compiler without using any floats.
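To make the (byte ...) point concrete: the standard only promises that BYTE returns "a byte specifier" usable with LDB, DPB, and friends; the specifier's representation is deliberately left to the implementation. A small sketch:

```lisp
;; Portable: LDB with a byte specifier extracts the low 8 bits.
(ldb (byte 8 0) #x1234)
;; => #x34

;; Not portable as a dumped literal: the VALUE returned by
;; (byte 8 0) is an opaque, implementation-defined object
;; (a fixnum in some implementations, a cons or a structure in
;; others), so a cross-compiler cannot simply embed the host's
;; representation into the target image.
(byte 8 0)
```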
Another bootstrapping difficulty in CL is its object system, which just makes everything harder, especially if one wants to use objects for lots of the implementation-specific types.
SBCL goes for writing the compiler in a subset of CL, and the result is indeed sanely bootstrappable. Other implementations typically require eval, some runtime written in e.g. C, and a slow process of evaluating the improving compiler on itself to bootstrap. This is easier at first but can lead to difficulties (some of which are described in the talk).
> the compiler must take this call to an opaque host-platform closure (which evaluates to itself) and somehow marshal it into something for the target platform.
If we're cross-compiling, we are almost certainly doing file compilation whereby we have to externalize the compilation product to be transported to the target where it is loaded.
If we use ANSI Lisp file compilation as our source of requirements, then we only have to handle, as source code literals, objects which are "externalizable". Closures are not.
See 3.2.4 Literal Objects in Compiled Files
Of course, you can adopt externalizable closures as a requirement in your own compiler project, if you like.
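To illustrate what 3.2.4 allows: the standardized externalizable literals are things like numbers, characters, symbols, strings, lists, and arrays; functions are not among them, and instances of user-defined classes only become externalizable if you supply a MAKE-LOAD-FORM method telling the compiler how to reconstruct them. A sketch:

```lisp
;; A user-defined class is not externalizable by default; a
;; MAKE-LOAD-FORM method tells COMPILE-FILE how to recreate
;; instances that appear as literals in compiled files.
(defclass point ()
  ((x :initarg :x :reader point-x)
   (y :initarg :y :reader point-y)))

(defmethod make-load-form ((p point) &optional environment)
  (declare (ignore environment))
  `(make-instance 'point :x ,(point-x p) :y ,(point-y p)))

;; The load form is a plain expression the file compiler can
;; dump and evaluate again at load time:
(make-load-form (make-instance 'point :x 1 :y 2))
;; => (MAKE-INSTANCE 'POINT :X 1 :Y 2)
```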
Not sure what you intend to mean, but for a CL file compiler, that's not valid code. No CL file compiler needs to be able to externalize a function object.
jpittis | 6 years ago
I've had a number of great experiences working on toy projects using Clojure but nevertheless feel resistance to adopt it fully "because of the JVM".
I've pinned my own resistance down to three things:
- Slow startup times. (Most of my use cases are not long running servers.)
- Not "unixy". (I write programs to run on Linux and OS X exclusively, and am maybe just used to using non-portable APIs?)
- I've been led to believe Java is "gross" and "enterprisy".
Really, of those three reasons only the first one has merit.
It kind of sounds silly when I put it this way: When I'm hacking on fun projects, I enjoy using a "hacker" language and Clojure doesn't feel like one.
One implementation you didn't mention that's pretty neat is Chicken Scheme [1].
[1] https://www.call-cc.org/
lispm | 6 years ago
CLASP is a relatively new Common Lisp implementation with deep C++/LLVM integration.
https://github.com/clasp-developers/clasp
Quicklisp adds new libraries every month.
http://blog.quicklisp.org
karmakaze | 6 years ago
For small, fun projects: CHICKEN Scheme (or Carp, if you like).
For larger-scale/production use: either Clojure or SBCL.