I'm a bit surprised not to see any papers by Simon Peyton Jones (of Haskell fame) on the list. I guess most of the papers on this list were old classics and many of the authors have already passed away or retired.
SPJ is one of the most relevant programming language researchers of the past 20 years, but maybe his work is "too new" for this list. Reading his books and papers sparked my interest in programming languages.
I think it lacks the property of being "a paper", but I'll happily nominate The Implementation of Functional Programming Languages, Simon Peyton Jones, Prentice Hall, 1987.
Those are papers on the theory of programming languages, not implementation. As far as I know, none of Peyton Jones's work fits into that category.
The problem with paywalled journals is that they devalue old but good papers by making them harder to get ahold of. Anyways, the really good ones will find their way onto the open web somehow; the merely good ones maybe not.
'Can Programming Be Liberated from the von Neumann Style?' by John Backus is another great work well worth studying:
'Conventional programming languages are growing ever more enormous, but not stronger. Inherent defects at the most basic level cause them to be both fat and weak: their primitive word-at-a-time style of programming inherited from their common ancestor—the von Neumann computer, their close coupling of semantics to state transitions, their division of programming into a world of expressions and a world of statements, their inability to effectively use powerful combining forms for building new programs from existing ones, and their lack of useful mathematical properties for reasoning about programs.'
Thankfully, 30+ years later these problems are widely fixed ;)
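Backus's "word-at-a-time" complaint is easy to see in miniature. Here is a toy Python sketch of his own inner-product example from the Turing lecture — the encoding and function names are mine, not Backus's FP notation:

```python
from functools import reduce

# Von Neumann style: word-at-a-time statements mutating state.
def inner_product_statements(xs, ys):
    total = 0
    for i in range(len(xs)):
        total = total + xs[i] * ys[i]
    return total

# Backus-style: the program is built from combining forms
# (zip, map, reduce) with no mutable state in sight.
def inner_product_combining(xs, ys):
    return reduce(lambda a, b: a + b,
                  map(lambda p: p[0] * p[1], zip(xs, ys)), 0)

print(inner_product_statements([1, 2, 3], [4, 5, 6]))  # 32
print(inner_product_combining([1, 2, 3], [4, 5, 6]))   # 32
```

The second version is one expression assembled from reusable pieces, which is exactly the "powerful combining forms for building new programs from existing ones" the quote asks for.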
The most recent paper in the first section was written in 1978.
The majority of the ones in the first two sections were written before I was born; only three were written in the 1990s.
The only one from the 21st century (published in 2000) is in the least-prestigious third section.
Does it take time for good ideas to accumulate the recognition needed to be included in this list? Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?
Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Or is language implementation, almost without exception, a series of engineering refinements that happen in a decades-long process after theoretical foundations are first published?
> Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?
Yes, most of the groundbreaking theories were researched and documented decades ago. And some of that stuff has yet to make its way into mainstream programming languages (e.g. type inference).
> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Apart from Lisp and Haskell, the "modern" languages you mention are rather boring when it comes to programming language design, and there isn't much research behind them. There has been a lot of practical work and research in garbage collection, runtime systems, compiler back ends, just-in-time compilation, and virtual machines. But when it comes to language design, the mainstream programming languages are really conservative.
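The type inference mentioned above (Milner's "A theory of type polymorphism in programming", linked elsewhere in this thread) rests on unification. A minimal, illustrative sketch in Python — the representation (strings for type variables, tuples for constructed types) is my own simplification, and there is no occurs check, so this is not full Algorithm W:

```python
# Type variables are strings ('a', 'r'); concrete types are tuples whose
# first element is the constructor name, e.g. ('int',) or ('fun', arg, res).

def resolve(t, subst):
    """Follow substitution links until t is not a bound variable."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Return an extended substitution making a and b equal, or raise."""
    a, b = resolve(a, subst), resolve(b, subst)
    if a == b:
        return subst
    if isinstance(a, str):                     # a is a free type variable
        return {**subst, a: b}
    if isinstance(b, str):
        return {**subst, b: a}
    if a[0] == b[0] and len(a) == len(b):      # same constructor, same arity
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
        return subst
    raise TypeError(f"cannot unify {a} with {b}")

# Applying the identity function (type a -> a) to an int forces a = int,
# so the result type r of the application must also be int:
s = unify(('fun', 'a', 'a'), ('fun', ('int',), 'r'), {})
print(resolve('r', s))  # ('int',)
```

The point is that no annotations were needed: equating the function's type with the shape of its use pins down every unknown.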
The relationship between theoretical innovations and practical innovations is particularly indistinct in Computer Science.
A language like JavaScript might be interesting because it is simple and ubiquitous and embedded in a powerful application platform - but that does not mean it is of much interest from a theoretical CS perspective.
> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Those modern languages all predate 2000. Some of the papers cited in this list are indeed related to Haskell and Lisp (do you really consider Lisp modern?). And compiling a list of works that will have a lasting impact takes time: the importance of new work needs a while to become clear.
This list is essentially centered on logic in programming languages (lambda calculus, type theory, semantics, monads), hence the absence of Python, Ruby, and JavaScript: those languages picked up a lot from this theory but did not bring much novel material.
Ugh, if you look at the bottom of the "pretty great works" section you will see Guy Steele's master's thesis, where he defined the Scheme programming language and its compiler.
I know when I was writing my thesis I was pretty disgusted that he ruined it for all of us mere mortals. ;)
I think Claude Shannon's (of information theory fame) master's thesis beats out all others in the field of computer science.
"A Symbolic Analysis of Relay and Switching Circuits", 1938, proves that you can use electrical switches to do boolean algebra! He basically invented the digital computer, as a master's thesis.
That is wonderful. As a physicist whose first language was FORTRAN, I particularly like "It is a syntax error to write FORTRAN while not wearing a blue tie."
If anyone doesn't know, Benjamin Pierce is the author of a highly regarded book, Types and Programming Languages. (I've yet to study it; it's supposedly advanced...)
Only one paper by Filinski? I've been picking through 'Declarative Continuations and Categorical Duality', where he shows the duality between values and continuations by generalizing lambda calculus to a symmetric version. It's incredibly beautiful.
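The value/continuation duality is easiest to glimpse through continuation-passing style, where "what to do with the result" becomes an explicit argument. A small illustrative Python sketch (plain CPS, not Filinski's symmetric calculus):

```python
# Direct style: a function returns a value to an implicit context.
def add_direct(x, y):
    return x + y

# Continuation-passing style: the context is reified as a function k.
# The "value" never escapes on its own; it is only ever handed to a
# continuation -- the consumer's-eye view that the duality generalizes.
def add_cps(x, y, k):
    return k(x + y)

# (1 + 2) * 10 in both styles:
print(add_direct(1, 2) * 10)            # 30
print(add_cps(1, 2, lambda v: v * 10))  # 30
```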
'My second remark is that our intellectual powers are rather geared to master static relations and that our powers to visualize processes evolving in time are relatively poorly developed. For that reason we should do (as wise programmers aware of our limitations) our utmost to shorten the conceptual gap between the static program and the dynamic process, to make the correspondence between the program (spread out in text space) and the process (spread out in time) as trivial as possible.'
From Dijkstra's "Go To Statement Considered Harmful"
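Dijkstra's point — keeping the program text in step with the running process — shows up even in tiny examples. A hedged Python contrast (my example, not Dijkstra's):

```python
# Goto-style control flow, simulated with an explicit "label" variable:
# to know what the process is doing, you must trace which label it is at.
def count_down_goto(n):
    out, pc = [], "loop"
    while True:
        if pc == "loop":
            if n == 0:
                pc = "done"
            else:
                out.append(n)
                n -= 1
                pc = "loop"
        elif pc == "done":
            return out

# Structured version: the textual nesting mirrors the dynamic process,
# the "trivial correspondence" the quote asks for.
def count_down(n):
    out = []
    while n > 0:
        out.append(n)
        n -= 1
    return out

print(count_down_goto(3))  # [3, 2, 1]
print(count_down(3))       # [3, 2, 1]
```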
marshray | 12 years ago:
http://research.microsoft.com/en-us/um/people/simonpj/papers...
angersock | 12 years ago:
A public vault of these articles would let us all learn easily, instead of having to go through journal publishers.
EDIT: Why thank you, Lazyweb!
YAYERKA | 12 years ago:
C. A. R. Hoare. An axiomatic basis for computer programming.
http://sunnyday.mit.edu/16.355/Hoare-CACM-69.pdf
Peter J. Landin. The next 700 programming languages.
http://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf
Robin Milner. A theory of type polymorphism in programming.
http://courses.engr.illinois.edu/cs421/sp2012/project/milner...
Gordon Plotkin. Call-by-name, call-by-value, and the λ-calculus.
http://homepages.inf.ed.ac.uk/gdp/publications/cbn_cbv_lambd...
John C. Reynolds. Towards a theory of type structure.
http://www.cs.cmu.edu/~crary/819-f09/Reynolds74.pdf
lambda | 12 years ago:
C. A. R. Hoare. An axiomatic basis for computer programming: http://sigpl.or.kr/school/2005w/slides/2_17_1.pdf
Peter J. Landin. The next 700 programming languages: http://www.thecorememory.com/Next_700.pdf
Robin Milner. A theory of type polymorphism in programming: http://courses.engr.illinois.edu/cs421/sp2012/project/milner...
Gordon Plotkin. Call-by-name, call-by-value, and the λ-calculus: http://www.sciencedirect.com/science/article/pii/03043975759... (PDF link at upper right of page)
John C. Reynolds. Towards a theory of type structure: http://repository.cmu.edu/cgi/viewcontent.cgi?article=2289&c...
Pretty great works:
Luca Cardelli. A semantics of multiple inheritance: http://lucacardelli.name/Papers/Inheritance.pdf
Luis Damas and Robin Milner. Principal type schemes for functional programs: http://web.cs.wpi.edu/~cs4536/c12/milner-damas_principal_typ...
Edsger W. Dijkstra. Recursive programming: http://oai.cwi.nl/oai/asset/9253/9253A.pdf
Edsger W. Dijkstra. Go to statement considered harmful: http://www.massey.ac.nz/~kahawick/159331/Goto-Harmful-Dijkst...
William A. Howard. The formulas-as-types notion of construction: http://www.cs.cmu.edu/~crary/819-f09/Howard80.pdf
Robert Kowalski. Predicate logic as programming language: http://65.99.230.10:81/collect/computer/index/assoc/HASH0183...
Peter J. Landin. The mechanical evaluation of expressions: http://ropas.snu.ac.kr/lib/dock/La1964.pdf
John McCarthy. Recursive functions of symbolic expressions and their computation by machine: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.111...
Eugenio Moggi. Computational lambda-calculus and monads: http://pdf.aminer.org/000/267/711/lazy_lambda_calculus_theor...
Greg Morrisett, David Walker, Karl Crary, and Neal Glew. From System-F to typed assembly language: http://staff.ustc.edu.cn/~xyfeng/reading/p527-morrisett.pdf
George C. Necula. Proof-carrying code: http://www.utdallas.edu/~kxh060100/Papers/necula96.pdf
Gordon D. Plotkin. LCF considered as a programming language: http://www.sciencedirect.com/science/article/pii/03043975779...
Gordon D. Plotkin. A structural approach to operational semantics: https://www.alice.virginia.edu/~weimer/2006-655/reading/plot...
Guy Lewis Steele Jr. RABBIT: A compiler for SCHEME ftp://publications.ai.mit.edu/ai-publications/pdf/AITR-474.pdf
OK, I think the first two sections are plenty. You can search for the rest yourself on Google Scholar (I found most of these as the first hit on Google Scholar; one or two took looking through a few hits or an ordinary Google search as well).
mixedbit | 12 years ago:
http://www.thocp.net/biographies/papers/backus_turingaward_l...
zwieback | 12 years ago:
http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=213...
I found that pretty interesting and it caused a lot of discussion.
boothead | 12 years ago:
http://james-iry.blogspot.com/2009/05/brief-incomplete-and-m...
It's a masterpiece!
trippy_biscuits | 12 years ago:
http://homepages.inf.ed.ac.uk/gdp/publications/
papsosouid | 12 years ago:
It is in the top 5, and yet none of the Go developers will acknowledge that it exists.