K. Cooper, L. Torczon "Engineering a Compiler", 2nd ed. (2011)
A. W. Appel, M. Ginsburg "Modern Compiler Implementation in Java" (2007)
S. Muchnick "Advanced Compiler Design and Implementation" (1997)
The last book focuses entirely on program analysis and optimization. It even has a chapter on optimization for the memory hierarchy! Of course, since it's a rather dated book, the specific details in that chapter are mostly useless today, but the rest is solid.
However, those books mostly cover imperative languages (although Appel & Ginsburg devote a chapter to functional languages, both strict and lazy, and discuss some of the optimization challenges), so if you want to learn about implementing functional languages...
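The strict/lazy distinction that chapter covers can be sketched with thunks. A minimal illustration (my own toy example, not from the book): a lazy value is a memoized zero-argument function, and an argument is only evaluated if it is actually used.

```python
# A lazy value modeled as a memoized thunk: a zero-argument
# function whose result is cached after the first force.
class Thunk:
    def __init__(self, fn):
        self.fn = fn
        self.forced = False
        self.value = None

    def force(self):
        if not self.forced:
            self.value = self.fn()
            self.forced = True
            self.fn = None  # drop the closure so it can be collected
        return self.value

# In a strict language the argument is evaluated before the call;
# in a lazy one, only if (and when) the callee forces it.
def const(x, _y):               # ignores its second argument
    return x

bomb = Thunk(lambda: 1 // 0)    # would raise ZeroDivisionError if forced
print(const(42, bomb))          # 42 -- the bomb is never forced
```

Real lazy implementations (graph reduction, as in the Peyton Jones book) are far more involved, but this is the core idea.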
S. L. Peyton Jones et al. "The Implementation of Functional Programming Languages" (1987)
A. W. Appel "Compiling with Continuations" (1992)
Holy fatcats, those are some old books! But sadly, I am not aware of more modern ones. Try searching for papers on the topics that interest you in particular, I guess (for example, there are several papers that discuss the appropriateness of using CPS as an IL: "Compiling with Continuations, Continued", "Compiling without Continuations", and "Compiling with Continuations, or without? Whatever". Yes, puns seem to be a noble tradition in PL implementation circles). The first book starts as a general introduction, but from about the middle onward it firmly steers into implementing lazy languages. The second is pretty much a description of how the SML/NJ compiler was made (based on its state in about 1991, of course) and has some interesting benchmarks on the efficiency of different closure implementations.
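For a taste of what "CPS as an IL" means, here is a hand-converted factorial (my own toy example, not from either book): every function takes an explicit continuation, so evaluation order and control flow become manifest in the code itself.

```python
# Direct style: the call stack implicitly holds the pending multiplications.
def fact(n):
    return 1 if n == 0 else n * fact(n - 1)

# CPS: each function takes an extra continuation `k` and "returns" by
# calling it. Every call is a tail call, which is one reason CPS-based
# compilers get tail-call optimization and explicit control flow cheaply.
def fact_cps(n, k):
    if n == 0:
        return k(1)
    return fact_cps(n - 1, lambda r: k(n * r))

print(fact_cps(10, lambda x: x))  # 3628800, same as fact(10)
```

A CPS-based compiler performs a transformation like this mechanically over the whole program, then optimizes the resulting continuation-passing code.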
Plus, there are a lot of random pages on the web with resources for various compiler implementation courses from different universities. For one arbitrary example, https://course.ccs.neu.edu/cs4410/
If your language is fairly low-level, you might be able to get away with "translating" it into LLVM.
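At its simplest, "translating into LLVM" means emitting textual LLVM IR and handing it to `clang`/`llc` (or a binding such as llvmlite). A toy emitter for a two-argument integer add; the function name is made up for illustration:

```python
def emit_add(name: str) -> str:
    """Emit textual LLVM IR for: i32 name(i32 a, i32 b) { return a + b; }"""
    return "\n".join([
        f"define i32 @{name}(i32 %a, i32 %b) {{",
        "entry:",
        "  %sum = add i32 %a, %b",
        "  ret i32 %sum",
        "}",
    ])

ir = emit_add("my_add")
print(ir)
# Save as foo.ll and compile with `clang foo.ll -c`; LLVM then handles
# instruction selection, register allocation, and optimization for you.
```

The hard part is everything your frontend must do before this point: lowering your language's constructs into something this simple.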
If it's not low-level, you have to make a bunch of implementation decisions - how do your language's killer features work under the hood?
For just one example, in V the author reuses Go's `go` keyword to launch a new coroutine/green thread. What algorithms do you use to distribute work evenly between threads? What data structures do you use to represent those threads and their pending work? What's the right balance between latency, throughput, memory usage, etc etc?
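To make those trade-offs concrete, here is the simplest possible answer (a single shared queue of tasks; all names are mine, not V's or Go's): it balances work perfectly, but every worker contends on one lock, which is exactly the problem that per-worker run queues plus work stealing are designed to solve.

```python
import queue
import threading

def run_tasks(tasks, n_workers=4):
    """Run callables on a pool of worker threads sharing one queue."""
    q = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            task = q.get()
            if task is None:        # sentinel: shut this worker down
                return
            r = task()
            with lock:
                results.append(r)

    workers = [threading.Thread(target=worker) for _ in range(n_workers)]
    for w in workers:
        w.start()
    for t in tasks:
        q.put(t)
    for _ in workers:               # one sentinel per worker
        q.put(None)
    for w in workers:
        w.join()
    return results

print(sorted(run_tasks([lambda i=i: i * i for i in range(10)])))
```

A production scheduler (Go's, for instance) instead gives each worker its own run queue and lets idle workers steal from busy ones, trading this simplicity for far less contention.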
Many common language features are non-trivial to implement, even with the help of the LLVM IR (which I think is wonderful).
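Closures are a good example of such a feature: LLVM IR has no notion of a nested function capturing variables, so the frontend must do closure conversion itself. A hand-converted sketch of roughly what a compiler generates (the representation is my own choice for illustration):

```python
# Source-level pseudocode being compiled:
#   def make_adder(n): return lambda x: x + n
# After closure conversion, every function is top-level and takes an
# explicit environment; a closure is a (code pointer, environment) pair.

def adder_code(env, x):             # the lifted lambda body
    return x + env["n"]

def make_adder(n):
    return (adder_code, {"n": n})   # closure = (code, captured variables)

def apply_closure(clo, arg):
    code, env = clo
    return code(env, arg)

add5 = make_adder(5)
print(apply_closure(add5, 37))      # 42
```

In a real backend the environment would be a heap-allocated struct and `apply_closure` an indirect call, but the shape of the transformation is the same.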
It should be noted, though, that the Dragon Book is very detailed, especially on parsing. For that reason some consider it a poor introductory book, or even outdated. But it's a good resource if you want to deepen your knowledge, and parsing is only one part of the book.