item 22346349

How These Things Work – A book about CS from first principles (2016)

688 points | falava | 6 years ago | reasonablypolymorphic.com

57 comments

[+] Toenex|6 years ago|reply
Nice. I did an EE degree many years ago even though I wanted to work in software. The single best course I took was one where we designed, simulated, and had fabricated a 4-bit microprocessor. It completely solidified my understanding of how computation takes place. Hopefully, texts like this will do the same for others.
[+] JoeSmithson|6 years ago|reply
In a similar vein, I highly recommend the MOOC "nand2tetris" which progresses smoothly from simple logic circuits right up to a high level Java-like programming language.
[+] ci5er|6 years ago|reply
I did an EE/CS double major a LONG time ago. Senior year included a real summonavitch VLSI course, where in Q1 we had to design a 16- or 32-bit CPU. Over Thanksgiving break, TI manufactured them for us. In Q2, we built a computer. In Q3, we made it bootable (OS) and wrote a C compiler. I was interviewing for jobs in Japan (school was in Terre Haute, so travel overhead!) and I thought I was going to die. But it was the best thing that ever happened to me...
[+] thepete2|6 years ago|reply
By "fabricate" do you mean that your design was produced in silicon? If so, that's super cool!
[+] ceronman|6 years ago|reply
I love this kind of book! In a much more general vein, there is "The Knowledge: How to Rebuild Our World After an Apocalypse" by Lewis Dartnell. It explains the first principles of a lot of things we take for granted in the modern world, from agriculture to food and clothing, medicine, chemistry, and more.

This book about CS principles is a great complement to that!

[+] kragen|6 years ago|reply
As I posted the last time this book came up (https://news.ycombinator.com/item?id=22294655), that book would be more accurately called The Misinformation. If you follow the instructions in it, you will die. Better alternatives are listed in my linked comment.
[+] 7thaccount|6 years ago|reply
I've been looking for a book like that lol.
[+] matco11|6 years ago|reply
One of my favorite books on subject is The Elements of Computing Systems: Building a Modern Computer from First Principles by Noam Nisan https://www.amazon.com/Elements-Computing-Systems-Building-P...
[+] dbish|6 years ago|reply
Agreed. I read through this on a recent vacation, and even though I have a computer engineering degree, it was nice to go through the basics again, and it has some fun exercises.
[+] djangovm|6 years ago|reply
Would you say that this book is still relevant, now that it is 15 years old?
[+] grumple|6 years ago|reply
I recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold [1]. It is far more comprehensive than the OP, going from pre-computer codes to electrical circuits to an overview of assembly. No prior knowledge needed except how to read.

1. https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

[+] look_lookatme|6 years ago|reply
The amazing thing about Code is how it traces the connection of formal logic (in the Aristotelian sense) to the, as you say, pre-computer code of braille and even flag signals to form the foundations of modern computing.

I am a self-taught developer and probably had 10 years' experience in web development when I first read Code. I would have these little moments of revelation where my mind would get ahead of the narrative of the text because I was working backwards from my higher-level understanding to Petzold's lower-level descriptions. I think of this book fairly often when reading technical documentation or articles.

I recently listened to Jim Keller relate engineering and design to following recipes in cooking [1]. Most people just execute stacks of recipes in their day-to-day life; they can be very good at that, and the results of what they make can be very good. But to be an expert at cooking you need a deeper understanding of what food is and how it works (say, at a physics or thermodynamics level). I am very much a programming recipe executor, but reading Code I got to touch some elements of expertise, which was rewarding.

https://youtu.be/Nb2tebYAaOA?t=1351

[+] kragen|6 years ago|reply
Code's good, but it doesn't cover Kleisli categories and Kleisli composition, Peano arithmetic, parametric polymorphism, sum types, pattern matching, or any of the numerous other things covered in Maguire's How These Things Work. So it's not accurate to say Code is "far more comprehensive"; Code mentions FORTRAN, ALGOL, COBOL, PL/I, and BASIC, but the example programs it contains are written in assembly, ALGOL, and BASIC. It doesn't contain any programs you can actually run except for a three-line BASIC program and some even simpler assembly programs.
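For anyone wondering what "Kleisli composition" even means in practice, here's a rough Python sketch using Optional in place of Haskell's Maybe (the function names `kleisli`, `parse_int`, and `recip` are mine, illustrative only, not from either book):

```python
from typing import Callable, Optional, TypeVar

A, B, C = TypeVar("A"), TypeVar("B"), TypeVar("C")

# A "Kleisli arrow" for Maybe is a function a -> Optional[b].
# Kleisli composition chains two such arrows, short-circuiting
# on None -- the Python analogue of Haskell's (>=>) for Maybe.

def kleisli(f: Callable[[A], Optional[B]],
            g: Callable[[B], Optional[C]]) -> Callable[[A], Optional[C]]:
    def composed(x: A) -> Optional[C]:
        y = f(x)
        return None if y is None else g(y)
    return composed

def parse_int(s: str) -> Optional[int]:
    return int(s) if s.lstrip("-").isdigit() else None

def recip(n: int) -> Optional[float]:
    return None if n == 0 else 1.0 / n

safe_recip = kleisli(parse_int, recip)

print(safe_recip("4"))    # 0.25
print(safe_recip("0"))    # None (recip fails)
print(safe_recip("abc"))  # None (parse fails)
```

The point is that neither caller ever writes an explicit None check: the composition operator owns the failure plumbing.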
[+] criddell|6 years ago|reply
I'm reading Code right now and it's fantastic. I'm a bit more than halfway through, and so far it's been only about how computers work, not really about computer science.

I heard an expression this weekend that I think is apt - a computer is to computer science as a telescope is to astronomy.

[+] abnry|6 years ago|reply
Code is what got me as a teenager interested in tech. It is an awesome book.
[+] throw1234651234|6 years ago|reply
This is part of what allowed me to get into programming. The "no prior knowledge" part is absolutely true.

I did start getting lost around the second half of the book.

[+] oblosys|6 years ago|reply
For a do-it-yourself version of chapter 1 (about building complex circuits using only nands), I can recommend http://nandgame.com/
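For a flavor of what that chapter builds, here's a minimal Python sketch (gate names are mine; nandgame itself works at the circuit level) showing that NAND alone is enough to get every other gate, and from there an adder:

```python
# Every gate below is built only from NAND, mirroring the
# "everything from nands" idea: NAND is functionally complete.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):        # NOT x == NAND(x, x)
    return nand(a, a)

def and_(a, b):     # AND == NOT(NAND(a, b))
    return nand(nand(a, b), nand(a, b))

def or_(a, b):      # OR == NAND(NOT a, NOT b), by De Morgan
    return nand(nand(a, a), nand(b, b))

def xor_(a, b):     # XOR from four NANDs
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def half_adder(a, b):
    """(sum, carry) bits -- the first step toward an ALU."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```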
[+] porknubbins|6 years ago|reply
A few years ago, I asked my engineer friend about how much of civilization he could rebuild singlehandedly, should he survive some hypothetical apocalyptic event. “All of it,” he replied. “Not all at once, but I know enough to be able to puzzle together the pieces I don’t know right this second.”

While I admire the Connecticut-Yankee optimism of the engineer, as a non-engineer I am seriously skeptical that a single engineer could know enough about the chemistry, materials, physics, CS, etc. I can explain what a battery or a transistor is supposed to do, but I wouldn't have the foggiest idea how to actually make one. In this scenario, are we leaving the bunker to break into Bell Labs (or some research university library, at least)?

[+] augbog|6 years ago|reply
Somewhat of a tangent, but related: there is an anime called Dr. Stone in which a brilliant genius scientist kid is transported 3,700 years into the future, where people have reverted back to the stone age. He teaches them how to build everything from scratch and makes some crazy stuff, e.g. antibiotics. Highly recommend.
[+] ClumsyPilot|6 years ago|reply
The question usually lacks a starting point. It's one thing to be stuck in medieval times, but the stone age is another thing entirely.
[+] latexr|6 years ago|reply
I share your skepticism. Seems to me the engineer is falling prey to the Dunning-Kruger effect[1]. Rather than knowing enough to be able to puzzle together all the pieces they don’t know, I’d wager they don’t know enough to be able to discern what they won’t be able to figure out.

[1]: https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

[+] hcarvalhoalves|6 years ago|reply
Halfway in, it’s introducing monads and Maybe. Feels like teaching a stack machine after talking about the visitor pattern. There’s good information here, but I’m not sure it covers the important fundamentals (such that I could give it to a beginner).
[+] Anon84|6 years ago|reply
This looks really cool, thank you.

Another book I particularly like in the same style is Feynman's lesser-known Lectures on Computation: https://amzn.to/2SSoJaR where he takes you from single instructions all the way to quantum computing.

[+] kragen|6 years ago|reply
Neat, and it looks like it's under a 3-clause BSD license, too: https://github.com/isovector/reasonablypolymorphic.com/blob/...

And it's tackling pretty advanced material — a bunch of category-theory stuff that I have no idea about. This is exciting!

It looks like maybe it's unfinished: https://reasonablypolymorphic.com/book/tying-it-all-together... ends, "Really, we’re just getting started," and then (the current version of) the book ends. What a cliffhanger ending!

It doesn't seem to yet cover circuitry; the hardware it discusses seems to be a two-tape Turing machine, much like BF. The author seems to have been simulating the machine by hand to generate the included execution traces.
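For concreteness, a BF-style machine is small enough to simulate in a few lines of Python. This is plain Brainfuck (instruction tape plus data tape), not the book's actual machine, and `run_bf` is my own name:

```python
def run_bf(program: str, tape_len: int = 30) -> str:
    """Minimal Brainfuck interpreter: the program is one tape,
    the data cells are the other -- the two-tape shape above."""
    tape, ptr, pc, out = [0] * tape_len, 0, 0, []
    # Precompute matching brackets so loops are O(1) jumps.
    stack, match = [], {}
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            match[i], match[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == ">": ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = match[pc]
        elif c == "]" and tape[ptr] != 0: pc = match[pc]
        pc += 1
    return "".join(out)

# Increment cell 0 up to 65 ('A'), then output it.
print(run_bf("+" * 65 + "."))  # prints "A"
```

Hand-tracing even this machine gets tedious fast, which makes the author's hand-generated execution traces all the more impressive.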

I had a hard time finding the source at first: https://github.com/isovector/reasonablypolymorphic.com/blob/... has a bunch of attribute-embedded &-escaped SVG (including XMLPIs!) that he almost certainly didn't type like that. That file is duplicated at https://github.com/isovector/reasonablypolymorphic.com/blob/... in the same format.

As it turns out, the source for that post is in https://github.com/isovector/reasonablypolymorphic.com/blob/..., with embedded Haskell to produce the SVGs. The build scripts look like they might be in https://github.com/isovector/reasonablypolymorphic.com/tree/... and https://github.com/isovector/reasonablypolymorphic.com/tree/..., but I can't tell where the code for generating the SVGs comes from. ("stack install", maybe? But then is it datetime, sitepipe, or strptime?) So I can't figure out how to fix the text in the SVGs so it doesn't crash into the diagram lines.

Careful about cloning the repo. It's a quarter gig!

[+] javajosh|6 years ago|reply
The author states on the first page that computer science has little to do with computers, so I would not expect circuitry.
[+] koonsolo|6 years ago|reply
I had never heard of "first principles" until Elon Musk used it. (I'm not a native English speaker.) Now I see it everywhere.

Must be the new overhyped term. "We start from first principles, just like Elon Musk".

After looking at Google trends, I might be wrong, so nevermind ;) https://trends.google.com/trends/explore?date=all&q=First%20...

[+] wenc|6 years ago|reply
First-principles is a physics/math way of thinking, and is common parlance in the mathematical modeling world.

When we say a model is a first-principles model, it means it is derived through fundamental equations like conservation of mass/energy, and other known relationships. This is in contrast to a data-driven model, where the underlying phenomena are not explicitly modeled -- instead the model is created by fitting to data.

Elon Musk became associated with it because he applied this form of thinking to business problems, i.e. by establishing the "fundamental equations" (as it were), questioning some basic assumptions, and coming up with conclusions that are necessarily true but that no one else had arrived at.

Data-driven models (or the human equivalent: reasoning by analogy) are convenient to build and work well in the space the data has been collected in (~interpolation). However, they do not extrapolate well -- you cannot be sure they will work outside of the space of training data that the model has seen.

First-principles models (or the human equivalent: reasoning by principles) are generally more difficult to build and test (I worked on first-principles physics models for a decade -- they are a pain), but because they are built on a structure of chained principles/truths, they often extrapolate well even to areas where data has not been collected.

This is why if you want to improve efficiency and operations in known spaces, you use data-driven models (fast to build and deploy, accurately captures known behavior).

But for doing design and discovery (doing new things that have never been done before), first-principles models/thinking will carry you much farther.
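The interpolation-vs-extrapolation contrast above can be shown with a toy example (my own construction, using free fall, d = ½gt², with illustrative numbers only):

```python
# First-principles model: derived from F = m*a, so it holds for any t.
g = 9.81

def first_principles(t: float) -> float:
    return 0.5 * g * t * t

# "Measure" distances only for t in [0.5, 2] seconds.
ts = [0.5, 1.0, 1.5, 2.0]
ds = [first_principles(t) for t in ts]

# Data-driven model: least-squares straight line d ~ a*t + b,
# fit by hand to keep this self-contained.
n = len(ts)
mt, md = sum(ts) / n, sum(ds) / n
a = (sum((t - mt) * (d - md) for t, d in zip(ts, ds))
     / sum((t - mt) ** 2 for t in ts))
b = md - a * mt

def data_driven(t: float) -> float:
    return a * t + b

# Interpolation (t = 1.2 s): the fit is close.
# Extrapolation (t = 10 s): the line is off by hundreds of meters.
for t in (1.2, 10.0):
    print(t, first_principles(t), data_driven(t))
```

Inside the training range the two models nearly agree; ten seconds out, the linear fit predicts ~116 m against the true ~490 m, which is the extrapolation failure described above.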

[+] siddboots|6 years ago|reply
It's a relatively common idiom here in Australia. It gets used quite a bit in STEM education, i.e. "prove such-and-such from first principles", but is also pretty common more colloquially.

According to your link, it is also used a bit in South Africa (where Elon grew up) but is less common in the US. Rather than being a new and overhyped term, perhaps it is a case of Elon using a term that is quite everyday to him without realizing it is less familiar to his audience.

[+] chestervonwinch|6 years ago|reply
It often happens that technical jargon gets swiped for business speak by business leaders with technical backgrounds. For example, I often hear “orthogonal” when “independent” would usually be more appropriate.
[+] empath75|6 years ago|reply
Aristotle used the term, and it’s probably older than him.
[+] marcosdumay|6 years ago|reply
It's all over the place in physics and chemistry, and as a consequence in the engineering areas that are based on those (i.e., nearly all of them).

It is rarer to see it in CS, but that's more because CS dealt with very simple theories until recently than because of fashion. As CS theories start to build on earlier ones, it's appearing more often.

[+] KineticLensman|6 years ago|reply
UK here - I remember it from college thirty-plus years ago, as in "build X from first principles", which could mean (e.g.) implementing an algorithm without using library calls.
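In that spirit, here's a tiny "no library calls" example (my own sketch): a square root via Newton's iteration instead of math.sqrt.

```python
def sqrt_newton(x: float, tol: float = 1e-12) -> float:
    """Square root from first principles: no math library,
    just Newton's iteration on f(y) = y*y - x."""
    if x < 0:
        raise ValueError("negative input")
    if x == 0:
        return 0.0
    y = x if x >= 1 else 1.0        # crude starting guess
    while abs(y * y - x) > tol * x:
        y = 0.5 * (y + x / y)       # Newton step: y <- (y + x/y) / 2
    return y

print(sqrt_newton(2.0))  # ~1.41421356...
```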
[+] itronitron|6 years ago|reply
It's a favorite term of people who identify as 'thought leaders'.