
Ask HN: Best way to learn about computing history?

213 points| Tmkly | 3 years ago

I'm a software engineer, mainly working on mobile apps (iOS primarily) through React Native and some Swift/Java. I have a CS degree and about 7 years in this field.

However, recently I've become very aware that JS/TS, Swift, etc. are just APIs on top of APIs. I've been drawn to learning more about how computers work, the history of programming/computers (Unix, Sinclair, Commodore, etc., even going back to Ada Lovelace, Babbage and mainframes in the 1950s) and things like memory allocation. I've tried learning some BASIC and assembly but haven't really got very far. I read/devour articles on sites like https://twobithistory.org but they only get you so far.

What can I do to help accelerate this and satiate this desire to learn more about how computers work? I live in London, UK and would be happy to spend some money on a uni course or something if there was a good one. I learn best practically so like to be "doing" something as well as theory.

152 comments

[+] robotguy|3 years ago|reply
Ben Eater's Youtube series "Building an 8-bit Breadboard Computer" is a really good introduction to the lowest levels of how a computer works:

https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...

I recommended it to my daughter when she was taking a class in R and asked "But how does the COMPUTER know what to do?"

[+] Banana699|3 years ago|reply
>But how does the COMPUTER know what to do?

Ben Eater is amazing, but his series, in my very humble opinion, isn't the best answer to this question. I found that the emphasis on the breadboard and the particulars of the physical implementation got in the way of a clean pedagogical introduction to logic circuits as a DAG of abstract computational elements implementing functions from {0,1}^n -> {0,1}^m (which we then implement with real circuits in whatever medium and form we choose); it's very "DIY" and maker-oriented in nature. This doesn't negate its status as a masterpiece of educational vlogging, I just feel it leaves a first-time learner hanging on some very important questions.

The single best answer I have ever seen to this question is the outstanding The Elements of Computing Systems[1], better known as the NandToTetris course[2][3]. You literally start with Nand and build a computer, an assembler, a VM, a compiler, a simple runtime library, and - finally - Tetris running on top of all that. It's one of the best introductions to computers and computer science I have ever seen in my life, at once a textbook and a work of science communication. It comes with its own software suite[4], and the first 4 chapters of the book (from Nand to a fully functional computer capable of running machine code) are gamified in [5].

[1] https://mitpress.mit.edu/books/elements-computing-systems

[2] https://www.youtube.com/playlist?list=PLrDd_kMiAuNmSb-CKWQqq...

[3] https://www.youtube.com/playlist?list=PLrDd_kMiAuNmllp9vuPqC...

[4] https://www.nand2tetris.org/software

[5] https://nandgame.com/
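The opening move of that course - deriving every other gate from Nand alone - is simple enough to sketch in a few lines. This is a toy Python illustration of the idea (the course itself uses its own HDL and simulator, not Python):

```python
# Toy sketch: deriving the basic logic gates from NAND alone,
# in the spirit of the first chapter of NandToTetris.

def nand(a: int, b: int) -> int:
    """The single primitive: outputs 0 only when both inputs are 1."""
    return 0 if (a == 1 and b == 1) else 1

def not_(a: int) -> int:
    # Feeding the same signal to both NAND inputs inverts it.
    return nand(a, a)

def and_(a: int, b: int) -> int:
    # AND is just NAND followed by NOT.
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    # De Morgan: a OR b == NOT(NOT a AND NOT b).
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    # 1 when the inputs differ: (a OR b) AND NOT(a AND b).
    return and_(or_(a, b), nand(a, b))

# Print the truth tables to check the derivations.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b), xor(a, b))
```

Everything above is a pure function from bits to bits - exactly the "abstract computational elements" framing, before any breadboard enters the picture.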

[+] rschachte|3 years ago|reply
Shout out to Ben Eater. This dude explains how the internet works and rips open an Ethernet cable and hooks up an oscilloscope to it and decodes the bits that transfer over the physical cable.

Very informative!

[+] louwrentius|3 years ago|reply
Offtopic: this reminds me that Ben Eater has gone 100% silent since last November (hope he is ok)
[+] Tmkly|3 years ago|reply
this looks extremely interesting and may be just what I'm after. I have some basic electronics experience so I could build on that too. Thanks!
[+] MichaelMoser123|3 years ago|reply
I am wondering if it is a better idea to teach R as an intro to programming, as compared to Python. I mean, you don't have this whole indentation business with R...
[+] Phithagoras|3 years ago|reply
From a NAND gate to Tetris was excellent. Informative and obvious

https://www.nand2tetris.org/

[+] naikrovek|3 years ago|reply
I cannot endorse this enough.

some of it is hard and will have you wondering if you want to continue. if you do, and I highly recommend that every developer complete this course, you will find yourself thinking in new ways and understanding many problems very differently, which is a very good thing.

and you will see huge performance problems in almost all software from then on, because none of this (I gesture vaguely at everything everywhere) should be as slow as it is. none of it.

[+] ravenstine|3 years ago|reply
Though it doesn't cover all of computing history, this site is a comprehensive timeline of personal computing history from 1947 to now.

https://kpolsson.com/comphist/

Apparently the author has been maintaining that timeline since 1995 and is still doing it!

While it doesn't cover things like computer science, I think it's an excellent jumping off point for learning about notable people and events.

Not exactly what you asked for, but you may also be interested in it, and I think it may give you some insight that more programmers should have.

EDIT: Also, don't stop at Babbage & Lovelace. Although Babbage's analytical engine was one of the first, if not the first, programmable computers with a form of memory, there were people working on extremely primitive computers (or rather, advanced calculators) way before Babbage. Schickard, Pascal, and Leibniz conceived of and developed calculating engines that did basic math with support for interim value storage, which one might consider to be the earliest form of computer memory.

[+] themadturk|3 years ago|reply
Steven Levy's Hackers is a foundational work of the history of computing. Levy spends a lot of time on the MIT hackers of the 1960s and 1970s, the group that hatched Lisp, Richard Stallman and the free software movement, and also a lot of time on the Bay Area hackers that kick-started the microcomputer revolution. Certainly it's not a comprehensive guide to the full range of computing history, but it's an important and engaging look at the beginnings of where we are today.
[+] eigenvalue|3 years ago|reply
I second that, Hackers was a great read. I read it back in the late 90s and then again a couple years ago and was surprised how much of it came back to me. He's one of the few tech journalists and writers who actually gets it.
[+] tiahura|3 years ago|reply
One cannot recommend Hackers enough. In many ways, the stories on Hackernews’ front page are the ripples of the events chronicled by Levy.
[+] eigenvalue|3 years ago|reply
I think the best way to learn this stuff is from the people who did it, speaking in their own words. But watching videos takes forever, so the best way to do this is to read oral histories. The Computer History Museum has really great content -- I've read dozens of these. You can easily find them, ranked in approximate order of popularity, with the following Google search:

https://www.google.com/search?q=oral+history+computer+museum...

To find more (and there are many great ones outside of the Museum), you can try a broader search:

https://www.google.com/search?q=oral+history+arpa+filetype%3...

I have found that I can read around 3-5x faster than listening to people talk, depending on the speed of the speaker (most of the people interviewed in these oral histories are quite old and can speak a bit slower), and I also retain the information much better. There is something about reading an actual conversation by someone who was there when this stuff was being invented (or literally invented it themselves) that you don't get from reading a retrospective historical account, and it makes the information stick with you more, since it's all framed in stories and personal accounts.

Some favorites:

https://conservancy.umn.edu/bitstream/handle/11299/107503/oh...

https://archive.computerhistory.org/resources/access/text/20...

https://conservancy.umn.edu/bitstream/handle/11299/107247/oh...

https://archive.computerhistory.org/resources/text/Oral_Hist...

http://archive.computerhistory.org/resources/text/Oral_Histo...

https://conservancy.umn.edu/bitstream/handle/11299/107613/oh...

https://digitalassets.lib.berkeley.edu/roho/ucb/text/valenti...

https://conservancy.umn.edu/bitstream/handle/11299/107642/oh...

There are so many other good ones, but that's a good start!

[+] RugnirViking|3 years ago|reply
There are also several good museums dedicated to the subject. I used to work for The National Museum of Computing at Bletchley Park in the UK, and they have a lot of good exhibits that teach the basics of how computers and networking work and how they have evolved over the years.

Another good approach one can take to learn is starting with a simple system with well-defined rules and making a simple computer out of it. Many people do this in Minecraft; for myself it was boolean functions in Excel. You can and should look many things up during this process, fail and rework designs several times, etc. Learning how logic gates work, then scaling the knowledge up to bit adders, registers, and an ALU, making a CPU instruction set, and starting on a basic Turing machine architecture is a very rewarding hobby and is definitely the best way to get low-level knowledge.
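That "scaling up" step - gates, then a half adder, then a full adder, then a multi-bit ripple-carry adder - can be sketched with nothing but boolean operations. A toy Python version (the same logic the Excel experiment models with spreadsheet formulas):

```python
# Toy sketch: scaling boolean operations up to a multi-bit adder.

def half_adder(a: int, b: int):
    """Add two bits: sum is XOR, carry is AND."""
    return a ^ b, a & b

def full_adder(a: int, b: int, carry_in: int):
    """Add two bits plus a carry, using two half adders."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2  # carry out if either stage carried

def ripple_add(xs, ys):
    """Add two equal-length bit lists, least-significant bit first,
    chaining the carry from each position into the next."""
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 + 5, bits LSB-first: 011 + 101 = 1000
print(ripple_add([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1] == 8
```

Each position is a pure function of three bits; the carry chain is the only "wire" between them, which is why the same design works whether the medium is a spreadsheet, redstone, or TTL chips.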

[+] bin_bash|3 years ago|reply
I second Bletchley. Consider it your obligatory pilgrimage as a computing professional.
[+] Tmkly|3 years ago|reply
I've been meaning to visit Bletchley! thanks, these suggestions are helpful
[+] Jtsummers|3 years ago|reply
So your title and comment suggest two slightly different things. For "how computers work?" I recommend Code by Petzold (higher level, good book) and The Elements of Computing Systems by Nisan and Schocken (also available here: https://www.nand2tetris.org/). The latter is project based and has you develop a computer starting at NAND gates and working up. It can be run through at a good clip while still learning a lot if you're a moderately experienced developer.

EDIT: Per Amazon there's a second edition of Code coming out at some point, but no date that I've been able to find.

I've also got a copy of, but have not yet read, Ideas That Created the Future: Classic Papers of Computer Science, edited by Harry R. Lewis. Its 46 papers are in chronological order, with the most recent from 1979; being largely historical, it ought to be a decent starting point as well.

[+] leohonexus|3 years ago|reply
I’d also add the book “But How Do It Know?” by J. Clark Scott as a fantastic primer, building from gates to RAM and CPU, to a simple bootloader and assembly programming. It comes with a CPU simulator on the book’s website so you can make sense of what you’re learning - and, being a light read, you could reasonably finish it in a week.

http://www.buthowdoitknow.com

[+] Tmkly|3 years ago|reply
yeah I'm not entirely sure what I want. Thanks for these suggestions, will take a look. nand2tetris looks cool!
[+] ecliptik|3 years ago|reply
I haven’t read it, but I’ve heard good things about “The Soul of a New Machine”, about Data General’s effort to create a new 32-bit superminicomputer.

https://www.tracykidder.com/the-soul-of-a-new-machine.html

Another comment mentioned “Pirates of Silicon Valley” as a good dramatization of MS/Apple and there’s also the miniseries “Valley of the Boom” about the rise and fall of Netscape and “Halt and Catch Fire” which is a fictional and thematic view of 80s/90s computer history.

[+] wantoncl|3 years ago|reply
Soul of a New Machine is an absolute classic. You'll learn just how much, and how little, computers and programming have changed since the 70s. And there are some interesting takes on microcoding; if you're already dealing with APIs on APIs, you'll get a better understanding.

If you want some Apple history, particularly on the early days of Macintosh, check out folklore.org. DO NOT start reading it if you have anything important to do for the next 24 hours.

[+] Anon84|3 years ago|reply
Feynman’s Lectures on Computation: https://www.amazon.com/gp/product/B07FJ6RRK7/ref=as_li_tl?ie...

You might be familiar with Feynman's Lectures on Physics, but his lectures on computation (based on a class he taught and his work on the Connection Machine) aren't any less amazing. Through this short book, Feynman guides us through the concept of computation and the von Neumann architecture in his unique style, from logic functions, to Turing machines, coding, and even quantum computers. It will give you a unique appreciation of the finer points in which computers are "dumb as hell but go like mad", so that you can better squeeze every bit of performance out of your code.

[+] Jach|3 years ago|reply
For "how things work", I recommend the book Code by Charles Petzold. After that, Jon Stokes's Inside the Machine will give a lot of details on CPU architectures up to Intel's Core 2 Duo. You can also try following along a computer engineering book if you want to go that low in detail with exercises, Digital Fundamentals by Floyd is a common textbook (I have an old 8th edition).

History-wise, enjoy learning slowly because there's so much that even if you dedicated yourself to it you wouldn't be "done" any time soon! Some suggestions in order though:

Watching The Mother of All Demos: https://www.youtube.com/watch?v=yJDv-zdhzMY

A short clip of Sketchpad presented by Alan Kay: https://www.youtube.com/watch?v=495nCzxM9PI

An article from the 40s that also inspired Engelbart: https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...

The Information by James Gleick

What the Dormouse Said by John Markoff

The Psychology of Computer Programming by Gerald Weinberg

Lastly, to mix up in whatever order you please, some paper collections:

Object-Oriented Programming: The CLOS Perspective edited by Andreas Paepcke

History of Programming Languages papers for the various languages you're interested in; here's the set from the second conference in 1993: https://dl.acm.org/doi/proceedings/10.1145/154766 - but there have been further conferences to check out too if it's interesting

Also all of the Turing Award winners' lectures I've read have been good https://amturing.acm.org/lectures.cfm

All that and some good recommendations others have given should keep you busy for a while!

[+] netsharc|3 years ago|reply
A long while ago I found the Jargon File, a "dictionary" of terms used by hackers as the culture was budding at the universities in the '70s. Reading the entries, you get a glimpse of the technology and culture of those places at that time. Young me found it really cool in a nerdy way, and read all the entries from front to back. Since this was before the always-online times, I was just reading the TXT file from http://jargon-file.org/archive/ rather than needing to navigate the many pages: http://www.catb.org/~esr/jargon/
[+] Tmkly|3 years ago|reply
that's great, will take a look. thanks
[+] chillpenguin|3 years ago|reply
In terms of computing history, The Dream Machine by Mitchell Waldrop is incredibly good.

In terms of "how computers work" I agree with others who recommended Elements of Computing Systems (aka nand2tetris).

[+] quirino|3 years ago|reply
The Dream Machine is easily one of the best-written books I've ever read. Really gives a good overview of how many people are involved in various ways in the evolution of computing.
[+] khaledh|3 years ago|reply
I have the same passion about computing history. I can't count the amount of literature I've read to learn about this fascinating history; it's very satisfying to know when, how, where, and by whom original work was done to advance computing. Most of the foundational work in computer architecture and computer science was done in the 50s, 60s, and 70s. From there, it has been incremental improvements.

I highly recommend reading "The Dream Machine" by Mitchell Waldrop. It's very well written, and covers a huge swath of computing history, from the ENIAC to the Internet (it was written in 2000).

Instead of recommending specific sources (too many), I can mention key milestones in computing history that you may want to research:

- Theory of computation (Alan Turing, Alonzo Church)

- Early binary systems (John Atanasoff, Konrad Zuse, George Stibitz, Claude Shannon)

- Early computers (ABC, ENIAC, EDSAC, EDVAC, Von Neumann architecture)

- Early programming (Assembly language, David Wheeler, Nathaniel Rochester)

- Early interactive computing (MIT Whirlwind, SAGE, TX-0, TX-2)

- Early mainframes (UNIVAC, IBM 70x series)

- Early programming languages (Speedcoding, Autocode, A-0, A-2, MATH-MATIC, FLOW-MATIC)

- First programming languages (FORTRAN, COBOL, LISP, ALGOL)

- Early operating systems (GM-NAA I/O, BESYS, SOS, IBSYS, FMS)

- Early time-sharing system (MIT CTSS, Multics, DTSS, Berkeley TSS, IBM CP-67)

- Early Virtual Memory (Atlas, Burroughs MCP)

- Early minicomputers (DEC PDP line)

- Mainframe operating systems (IBM OS/360, UNIVAC EXEC)

- Early online transaction processing (SABRE, IBM ACP/TPF)

- Early work on concurrency (Edsger Dijkstra, C.A.R. Hoare, Per Brinch Hansen)

- Early database systems (GE IDS, IBM IMS, CODASYL)

- Early Object-Oriented Programming (Simula I, Simula 67, Smalltalk)

- More programming languages (CPL, BCPL, B, C, BASIC, PL/I)

- Mini/Supermini operating systems (Tenex, TOPS-20, VMS)

- Structured Programming (Pascal, Modula, Niklaus Wirth)

- Relational data model and SQL (Codd, Chamberlin, Boyce)

I could keep going on, but this is already too long. I hope this at least puts your feet on the first steps.

[+] criddell|3 years ago|reply
I’ll second your recommendation for The Dream Machine. Unless your vision is very good, I’d recommend getting it as an ebook. The book from Stripe Press is beautiful, but the text is pretty tiny.
[+] listenfaster|3 years ago|reply
Good: split your time between activities and reading something as satisfying as the things you “devour”. To that end, I would plus one Hackers (Levy) and Code (Petzold). Also, the Cathedral and the Bazaar by esr

http://www.catb.org/~esr/writings/cathedral-bazaar/

and other things from esr at

http://www.catb.org/~esr/

including the aforementioned jargon file. Here’s one I hadn’t stumbled on before, ‘Things Every Hacker Once Knew’

http://www.catb.org/~esr/faqs/things-every-hacker-once-knew/

For an activity, ymmv depending on how much time you can spend; an alternative to building a computer from scratch, or an OS from scratch, is to buy a vintage cheapie running CP/M or DOS, something where the OS isn't abstracting memory management for you. Growing up in the 80s, I think managing my own memory, and _everything_ that implies, was the greatest teacher.

[+] AdamH12113|3 years ago|reply
Having gotten into computers in the early 1990s, I knew a lot of "Things Every Hacker Once Knew", but I did find something exciting that I didn't know in the discussion of ASCII control characters:

>ETB (End of Transmission Block) = Ctrl-W

>Nowadays this is usually "kill window" on a web browser, but it used to mean "delete previous word" in some contexts and sometimes still does.

I tried Ctrl-W in a Linux console and it works! This will save me some trouble in the future.

[+] shadowofneptune|3 years ago|reply
Though the cathedral/bazaar terminology is influential, I am not sure reading the original text helps understand open source as it actually is now. It concludes that open source would drive out closed source software, when what we see is open libraries being much more popular than open applications. This is in part due to Raymond's own work at the Open Source Initiative. It'd probably be better for someone now to read a retrospective rather than a treatise.
[+] Tmkly|3 years ago|reply
these resources look great, thanks. also yeah, great suggestion for buying a cheap vintage microcomputer!
[+] als0|3 years ago|reply
If you can manage a day trip to Cambridge (about an hour from London), you should visit the excellent Museum of Computing History http://www.computinghistory.org.uk/
[+] cameronh90|3 years ago|reply
National Museum of Computing at Bletchley Park is also excellent.
[+] bingaling|3 years ago|reply
The 1992 WGBH/BBC 5-part miniseries "The Machine That Changed The World"(US)/"The Dream Machine"(UK):

https://en.wikipedia.org/wiki/The_Machine_That_Changed_the_W...

is out of print, but can be found intermittently on YouTube.

I love the coverage of 1940's computing, with interviews with several of the surviving people:

https://en.wikipedia.org/wiki/Konrad_Zuse

https://en.wikipedia.org/wiki/ENIAC

https://en.wikipedia.org/wiki/Eckert%E2%80%93Mauchly_Compute...

https://en.wikipedia.org/wiki/EDSAC

Currently working episode links:

1: https://www.youtube.com/watch?v=hayi9AsDXDo

2: https://www.youtube.com/watch?v=GropWVbj9wA

3: https://www.youtube.com/watch?v=rTLgAI3G_rs

4: https://www.youtube.com/watch?v=E1zbCU5JnE0

5: https://www.youtube.com/watch?v=vuxYUJv2Jd4

[+] spogbiper|3 years ago|reply
The Advent of Computing podcast may be of interest. The host really strives to find accurate historical information about a variety of early computing topics.

https://adventofcomputing.com/

It's also fairly entertaining

[+] digisign|3 years ago|reply
Was just showing the subject to a youngster recently. Other folks mentioned the Code book; I liked that one. The Mythical Man-Month by Brooks, of course. We also looked at the following videos on YouTube/Kanopy and other places:

- The Story of Math(s) by Marcus du Sautoy to set the stage... school and taxes in ancient Sumeria, Fibonacci bringing Indian numbers to Europe, and other fascinating subjects.

- We watched short biographies of Babbage and Lovelace, full-length ones of Turing and Von Neumann. The "code breakers" of WWII.

- Top Secret Rosies: The Female "Computers" of WWII, another good one.

- There's more history in PBS' Crash Course Computer Science than you might expect. It is great, although so peppy we had to watch at 0.9x with NewPipe. It goes from relays, to vacuum tubes, to ICs, to the Raspberry Pi, as well as the logic gates they model.

- "The Professor" at Computerphile is a great storyteller about the early days.

- There are great videos about CTSS being developed at MIT I think, where they are designing an operating system via paper terminal and trying to decide on how to partition the memory/storage: https://www.youtube.com/watch?v=Q07PhW5sCEk

- The Introducing Unix videos by AT&T are straight from the source: https://www.youtube.com/watch?v=tc4ROCJYbm0

- The movie/book "Hidden Figures" touches on this time as well. Facing obsolescence because of an incoming IBM machine, one of the characters teaches herself Fortran.

- The Pirates of Silicon Valley is a fun dramatization of the late 70s to 80s PC industry. It specifically calls out the meeting between MS and IBM as the deal of the century. We also watched a "Berkeley in '68" doc on Kanopy to set the stage before this one. Interesting, but a tangent.

- The "8-bit Guy" is also great; he dissects and rebuilds old home computer hardware from the same era, and teaches its history as he does it. Even his tangential videos, like the one on why there are no more electronics stores (besides Apple) in malls, are great.

- There are good docs on the "dead ends" of the industry as well, such as "General Magic" and "Silicon Cowboys."

- "Revolution OS" a doc about the beginnings of FLOSS and Linux.