
The Man Who Invented Information Theory (2017)

380 points | huihuiilly | 7 years ago | bostonreview.net

100 comments

[+] Cybiote|7 years ago|reply
Claude Shannon's master's thesis was on the application of Boolean algebra to circuits, effectively founding digital circuit design. That should have been enough for anyone, but not for Shannon. His later work on information theory has proven important in everything from evolution to quantum mechanics (particularly, relative quantum entropy) and perhaps even to future physics (https://www.youtube.com/watch?v=td1fz5NLjQs).

He also did early work in cryptanalysis and AI (the minimax chess algorithm and a learning robotic mouse). Ivan Sutherland was his student. More or less, whether it's networking, signal processing, compression, crypto, machine learning, or circuit design, basically anything to do with the digital age, you'll find Shannon did important foundational work there.

[+] tnecniv|7 years ago|reply
If I recall, Turing actually started to formalize some ideas in information theory, but stopped after meeting Shannon. Shannon showed Turing his information theory work, and Turing decided Shannon had already solved the question he was interested in.
[+] porpoisely|7 years ago|reply
It is a shame Shannon doesn't get the recognition he deserves. Computer science has many fathers, and as amazing as Turing was, nobody contributed more and nobody has a better claim to the title of father of computer science than Shannon. Not Turing. Not Church. He outstrips them all, but nobody outside of the computer science world has heard of him.
[+] jiggunjer|7 years ago|reply
And he was handsome too! Some guys have it all.
[+] lixtra|7 years ago|reply
"A Mathematical Theory of Communication" is a beautiful read. You can easily find the PDF, but at one time the LaTeX source was available as well - in case you want to reformat it for an e-book reader: https://web.archive.org/web/20130129025547/http://cm.bell-la...
[+] crdrost|7 years ago|reply
I cannot overstate this point and upvote it enough.

The reason Claude Shannon is a legend has a lot to do with the fact that his ideas are not just correct and drawn from his multidisciplinary knowledge, but also expertly communicated. It is an incredibly readable paper. The only place it truly requires some mathematics is where he describes converting a state machine for Morse code into a matrix that lets you calculate how many bits per unit of time can actually be transmitted in Morse code -- and even that is not essential: he makes it clear you can simply take his word for the resulting numerical value. A lot of the arguments about fitting a stream of symbols into an encoding for noisy channels come with lovely diagrams that help elucidate exactly what he's talking about.

If you want to make that same sort of impact, it is not just important to be a great mind, but to spend a bunch of time practicing how you communicate that information.
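(For anyone curious about the capacity calculation mentioned above: for the simpler, unconstrained case of Shannon's Theorem 1, where each channel symbol just has a fixed duration, the capacity in bits per unit time is log2 of the largest real root X0 of sum(X**-t_i) = 1. A minimal Python sketch, solving for the root by bisection; the function name and example durations are illustrative, not from the paper:)

```python
import math

def capacity(durations, lo=1.0, hi=2.0, iters=100):
    """Capacity (bits per unit time) of a noiseless channel whose
    symbols have the given durations and no sequencing constraints.

    Finds the largest real root X0 of sum(X**-t) = 1 by bisection;
    capacity is log2(X0) (cf. Shannon's Theorem 1)."""
    f = lambda x: sum(x ** -t for t in durations) - 1
    # f is strictly decreasing for x > 1, so widen hi until f(hi) <= 0
    while f(hi) > 0:
        hi *= 2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2((lo + hi) / 2)

# Sanity check: two symbols, each of duration 1 (i.e. plain binary)
# should give exactly 1 bit per unit time.
print(capacity([1, 1]))  # → 1.0
```

The constrained Morse-code case in the paper replaces the single equation with the matrix/state-machine calculation crdrost describes, but the unconstrained version already shows the flavor of the argument.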

[+] dbcurtis|7 years ago|reply
"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have *meaning*..."

(emphasis in the original)

I find that second sentence humorous every time I read it... I wonder whether, had he seen the current state of internet discussion, he would have changed "frequently" to "occasionally".

[+] Zanneth|7 years ago|reply
The ingenuity and brilliance coming from Bell Labs during that era is absolutely astonishing. Transistors, information theory, satellite communication, UNIX/C, the list goes on. These ideas unquestionably laid the foundation for modern high-tech society.

If anyone is interested in learning more about Bell Labs and the folks who worked there, “The Idea Factory” by Jon Gertner is a fantastic book written on the subject. It’s not comprehensive but it’s a very inspiring read.

There has always been something very vexing about Bell Labs’ legacy though. They had everything they needed to start the personal computing revolution: engineers, scientists, equipment, a nationwide telephone network for god’s sake. What happened?

[+] cfallin|7 years ago|reply
I heartily second the recommendation for "The Idea Factory"! I'm currently reading it, and aside from the seriously impressive run of successes, the characters are really quite amusing at times too. E.g., Shannon built a desktop calculator ("THROBAC") that operated using Roman numerals only, just to amuse himself. And some of the whimsical creations were pretty impressive in their own right. E.g., his maze-solving mouse "Theseus" learned the maze layout over successive runs using relay-based logic.
[+] gsmecher|7 years ago|reply
Bell Labs continued doing foundational research unfettered by marketability for a long time after the conventional story has them buried and gone. (To pick a personal favourite, B. F. Logan's work on Click Modulation is a bizarro-world stumble through arcane mathematics that pops out somewhere unexpected and valuable in the field of signal processing.)

Unfortunately, a lot of this research was buried in microfiche in university basements as Bell Labs' legacy was traded from company to company without a viable distribution mechanism. As a result, although this work is now all available on-line (sadly not in open-access form), there are still more forgotten gems in there than many researchers realize.

[+] gowld|7 years ago|reply
They had a monopoly and an innovator's dilemma.

Even though that brought the lab's demise, its then-younger staff and their mentees continued their work at other organizations, including many at Google.

[+] goatlover|7 years ago|reply
Xerox PARC had early personal computers and developed GUI/mouse-driven software with email, a word processor, paint programs, and an IDE meant for kids to learn and use. But it was some college dropouts in their garages who brought it to the masses.
[+] dmux|7 years ago|reply
One of the more surprising things I remember them researching (in their earlier years) was which woods to use for telephone poles. IIRC they ran experiments for several years to see which type of wood lasted longest in different weather conditions. All this to drive down costs in the long term.
[+] davedx|7 years ago|reply
Same thing that happened with Xerox...
[+] adolph|7 years ago|reply
A new Shannon biography, A Mind at Play: How Claude Shannon Invented the Information Age, may help reverse this legacy. Authors Jimmy Soni and Rob Goodman make a strong bid to expose Shannon’s work to a popular audience, balancing a chronological narrative, the “Eureka!” moments that sprang from his disciplined approach to solving puzzles, and his propensity for playfulness.

Another book with a lengthy section on Claude Shannon is James Gleick's The Information: A History, a Theory, a Flood [1]. A shorter but still nice explanation is on Brain Pickings [2].

1. https://www.goodreads.com/book/show/8701960-the-information
2. https://www.brainpickings.org/2016/09/06/james-gleick-the-in...

[+] hinkley|7 years ago|reply
Is he really unknown? I know most of us don’t do much signal processing these days but you can’t get far in studying compression without hearing about Shannon.
[+] udfalkso|7 years ago|reply
The Information is an incredible book. I highly recommend.
[+] ThomDenholm|7 years ago|reply
A Mind At Play is an excellent biography of a fascinating mathematician and visionary thinker. Ranges from early life through final years and also does a decent job of introducing his Mathematical Theory of Communication, the basis of modern information theory. A solid 4½ stars.
[+] Isamu|7 years ago|reply
Shannon was a speaker at my commencement at CMU. He struck me as kind of a self-effacing, amusing sort of character. Maybe like a much quieter, less ego-driven version of Feynman.

Also: as much as people talk about information theory in various contexts, I doubt that many take the trouble to understand it properly. As with thermodynamics, people want to take away an overly broad, folksy interpretation and apply it everywhere without stopping to think whether it really applies in the way they claim.

[+] shaklee3|7 years ago|reply
I'm not sure what you mean here, so it may be worth elaborating. I think a lot of people understand the theory, and certainly in communication theory everyone understands the famous equation. I don't see anyone claiming it applies where it doesn't.
[+] JoeAltmaier|7 years ago|reply
OK, his fundamental theorems aren't strictly binary. He wrote of discerning possible 'symbols' from noisy measurements, where a symbol could have N possible values. The optimum discernment was at 2 values (1 and 0), but his math handled any base.

If I remember right after 20 years.
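(The "any base" part is easy to see in the entropy formula itself: log base 2 gives bits, but nothing forces binary; log base N measures information in N-ary symbols. A minimal sketch, with illustrative function and variable names:)

```python
import math

def entropy(probs, base=2):
    # Shannon entropy of a distribution; base 2 gives bits,
    # base N measures information in N-ary symbols.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform choice among 4 outcomes is 2 bits, or exactly
# 1 quaternary (base-4) symbol -- the same information, rebased.
uniform4 = [0.25] * 4
print(entropy(uniform4, base=2))  # ≈ 2.0 bits
print(entropy(uniform4, base=4))  # ≈ 1.0 quaternary digits
```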

[+] an-allen|7 years ago|reply
"Anything can be communicated over a noisy channel without error" was something my Elec 201 professor would confidently proclaim before going into a lengthy aside about how amazing Claude Shannon is.

That was nearly 20 years ago.

Every day I appreciate that statement, and Claude Shannon, a little bit more.
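(The precise statement is slightly narrower: at any rate below the channel capacity, the error probability can be made arbitrarily small. For the binary symmetric channel, that capacity has a closed form, C = 1 - H(p) bits per use. A minimal sketch; the function names are mine, not from any source here:)

```python
import math

def h2(p):
    # Binary entropy function, in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel that flips each
    # bit with probability p: C = 1 - H(p) bits per channel use.
    return 1 - h2(p)

print(bsc_capacity(0.0))   # noiseless: 1 bit per use
print(bsc_capacity(0.11))  # roughly 0.5 bits per use
print(bsc_capacity(0.5))   # pure noise: capacity 0
```

Note that even a channel flipping 11% of its bits still carries about half a bit per use without error, provided you code below that rate; that is the startling part of the theorem.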

[+] crb002|7 years ago|reply
He also showed the limits of the efficient market hypothesis in economics. Add in Einstein (information cannot travel faster than the speed of light), and Kolmogorov (random information cannot be compressed) to get 90% of why markets cannot be fully efficient.
[+] argonium|7 years ago|reply
There's a fantastic book on Shannon called "Grammatical Man", by Jeremy Campbell - https://www.amazon.com/dp/B000KHFL0W
[+] ThomDenholm|7 years ago|reply
While this book was originally published in 1982, it contains a lot of great information that is still relevant today. I should re-read it...
[+] dsimms|7 years ago|reply
OMG, I totally forgot about this book. I loved it! Now, downloading ebook to reread it.
[+] cl0ne|7 years ago|reply
This is a fascinating book.
[+] jshowa3|7 years ago|reply
One of my heroes. I have his picture as my phone background so that I'm constantly reminded of the minds that built the device I use every day.

When I saw that master's thesis, I literally hyperventilated.

[+] pstew|7 years ago|reply
The biography they reference, "A Mind at Play: How Claude Shannon Invented the Information Age", is on sale in the Kindle store right now: https://www.amazon.com/Mind-Play-Shannon-Invented-Informatio...
[+] tzs|7 years ago|reply
Not just at the Kindle store. The eBook is also on sale at Apple Books, Barnes & Noble, BAM!, Google Play, and Kobo. They are all $3.99, same as the Kindle sale price.
[+] d1str0|7 years ago|reply
I read this shortly after the Internet History Podcast episode with the authors. It is a good, detailed but bite-sized book. Very inspiring and very well written.
[+] purple_ducks|7 years ago|reply
Annoyingly, that book has never had a Kindle edition available in the UK Amazon.
[+] yters|7 years ago|reply
The odd thing about Shannon information is that a coin flip can carry just as much information as text, yet the former intuitively seems devoid of information and the latter information-rich. It is unclear what exactly the relationship is between Shannon information and what we intuitively consider information. Any ideas?
[+] princeofwands|7 years ago|reply
Shannon information deliberately only concerns syntactic information (information content). Other, more recent work focuses on semantic information (information with meaning for a receiver).

> Shannon information theory provides various measures of so-called "syntactic information", which reflect the amount of statistical correlation between systems. In contrast, the concept of "semantic information" refers to those correlations which carry significance or "meaning" for a given system. Semantic information plays an important role in many fields, including biology, cognitive science, and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. "Causal necessity" is defined in terms of counter-factual interventions which scramble correlations between the system and its environment, while "maintaining existence" is defined in terms of the system's ability to keep itself in a low entropy state.

https://arxiv.org/abs/1806.08053

Roughly speaking: The amount of computation or energy needed to perfectly reproduce a random source, such as a coin flip, is high, while the significance or meaning, for the average receiver, is low. Natural language text requires less computation to reproduce [1], but, for the average receiver, the significance is higher.

[1] http://languagelog.ldc.upenn.edu/myl/Shannon1950.pdf

[+] chronolitus|7 years ago|reply
Rather than see the coin flip as a useless toss, one might imagine it as a binary decision. This or that?

With this in mind, you can see text as the output of an algorithm (a brain) making many such decisions. The information entropy of the text reveals the complexity (the number of distinct binary decisions) that needed to take place in the machine (brain) for the text to occur.
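(You can make the coin-vs-text comparison concrete by measuring the per-character entropy of each stream. A minimal sketch, using only the zeroth-order character frequencies; note this overestimates English, which Shannon's own prediction experiments, like the 1950 paper linked upthread, put closer to ~1 bit per character once context is accounted for:)

```python
import math
from collections import Counter

def empirical_entropy(s):
    # Per-symbol (zeroth-order) entropy in bits of the
    # empirical character distribution of the string s
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

# Fair coin flips: maximal entropy for a two-symbol alphabet
print(empirical_entropy("01" * 50))  # → 1.0

# English text sits well below log2(26) ≈ 4.7 bits per letter,
# because letter frequencies are far from uniform
print(empirical_entropy("the quick brown fox jumps over the lazy dog"))
```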

[+] achillesheels|7 years ago|reply
This is a controversial opinion, but: Claude Shannon is more noteworthy than Einstein. It's simply the case that engineering science is not taken seriously by the scientific community despite its overwhelming contributions to human development, namely digital signal processing and computer engineering.
[+] victornomad|7 years ago|reply
Information theory was one of my favorite subjects @ uni. Wish I had more time to dig into it now!
[+] llamaz|7 years ago|reply
What did you cover?

At my uni they only had communications theory, which covered stuff like software-defined radios. Information theory was a significant part of it, though.

[+] pbhowmic|7 years ago|reply
I would argue Shannon's case is far from unique. For my money, Gauss was the greatest mathematical mind of all time, but does the general public know of him?
[+] vixen99|7 years ago|reply
Dover's excellent 'An Introduction to Information Theory' by John Pierce is dedicated to Claude and Betty Shannon.