Claude Shannon's master's thesis applied Boolean algebra to relay and switching circuits, effectively founding digital circuit design. That should have been enough for anyone, but not for Shannon. His later work on information theory has proven important in everything from evolution to quantum mechanics (particularly quantum relative entropy) and perhaps even to future physics (https://www.youtube.com/watch?v=td1fz5NLjQs).
He also did early work in cryptanalysis and AI (the minimax chess algorithm and a learning robotic mouse). Ivan Sutherland was his student. More or less, whether it's networking, signal processing, compression, crypto, machine learning, or circuit design, basically anything to do with the digital age, you'll find Shannon did important foundational work there.
If I recall, Turing actually started to formalize some ideas in information theory but stopped after meeting Shannon. Shannon showed Turing his information theory work, and Turing decided Shannon had already solved the question he was interested in.
It is a shame Shannon doesn't get the recognition he deserves. Computer science has many fathers, and as amazing as Turing was, nobody contributed more, and nobody has a better claim to call himself the father of computer science, than Shannon. Not Turing. Not Church. He outstrips them all, but nobody outside of the computer science world has heard of him.
I cannot overstate this point or upvote it enough.
The reason Claude Shannon is a legend has a lot to do with the fact that his ideas are not just correct and drawn from his multidisciplinary knowledge, but also expertly communicated. It is an incredibly readable paper. The only place it absolutely requires a little mathematics is where he describes how to convert a state machine for Morse code into a matrix, which lets you calculate how many bits per time unit can actually be transmitted in Morse code; even that is not essential, since he derives the number another way as well and makes clear you can simply take his word for its value. A lot of the arguments about fitting a stream of symbols into an encoding for noisy channels have these lovely diagrams that help to elucidate exactly what he's talking about.
If you want to make that same sort of impact, it is not just important to be a great mind, but to spend a bunch of time practicing how you communicate that information.
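To make the matrix trick concrete, here is a minimal sketch in Python (assuming numpy; a toy alphabet of my own, not Shannon's full Morse example, which adds letter and word spaces and a state machine forbidding consecutive spaces). Take two freely concatenable symbols, a dot lasting one time unit and a dash lasting two; capacity in bits per time unit is the base-2 log of the largest eigenvalue of the transfer matrix, which is the method the paper uses.

    import numpy as np

    # States: 0 = free to start a new symbol, 1 = one time unit of a
    # dash still pending. From state 0 a dot returns to state 0 and a
    # dash moves to state 1; from state 1 the dash simply finishes.
    T = np.array([[1, 1],
                  [1, 0]])

    # Capacity = log2 of the largest eigenvalue (the spectral radius).
    capacity = np.log2(max(abs(np.linalg.eigvals(T))))
    print(round(capacity, 3))  # 0.694, i.e. log2 of the golden ratio

The count of valid sequences grows like the Fibonacci numbers here, which is why the golden ratio appears; Shannon's actual Morse matrix just has more states.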
"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning..."
(emphasis in the original)
I find that second sentence humorous every time I read it... I wonder whether, if he had seen the current state of internet discussion, he would have changed "frequently" to "occasionally".
The ingenuity and brilliance coming from Bell Labs during that era are absolutely astonishing. Transistors, information theory, satellite communication, UNIX/C, the list goes on. These ideas unquestionably laid the foundation for modern high-tech society.
If anyone is interested in learning more about Bell Labs and the folks who worked there, “The Idea Factory” by Jon Gertner is a fantastic book written on the subject. It’s not comprehensive but it’s a very inspiring read.
There has always been something very vexing about Bell Labs’ legacy though. They had everything they needed to start the personal computing revolution: engineers, scientists, equipment, a nationwide telephone network for god’s sake. What happened?
The Bell System was precluded from entering many markets (and forced to offer its patents to anyone) by the 1956 consent decree that permitted its ongoing monopoly in telecommunications.

A formal study of some of the consequences:

https://economics.yale.edu/sites/default/files/how_antitrust...
I heartily second the recommendation for "The Idea Factory"! I'm currently reading this book and aside from the seriously impressive run of successes, the characters are really quite amusing at times too. E.g., Shannon built a desktop calculator that operated using Roman numerals only ("THROBAC") in order to amuse himself. And some of the whimsical creations were pretty impressive in their own right. E.g., his maze-solving mouse "Theseus" learned the maze layout on progressive runs through the maze by using relay-based logic.
Bell Labs continued doing foundational research unfettered by marketability for a long time after the conventional story has them buried and gone. (To pick a personal favourite, B. F. Logan's work on Click Modulation is a bizarro-world stumble through arcane mathematics that pops out somewhere unexpected and valuable in the field of signal processing.)
Unfortunately, a lot of this research was buried in microfiche in university basements as Bell Labs' legacy was traded from company to company without a viable distribution mechanism. As a result, although this work is now all available on-line (sadly not in open-access form), there are still more forgotten gems in there than many researchers realize.
Even though that brought the lab's demise, its then-younger staff and their mentees continued their work at other organizations, including many at Google.
Xerox PARC had early personal computers and developed GUI- and mouse-driven software: email, a word processor, paint programs, and an IDE meant for kids to be able to learn and use. But it was some college dropouts in their garages who brought it to the masses.
One of the more surprising things I remember them researching (in their earlier years) was which kinds of wood to use for telephone poles. IIRC they ran experiments for several years to see which type of wood lasted the longest in different weather conditions, all to drive down costs in the long term.
A new Shannon biography, A Mind at Play: How Claude Shannon Invented the Information Age, may help reverse this legacy. Authors Jimmy Soni and Rob Goodman make a strong bid to expose Shannon’s work to a popular audience, balancing a chronological narrative, the “Eureka!” moments that sprang from his disciplined approach to solving puzzles, and his propensity for playfulness.
Another book with a lengthy section on Claude Shannon is James Gleick's The Information: A History, a Theory, a Flood [1]. A shorter but still nice explanation is on Brain Pickings [2].

[1] https://www.goodreads.com/book/show/8701960-the-information
[2] https://www.brainpickings.org/2016/09/06/james-gleick-the-in...
Is he really unknown? I know most of us don't do much signal processing these days, but you can't get far in studying compression without hearing about Shannon.
A Mind At Play is an excellent biography of a fascinating mathematician and visionary thinker. Ranges from early life through final years and also does a decent job of introducing his Mathematical Theory of Communication, the basis of modern information theory. A solid 4½ stars.
Shannon was a speaker at my commencement at CMU. He struck me as kind of a self-effacing, amusing sort of character. Maybe like a much quieter, less ego-driven version of Feynman.
Also: as much as people talk about information theory in various contexts, I doubt that many take the trouble to understand it better. Like thermodynamics, people want to take away an overly broad, folksy interpretation and apply it everywhere without stopping to think if it really applies in the way they claim.
I'm not sure what you mean here, so it may be worth elaborating. I think a lot of people understand the theory, and certainly in communication theory everyone understands the famous equation. I don't see anyone claiming it applies in places it doesn't.
OK, his fundamental theorems aren't strictly binary. He wrote of discerning possible 'symbols' from noisy measurements, where a symbol could have N possible values. The optimum discernment was at two values (1 and 0), but his math handled any base. If I remember right, after 20 years.
"Anything can be communicated over a noisy channel without error" - Was something my Elec 201 professor would confidently proclaim before he would go into a lengthy aside about how amazing Claude Shannon is.
That was nearly 20 years ago.
Every day I appreciate that statement, and Claude Shannon, a little bit more.
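The precise statement behind that proclamation is the noisy-channel coding theorem: at any rate below the channel capacity, the error probability can be made as small as you like. A minimal sketch in Python for the binary symmetric channel (my example, not the professor's), where each bit flips independently with probability p and the capacity is C = 1 - H(p):

    from math import log2

    def binary_entropy(p):
        # H(p) in bits, with 0 * log2(0) taken to be 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p):
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.01, 0.11, 0.5):
        print(p, round(bsc_capacity(p), 3))

At p = 0.5 the capacity is zero; pure noise defeats every code, so "anything" really means "anything, at a rate below capacity".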
He also showed the limits of the efficient market hypothesis in economics. Add in Einstein (information cannot travel faster than the speed of light) and Kolmogorov (random information cannot be compressed) to get 90% of why markets cannot be fully efficient.
It's been forever since I read it, but Fortune's Formula [1] is an entertaining read that goes into Shannon's investment methodology, alongside Edward Thorp [2].

[1] https://www.amazon.com/Fortunes-Formula-Scientific-Betting-C...
[2] https://en.wikipedia.org/wiki/Edward_O._Thorp
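The formula of the title is the Kelly criterion, which John Kelly derived at Bell Labs directly from Shannon's information theory: on a repeated bet with win probability p and net odds b, long-run growth is maximized by staking the fraction f* = (bp - (1 - p)) / b of your bankroll. A minimal sketch in Python (the numbers are illustrative, not from the book):

    def kelly_fraction(p, b):
        # Growth-optimal fraction of bankroll to stake on a bet paying
        # b-to-1 that wins with probability p; 0 means don't bet at all.
        return max(0.0, (b * p - (1.0 - p)) / b)

    # A 55% coin at even odds: stake 10% of the bankroll each round.
    print(round(kelly_fraction(0.55, 1.0), 2))  # 0.1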
When I saw that master's thesis, I literally hyperventilated.

There's a kind of paradox of Shannon entropy found in communications with double entendres, I think, in that they appear to increase both redundancy and entropy. I touched on that in this essay proposing what I call subtractive adversarial networks:

https://medium.com/@_NicT_/subtractive-adversarial-networks-...
Not just at the Kindle store. The eBook is also on sale at Apple Books, Barnes & Noble, BAM!, Google Play, and Kobo. They are all $3.99, same as the Kindle sale price.
I read this shortly after the Internet History Podcast episode with the authors. It is a good, detailed but bite-sized book. Very inspiring and very well written.
Claude Shannon had an interest in AI and had a hobby of making electronic game machines, most of which survive to this day in the MIT Museum. A list with descriptions and photos is available here:

https://www.boardgamegeek.com/geeklist/143233/claude-shannon...
The odd thing about Shannon information is that a coin flip can have just as much information as text, yet the former would intuitively seem devoid of information and the latter information-rich. It is unclear what exactly the relationship is between Shannon information and what we intuitively consider information. Any ideas?
Shannon information deliberately concerns only syntactic information (information content). Other, more recent work focuses on semantic information (information with meaning for a receiver).

> Shannon information theory provides various measures of so-called "syntactic information", which reflect the amount of statistical correlation between systems. In contrast, the concept of "semantic information" refers to those correlations which carry significance or "meaning" for a given system. Semantic information plays an important role in many fields, including biology, cognitive science, and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. "Causal necessity" is defined in terms of counter-factual interventions which scramble correlations between the system and its environment, while "maintaining existence" is defined in terms of the system's ability to keep itself in a low entropy state.

https://arxiv.org/abs/1806.08053

Roughly speaking: the amount of computation or energy needed to perfectly reproduce a random source, such as a coin flip, is high, while the significance or meaning, for the average receiver, is low. Natural-language text requires less computation to reproduce [1], but, for the average receiver, the significance is higher.

[1] http://languagelog.ldc.upenn.edu/myl/Shannon1950.pdf

See also: https://arxiv.org/abs/physics/0004057
Rather than see the coin flip as a useless toss, one might imagine it as a binary decision. This or that?
With this in mind, you can see text as the output of an algorithm (a brain) taking many such decisions. The information entropy of the text reveals the complexity (the number of distinct binary decisions) that needed to take place in the machine (the brain) in order for the text to occur.
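One way to see both points at once is to measure it. A minimal sketch in Python (the text sample and the zeroth-order estimator are just illustrative): the fair coin hits the maximum entropy for its two-symbol alphabet, while text sits well below the maximum for its alphabet, and that gap is the redundancy Shannon measured. His 1950 experiments put English near one bit per letter once longer-range structure is taken into account.

    import random
    from collections import Counter
    from math import log2

    def entropy_per_symbol(seq):
        # Zeroth-order (single-symbol) entropy estimate, in bits.
        counts = Counter(seq)
        n = len(seq)
        return -sum(c / n * log2(c / n) for c in counts.values())

    coin = [random.choice("HT") for _ in range(100_000)]
    text = ("the fundamental problem of communication is that of "
            "reproducing at one point a message selected at another ") * 50

    print(round(entropy_per_symbol(coin), 2))  # ~1.0 out of a possible 1.0
    print(round(entropy_per_symbol(text), 2))  # ~4 out of log2(27) = 4.75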
This is a controversial opinion, but: Claude Shannon is more noteworthy than Einstein. It's simply the case that engineering science is not taken seriously by the scientific community, despite its overwhelming contributions to human development, namely digital signal processing and computer engineering.
At my uni they only had communications theory, which covered stuff like software defined radios. Information theory was a significant part of it though.
I would argue Shannon's case is far from unique. For my money, Gauss was the greatest mathematical mind of all time, but does the general public know of him?