Claude Shannon

13 Podcast Episodes

Episode 103: InfC # 103 - Claude Shannon

Intervalo de Confiança

Today is a day for "Influencers da Ciência" ("Science Influencers"), a spin-off of the podcast "Intervalo de Confiança". In this program we feature influencers who genuinely brought something positive to society, those who expanded the frontiers of scientific knowledge and made possible the development of many fields. In this episode, Igor Alcantara talks about one of the most important scientists of the 20th century, someone whom some rank alongside Albert Einstein and Isaac Newton. His discoveries enabled the development of computers, the Internet, and even Artificial Intelligence. Without his work, none of today's digital technology or modern communication would be possible. For this and other reasons, we like to say that Claude Shannon is the scientist who invented the future. This program was recorded by Igor Alcantara, who also wrote the script. Editing was done by Leo Oliveira, and the episode cover art by Diego Madeira. The theme music for all episodes was composed by Rafael Chino and Leo Oliveira. Visit our website at https://intervalodeconfianca.com.br

1hr 3mins

9 Sep 2021

TechStuff Classic: Who was Claude Shannon?

TechStuff

TechStuff salutes an incredibly influential (and yet relatively unknown) tech genius: Claude Shannon. What did he do?

42mins

27 Aug 2021

2x09 - Claude Shannon, Cryptography, Quantum Computers, and a Flamethrower Trumpet (Guest: Alexandros Dimakis)

Not a Top 10

Contact: hello@notatop10.fm @notatop10 @timaras @giorgos.dimop. Our guest is Alexandros Dimakis, Professor at the University of Texas at Austin. Twitter: @AlexGDimakis. Topics: Claude Shannon, information theory, bits & bytes, cryptography, quantum computers, quantum supremacy, P vs NP... and much more!

51mins

10 Jun 2021

A Mind at Play: How Claude Shannon Invented the Information Age, by Jimmy Soni

Bookey App 30 mins Book Summaries Knowledge Notes and More

Claude Elwood Shannon is widely regarded as the father of information theory. His research gave birth to an entirely new theory of information and communication, laying the foundations for today's information age. Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He was deft at simplifying a problem and abstracting its essence, and a master at building links and drawing analogies between different things to find new ways to solve complex problems. Shannon's achievements were largely attributable to his curious and playful mind: he viewed serious research questions as puzzles, and he enjoyed solving them. This book, "A Mind at Play: How Claude Shannon Invented the Information Age", decodes Shannon's life and experiences, and offers insight into the personal traits that enabled his achievements.

15mins

22 Apr 2021

Claude Shannon - Father of Information Theory

Teorema de Shannon

A brief summary of Shannon's Theorem.

3mins

5 Mar 2021

#50 – The Incredible Mind of Claude Shannon with Jimmy Soni & Mark Levinson (Personal Development)

The Good Life Podcast with Sean Murray

Our topic this week is Claude Shannon, a mathematician and engineer known as the father of information theory for his landmark paper "A Mathematical Theory of Communication", published in 1948. Shannon's seminal work and discoveries ushered in the digital age, and for that alone his life is worthy of study, but Shannon also had another remarkable quality: a very playful and creative mind. Shannon was always curious, and he devoted his considerable intellect to a diverse range of activities and interests that included juggling, unicycles, artificial intelligence, chess-playing machines, and wearable computers; he even built a chairlift on his property. He was both a mathematical and a creative genius.

My guests today are Jimmy Soni and Mark Levinson. Jimmy co-authored a biography of Shannon titled A Mind at Play, and Mark directed a documentary about Shannon called The Bit Player, which is available on Amazon Prime. In both of these works, Jimmy and Mark seek to explore the incredible mind of Claude Shannon. In this episode we seek to distill the secrets of Shannon's creativity, and we talk about how we can apply these lessons to our own lives.

What You'll Learn:
- The importance of information theory and Shannon's significant contribution to the digital computer and the information age
- How Shannon relied on intuition as a guide when solving mathematical problems
- How Shannon would take a complex problem, simplify it down to its essence, and then build it back up to uncover deep hidden truths that eluded his colleagues
- How Shannon's disregard for prestige and awards freed up his time and mind for creative work
- How pursuing "trivial" and "whimsical" projects would often lead Shannon to profound insights
- How Shannon used creative dissatisfaction to drive innovation
- How we can apply Shannon's creative techniques to our own lives

Resources:
- A Mind at Play by Jimmy Soni & Rob Goodman
- The Bit Player, directed by Mark Levinson
- 11 Life Lessons from History's Most Underrated Genius by Soni & Goodman
- "A Mathematical Theory of Communication" by Claude Shannon

Connect with Jimmy Soni & Mark Levinson:
- LinkedIn for Jimmy: https://www.linkedin.com/in/jimmysoni/
- LinkedIn for Mark: https://www.linkedin.com/in/mark-levinson-b400475/

Connect with Sean Murray:
- Email: seanm@realtimeperformance.com
- Twitter: @seanpmurray111
- LinkedIn: https://www.linkedin.com/in/seanpmurray/
- Website: www.seanpmurray.net
- Newsletter: https://www.realtimeperformance.com/newsletter/

57mins

8 Feb 2021

Claude Shannon and the Origins of Information Theory

The History of Computing

The name Claude Shannon has come up 8 times so far in this podcast, more than any other single person. We covered George Boole and the concept that a Boolean is a 0 or a 1, and that using Boolean algebra you can abstract simple circuits into practically any higher-level concept. Boolean algebra had been used by a number of mathematicians to perform complex tasks, including by Lewis Carroll in Through the Looking Glass to turn words into math. And binary had effectively been used in Morse code to enable communications over the telegraph. But it was Claude Shannon who laid the foundation for a theory that combined communicating over the telegraph with Boolean algebra to make a higher level of communication possible. And it all starts with bits, which we can thank Shannon for.

Shannon grew up in Gaylord, Michigan. His mother was a high school principal and his grandfather had been an inventor. As a child he built a telegraph using a barbed wire fence. But barbed wire isn't the greatest conductor of electricity, and so: noise. And thus information theory began to ruminate in his mind. He went off to the University of Michigan and got a bachelor's degree in electrical engineering and another in math, a perfect combination for laying the foundation of the future.

He got a job as a research assistant to Vannevar Bush, who wrote the seminal essay As We May Think. At that time, Bush was working at MIT on The Thinking Machine, or differential analyzer. This was before World War II, and they had no idea their work was about to reshape everything. At the time, what we think of as computers were electro-mechanical: they had gears, used for the more complicated tasks, and switches, used for simpler tasks. Shannon devoted his master's thesis to applying Boolean algebra to those switching circuits, getting rid of the slow-moving wheels and allowing computers to go much faster. He recast the ideas in Boole's Laws of Thought so they could be applied to series and parallel relay circuitry. That thesis, A Symbolic Analysis of Relay and Switching Circuits, appeared in 1937 and helped set the stage for the hacker revolution that came later at MIT.

At the urging of Vannevar Bush, he earned his PhD in mathematics at MIT, pushing genetics forward with a thesis theorizing that you could break the genetic code down into a matrix. The double-helix structure of DNA would be worked out by Watson and Crick in 1953, building on Rosalind Franklin's X-ray crystallography images of the molecule, and George Gamow would soon propose how that structure might encode information.

He headed off to Princeton in 1940 to work at the Institute for Advanced Study, where Einstein and von Neumann were. He quickly moved over to the National Defense Research Committee as the world moved toward World War II. A lot of computing was going into making projectiles, or bombs, more accurate, and he co-wrote a paper called Data Smoothing and Prediction in Fire-Control Systems during the war.

He'd gotten a primer in early cryptography as a kid, reading The Gold-Bug by Edgar Allan Poe, and it struck his fancy. So he started working on theories around cryptography, everything he'd learned forming into a single theory. He would have lunch with Alan Turing during the war, and it was around this work that he first coined the term "information theory", in 1945.
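The switching-circuit insight from that master's thesis is small enough to sketch in code. What follows is a minimal illustration of my own, not anything from the episode, assuming only standard Python; the function names `series` and `parallel` are invented for the sketch. The point is that a series circuit of switches behaves like Boolean AND and a parallel circuit like OR, so simplifying an algebraic expression simplifies the hardware.

```python
# Illustrative sketch (not from the episode): Shannon's thesis observed
# that relay circuits obey Boolean algebra. A series circuit of two
# switches conducts only when both are closed (AND); a parallel circuit
# conducts when either is closed (OR).

def series(a: bool, b: bool) -> bool:
    """Two switches in series: current flows only if both are closed."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel: current flows if either is closed."""
    return a or b

# The identity a*b + a*c = a*(b + c) means the right-hand circuit needs
# one fewer relay than the left-hand one, with identical behavior.
# Exhaustively checking every switch setting confirms it.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            assert parallel(series(a, b), series(a, c)) == series(a, parallel(b, c))
print("a·b + a·c = a·(b + c): same circuit behavior, one fewer relay")
```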
A universal theory of communication gnawed at him and formed during this time, from the Institute, to the National Defense Research Committee, to Bell Labs, where he helped encrypt communications between world leaders. He kept the work to himself, even through failed relationships. He broke information down into the smallest possible unit: the bit, short for binary digit. He worked out how to compress information that was most repetitive, similar to how Morse code compressed the number of taps on the telegraph wire by giving the most common letters the shortest codes. Eliminating redundant communications established what we now call compression; today we use the term lossless compression frequently in computing. He worked out that the minimum amount of information needed to send a message is its entropy, H = -Σᵢ pᵢ log₂(pᵢ), summed over the symbol probabilities pᵢ.

His paper, put out while he was at Bell, was called "A Mathematical Theory of Communication" and came out in 1948. You could now encode any data as zeros and ones and then compress it. Further, he found a way to calculate the maximum amount of information that can be sent over a communication channel before it becomes garbled due to noise; we now call this the Shannon limit. And once he had that, he derived how to analyze information mathematically to correct for noise. That barbed wire fence could finally be useful. This is used in all modern information connectivity. For example, when I took my Network+ we spent an inordinate amount of time learning about carrier-sense multiple access with collision detection (CSMA/CD), a media access control (MAC) method that uses carrier sensing to defer transmissions until no other stations are transmitting.

And as his employer, Bell Labs helped shape the future of computing. Alongside Unix, C, C++, the transistor, and the laser, information theory is a less tangible discovery, yet given what we all carry in our pockets and on our wrists these days, perhaps the most consequential. Having mapped the limit, Bell started trying to reach it, and the digital communication age was born when the first modem came out of his former employer, Bell Labs, in 1958. And just across the way in Boston, ARPA would begin working on the first Interface Message Processor in 1967: the humble beginnings of the Internet.

His work done, he went back to MIT. His theories were applied to all sorts of disciplines, though he himself comes up less and less. Over time we started placing bits on devices, retrieving those bits, and compressing data: digital images, audio, and more. It would take 35 or so years for all of that to become commonplace. He consulted with the NSA on cryptography, and in 1949 he published Communication Theory of Secrecy Systems, which pushed cryptography to the next level. His 1951 paper Prediction and Entropy of Printed English practically created the field of natural language processing, which evolved into various branches of machine learning. And he helped give us the Nyquist-Shannon sampling theorem, used in reasoning about aliasing, maximum throughput, RGB sampling, and of course signal-to-noise.
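Neither the entropy formula nor the Shannon limit appears as code in the episode, but both fit in a few lines. Here is a minimal sketch of my own, assuming only Python's standard library; the helper names `entropy` and `capacity` and the example numbers are illustrative, with the capacity formula being the Shannon-Hartley form of the limit.

```python
# Illustrative sketch (not from the episode) of two quantities mentioned
# above: Shannon entropy, the minimum average bits per symbol, and the
# Shannon-Hartley channel capacity, the "Shannon limit" of a noisy channel.
import math

def entropy(probabilities):
    """H = -sum(p * log2(p)): minimum average bits needed per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def capacity(bandwidth_hz, signal_to_noise):
    """C = B * log2(1 + S/N): maximum reliable bits per second."""
    return bandwidth_hz * math.log2(1 + signal_to_noise)

# A fair coin carries a full bit per toss; a biased coin carries less,
# which is exactly the redundancy that lossless compression squeezes out.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.47

# A ~3 kHz phone line at ~30 dB SNR (S/N of about 1000) tops out near
# 30 kbit/s, roughly why analog modems stalled in the tens of kilobits.
print(capacity(3000, 1000))  # ~29,900 bits per second
```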
He loved games. From his work on chess he gave us the Shannon number, an estimate of the game-tree complexity of chess: around 10 to the 120th power. In case you're curious, Deep Blue did not win by brute-forcing that tree (no machine could); it won by searching on the order of hundreds of millions of positions per second and pruning intelligently. In 1949 he presented Programming a Computer for Playing Chess, the first time we seriously thought about computers playing chess, and he had a standing bet that a computer would beat a human grand master at chess by 2001. Garry Kasparov lost to Deep Blue in 1997.

That curiosity extended far beyond chess. In 1950 he built Theseus, a mechanical mouse that learned how to escape a maze, using relays from phone switches: one of the earliest forms of machine learning. In 1961 he co-invented the first wearable computer, built to help win at roulette. That same year he designed the Minivac 601, a kit to teach how computers worked.

So we'll leave you with one last bit of information. Shannon's maxim is that "the enemy knows the system." I used to think it was just a shortened version of Kerckhoffs's principle: that a cryptographic system, such as a modern public-key cipher, should remain secure even when everything about it except the key is known. Thing is, the more I learn about Shannon, the more I suspect he was really giving the principle a broader meaning. So think about that as you try to decipher what is and what is not disinformation in such a noisy world.

Lots and lots of people would carry on the great work in information theory, with ideas like Kullback-Leibler divergence, or relative entropy, and we owe them all our thanks. But here's the thing about Shannon: math. He took things that could easily have remained mere theories, and he proved them. Because science can refute disinformation. If you let it.

11mins

9 Sep 2020

Robert Hacker - Claude Shannon (part 5/6 the 5 geniuses in microeconomics)

Socratic

Part 5 of 6 in the "5 geniuses in micro-economics" series. Mr. Hacker explores Claude Shannon and his contributions to information theory.

41mins

29 Jul 2020

Claude Shannon

STEM Time with Navya and Aishwarya

In this podcast episode, we will discuss the early life, career, and impact of Claude Shannon! Hope you enjoy!

7mins

26 Jun 2020

#95 A Mind at Play: How Claude Shannon Invented the Information Age

Founders

What I learned from reading A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman.

1hr 9mins

27 Oct 2019
