diff --git a/scripts/optimize-images.js b/scripts/optimize-images.js index 1588587..63a6869 100644 --- a/scripts/optimize-images.js +++ b/scripts/optimize-images.js @@ -7,9 +7,9 @@ * with 85% JPEG quality to match consistent format. */ -const sharp = require('sharp'); -const fs = require('fs'); -const path = require('path'); +import sharp from 'sharp'; +import fs from 'fs'; +import path from 'path'; const IMAGES_DIR = './src/assets/images/entities'; const TARGET_WIDTH = 440; diff --git a/src/assets/images/entities/claude-shannon.jpg b/src/assets/images/entities/claude-shannon.jpg new file mode 100644 index 0000000..86b1cd2 Binary files /dev/null and b/src/assets/images/entities/claude-shannon.jpg differ diff --git a/src/content/institutions/bell-labs.mdx b/src/content/institutions/bell-labs.mdx new file mode 100644 index 0000000..27a7238 --- /dev/null +++ b/src/content/institutions/bell-labs.mdx @@ -0,0 +1,70 @@ +--- +id: bell-labs +type: institution +name: Bell Labs +kind: laboratory +era: 1920s–present +location: Murray Hill, New Jersey, USA +domains: + - Computing + - Mathematics + - Telecommunications + - Electrical Engineering +edges: [] +links: + - label: Wikipedia + url: https://en.wikipedia.org/wiki/Bell_Labs + - label: Official Site + url: https://www.bell-labs.com/ + - label: Nokia Bell Labs + url: https://www.nokia.com/bell-labs/ +--- + +Bell Telephone Laboratories (Bell Labs), founded in 1925, is one of the most prolific research laboratories in history. Originally the research arm of AT&T, it produced fundamental innovations in computing, communications, and physics that shaped the modern world. 
+ +## Nobel Prize Achievements + +Bell Labs researchers have won nine Nobel Prizes in Physics, including: + +- **1937**: Clinton Davisson for electron diffraction +- **1956**: William Shockley, John Bardeen, and Walter Brattain for the transistor +- **1978**: Arno Penzias and Robert Wilson for discovering cosmic microwave background radiation +- **1997**: Steven Chu for laser cooling of atoms +- **2009**: Willard Boyle and George Smith for the CCD sensor + +## Computing Contributions + +Bell Labs made foundational contributions to computing: + +**Information Theory**: Claude Shannon developed information theory at Bell Labs, publishing "A Mathematical Theory of Communication" in 1948[1]. This work founded the mathematical study of communication and data transmission. + +**The Transistor** (1947): Bardeen, Brattain, and Shockley invented the transistor, which replaced vacuum tubes and made modern electronics possible. + +**Unix** (1969): Ken Thompson and Dennis Ritchie created Unix, which influenced virtually all modern operating systems. + +**C Programming Language** (1972): Dennis Ritchie developed C, one of the most influential programming languages ever created. + +**C++** (1979): Bjarne Stroustrup began developing C++, extending C with object-oriented features. + +## Other Innovations + +Bell Labs also produced: + +- The laser (1958) +- Communication satellites +- Digital signal processing +- Error-correcting codes +- The photovoltaic cell + +## Legacy + +At its peak, Bell Labs employed 25,000 people. Though it has changed ownership multiple times (now part of Nokia), Bell Labs remains one of the most important research institutions in technology history. + +--- + +## Sources + +1. IEEE. ["Claude Shannon."](https://www.itsoc.org/about/shannon) + Shannon's work at Bell Labs. +2. Wikipedia. ["Bell Labs."](https://en.wikipedia.org/wiki/Bell_Labs) + History and achievements. 
diff --git a/src/content/institutions/massachusetts-institute-of-technology.mdx b/src/content/institutions/massachusetts-institute-of-technology.mdx new file mode 100644 index 0000000..40d621d --- /dev/null +++ b/src/content/institutions/massachusetts-institute-of-technology.mdx @@ -0,0 +1,62 @@ +--- +id: massachusetts-institute-of-technology +type: institution +name: Massachusetts Institute of Technology +kind: university +era: 1860s–present +location: Cambridge, Massachusetts, USA +domains: + - Computing + - Mathematics + - Electrical Engineering + - Science +edges: [] +links: + - label: Wikipedia + url: https://en.wikipedia.org/wiki/Massachusetts_Institute_of_Technology + - label: Official Site + url: https://www.mit.edu/ + - label: CSAIL + url: https://www.csail.mit.edu/ +--- + +The Massachusetts Institute of Technology (MIT), founded in 1861, is one of the world's most prestigious research universities. It has been central to the development of computing, producing foundational theories, influential researchers, and groundbreaking technologies. + +## Computing Pioneers + +MIT has produced and hosted numerous computing pioneers: + +**Claude Shannon**: Wrote his revolutionary master's thesis at MIT (1937), showing that Boolean algebra could be used to design digital circuits—the theoretical foundation of all modern computers[1]. + +**Vannevar Bush**: Developed the differential analyzer (1931), an important analog computer that Shannon worked on. Later proposed the "Memex," a conceptual precursor to hypertext. + +**Project MAC**: MIT's pioneering time-sharing research project, launched in 1963, developed Multics, which influenced Unix and modern operating systems. + +## AI Laboratory + +MIT's Artificial Intelligence Laboratory, founded by John McCarthy and Marvin Minsky in 1959, was one of the birthplaces of artificial intelligence research. 
The lab pioneered: + +- Lisp programming language +- Natural language processing +- Computer vision +- Robotics + +The AI Lab merged with the Laboratory for Computer Science in 2003 to form CSAIL (Computer Science and Artificial Intelligence Laboratory). + +## Notable Contributions + +- **TX-0 and TX-2**: Early transistorized computers at Lincoln Laboratory +- **Spacewar!**: One of the first video games, created at MIT in 1962 +- **Project Athena**: Pioneering networked computing environment (1983) +- **World Wide Web Consortium (W3C)**: Co-hosted at MIT since 1994 +- **One Laptop per Child**: MIT Media Lab initiative to provide affordable computers globally + +--- + +## Sources + +1. MIT Museum. ["Claude Shannon."](https://mitmuseum.mit.edu/) Shannon's + thesis work at MIT. +2. Wikipedia. ["MIT Computer Science and Artificial Intelligence + Laboratory."](https://en.wikipedia.org/wiki/MIT_Computer_Science_and_Artificial_Intelligence_Laboratory) + History of computing at MIT. diff --git a/src/content/people/claude-shannon.mdx b/src/content/people/claude-shannon.mdx new file mode 100644 index 0000000..cf308b4 --- /dev/null +++ b/src/content/people/claude-shannon.mdx @@ -0,0 +1,112 @@ +--- +id: claude-shannon +type: person +name: Claude Shannon +title: Father of Information Theory +era: 1930s–1990s +domains: + - Computing + - Mathematics + - Information Theory + - Cryptography + - Electrical Engineering +edges: + - target: a-mathematical-theory-of-communication + kind: influence + label: created + year: 1948 + - target: symbolic-analysis-of-relay-and-switching-circuits + kind: influence + label: created + year: 1937 + - target: massachusetts-institute-of-technology + kind: affiliation + label: studied at + year: 1936 + - target: bell-labs + kind: affiliation + label: worked at + year: 1941 +signatureWorks: + - a-mathematical-theory-of-communication + - symbolic-analysis-of-relay-and-switching-circuits +whyYouCare: + - Founded information theory, the mathematical study 
of communication that underlies all digital technology + - Proved that Boolean algebra could be used to design digital circuits—the theoretical foundation of every computer + - Introduced the "bit" as the fundamental unit of information, giving us the language of the digital age + - Established fundamental limits on data compression and error-free transmission that engineers still work with today + - His work on cryptography during WWII helped secure Allied communications +links: + - label: Wikipedia + url: https://en.wikipedia.org/wiki/Claude_Shannon + - label: IEEE Information Theory Society + url: https://www.itsoc.org/about/shannon + - label: Britannica + url: https://www.britannica.com/biography/Claude-Shannon + - label: Wikimedia Commons + url: https://commons.wikimedia.org/wiki/Category:Claude_Shannon +image: + file: ../../assets/images/entities/claude-shannon.jpg + source: https://commons.wikimedia.org/wiki/File:ClaudeShannon_MFO3807.jpg + license: CC BY-SA 2.0 DE + author: Konrad Jacobs (1963) +--- + +Claude Elwood Shannon (1916–2001) was an American mathematician, electrical engineer, and cryptographer known as the "father of information theory." His work laid the theoretical foundations for the digital age, establishing the mathematical framework for communication, computation, and data storage that underlies virtually all modern technology. + +## Early Life and Education + +Shannon was born on April 30, 1916, in Petoskey, Michigan. His father was a businessman and judge; his mother was a language teacher who became principal of Gaylord High School. Growing up in Gaylord, Shannon showed an early aptitude for mechanical and electrical tinkering, building model airplanes, a radio-controlled boat, and even a barbed-wire telegraph system to a friend's house half a mile away. 
+ +In 1932, Shannon entered the University of Michigan, where he earned dual degrees in electrical engineering and mathematics in 1936—a combination that would prove prophetic for his later work bridging both fields. + +## The Most Important Master's Thesis + +Shannon moved to MIT in 1936 to work as a research assistant to Vannevar Bush on the differential analyzer, the most advanced calculating machine of the time. While servicing the machine's relay circuits, Shannon recognized that the two-valued logic of switches corresponded to George Boole's symbolic logic. + +His 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," demonstrated that Boolean algebra could be used to design and simplify electrical circuits[1]. This insight—that logical operations could be implemented in hardware—became the theoretical foundation of digital computing. Computer scientist Herman Goldstine called it "surely one of the most important master's theses ever written." + +## Bell Labs and Information Theory + +After completing his PhD in mathematics at MIT in 1940, Shannon joined Bell Labs in 1941. During World War II, he worked on cryptography and fire-control systems, including the secure communication system used by Roosevelt and Churchill. + +In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal[2]. This paper founded information theory by: + +- **Defining information mathematically** through the concept of entropy +- **Introducing the bit** as the fundamental unit of information +- **Proving the source coding theorem**, establishing limits on data compression +- **Proving the noisy channel coding theorem**, showing that reliable communication over noisy channels is possible up to a fundamental limit + +Historian James Gleick rated the paper as the most important development of 1948—more significant even than the transistor. Scientific American called it the "Magna Carta of the Information Age." 
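The entropy at the center of the 1948 paper, H = -Σ p(x) log₂ p(x), is compact enough to sketch in a few lines of JavaScript (the language of this repo's scripts). This is an illustrative toy, not code from Shannon's paper, and the function name is our own invention:

```javascript
// Shannon entropy: H = -Σ p(x) · log2(p(x)), measured in bits.
// `shannonEntropy` is a made-up helper name for this sketch.
function shannonEntropy(probabilities) {
  return -probabilities
    .filter((p) => p > 0) // by convention, 0 · log2(0) contributes nothing
    .reduce((sum, p) => sum + p * Math.log2(p), 0);
}

// A fair coin is maximally uncertain: exactly 1 bit per flip.
console.log(shannonEntropy([0.5, 0.5])); // 1
// A heavily biased coin carries less information per flip.
console.log(shannonEntropy([0.9, 0.1])); // ≈ 0.47
```

The same quantity sets the limit in the source coding theorem: on average, no lossless code can spend fewer bits per symbol than the source's entropy.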
+ +## The Playful Genius + +Shannon was known for approaching research with curiosity, humor, and play. At Bell Labs, he famously rode a unicycle through the hallways while juggling. His playful inventions included: + +- **Theseus** (1950): An electronic mouse that could navigate a maze and learn from experience, an early demonstration of machine learning +- **Chess-playing machines**: Pioneering work that helped establish artificial intelligence +- **The Ultimate Machine**: A box whose sole function was to turn itself off when switched on + +Shannon also wrote a paper on the mathematics of juggling, designed a juggling robot, and built a calculator that operated in Roman numerals. + +## Later Career and Legacy + +Shannon joined MIT's faculty in 1956 while maintaining ties to Bell Labs. He received numerous honors, including the National Medal of Science (1966), the Kyoto Prize, and the IEEE Medal of Honor. + +Shannon developed Alzheimer's disease in his later years and died on February 24, 2001, at age 84. His achievements are often compared to those of Einstein, Newton, and Darwin. + +Roboticist Rodney Brooks declared Shannon "the 20th century engineer who contributed the most to 21st century technologies." The AI language model Claude was named in his honor[3]. + +--- + +## Sources + +1. Wikipedia. ["A Symbolic Analysis of Relay and Switching + Circuits."](https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_and_Switching_Circuits) + Details Shannon's master's thesis. +2. Wikipedia. ["A Mathematical Theory of + Communication."](https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication) Overview + of Shannon's information theory paper. +3. Quanta Magazine. ["How Claude Shannon Invented the + Future."](https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/) + Assessment of Shannon's lasting impact. 
diff --git a/src/content/works/a-mathematical-theory-of-communication.mdx b/src/content/works/a-mathematical-theory-of-communication.mdx new file mode 100644 index 0000000..c609476 --- /dev/null +++ b/src/content/works/a-mathematical-theory-of-communication.mdx @@ -0,0 +1,81 @@ +--- +id: a-mathematical-theory-of-communication +type: work +name: A Mathematical Theory of Communication +kind: paper +era: 1940s +year: 1948 +domains: + - Computing + - Mathematics + - Information Theory + - Telecommunications +edges: [] +links: + - label: Wikipedia + url: https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication + - label: Original Paper (PDF) + url: https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf + - label: Bell System Technical Journal + url: https://ieeexplore.ieee.org/document/6773024 +--- + +"A Mathematical Theory of Communication" is a landmark paper by Claude Shannon published in the Bell System Technical Journal in July and October 1948. Often called the "Magna Carta of the Information Age," this paper founded the field of information theory and unified the understanding of all forms of communication. + +## Background + +Shannon wrote the paper while working at Bell Labs, where he had been since 1941. The telecommunications industry was grappling with fundamental questions: How much information could be transmitted through a channel? How could signals be protected from noise? Shannon's paper provided rigorous mathematical answers. + +Remarkably, Shannon was initially not planning to publish the paper, and did so only at the urging of colleagues at Bell Laboratories[1]. + +## Key Contributions + +### Information Entropy + +Shannon introduced the concept of information entropy—a measure of the uncertainty or information content in a message. 
He showed that information could be quantified using the formula: + +H = -Σ p(x) log₂ p(x) + +This measure, now called Shannon entropy, became fundamental to information theory and is closely analogous to the entropy of statistical thermodynamics. + +### The Bit + +The paper introduced and formalized the term "bit" (binary digit) as the fundamental unit of information[2]. Shannon credited John Tukey with coining the term, but it was Shannon who gave it precise mathematical meaning. + +### Channel Capacity + +Shannon proved that every communication channel has a maximum rate at which information can be transmitted reliably—the channel capacity. This theorem established fundamental limits that engineers had never known existed. + +### Source Coding Theorem + +Shannon demonstrated that data could be compressed to eliminate redundancy, up to a limit determined by the entropy of the source. This principle underlies all modern data compression. + +### Noisy Channel Coding Theorem + +Perhaps the most surprising result: Shannon proved that reliable communication is possible over noisy channels, as long as the transmission rate stays below the channel capacity. This theorem suggested that error-correcting codes could achieve arbitrarily low error rates—a result that seemed almost magical to engineers of the time. + +## Impact + +Historian James Gleick rated the paper as the most important development of 1948, ranking the transistor second in the same period and calling Shannon's paper "even more profound and more fundamental"[3]. + +With tens of thousands of citations, the paper is among the most influential and most cited scientific papers of all time. It gave rise to: + +- Modern telecommunications and data compression +- Error-correcting codes used in everything from CDs to space probes +- Cryptography and secure communications +- The theoretical foundations of the digital age + +Scientific American called it the "Magna Carta of the Information Age." + +--- + +## Sources + +1. 
Wikipedia. ["A Mathematical Theory of + Communication."](https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication) Notes that + Shannon published at colleagues' urging. +2. IEEE Information Theory Society. ["Claude E. + Shannon."](https://www.itsoc.org/about/shannon) Documents Shannon's introduction of the bit. +3. Quanta Magazine. ["How Claude Shannon Invented the + Future."](https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/) + James Gleick's assessment of the paper's importance. diff --git a/src/content/works/symbolic-analysis-of-relay-and-switching-circuits.mdx b/src/content/works/symbolic-analysis-of-relay-and-switching-circuits.mdx new file mode 100644 index 0000000..c5988d3 --- /dev/null +++ b/src/content/works/symbolic-analysis-of-relay-and-switching-circuits.mdx @@ -0,0 +1,67 @@ +--- +id: symbolic-analysis-of-relay-and-switching-circuits +type: work +name: A Symbolic Analysis of Relay and Switching Circuits +kind: paper +era: 1930s +year: 1937 +domains: + - Computing + - Mathematics + - Electrical Engineering +edges: [] +links: + - label: Wikipedia + url: https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_and_Switching_Circuits + - label: Original Paper (PDF) + url: https://www.cs.virginia.edu/~evans/greatworks/shannon38.pdf + - label: MIT Archive + url: https://dspace.mit.edu/handle/1721.1/11173 +--- + +"A Symbolic Analysis of Relay and Switching Circuits" is Claude Shannon's 1937 master's thesis at MIT, widely considered the most important master's thesis of the 20th century. In it, Shannon demonstrated that Boolean algebra could be used to design and simplify electrical circuits, laying the theoretical foundation for digital computing. + +## Background + +In 1936, Shannon joined MIT as a research assistant working on Vannevar Bush's differential analyzer, the most advanced calculating machine of its time. 
While servicing the analyzer's relay circuits, Shannon recognized a connection between the two-valued logic of switches (on/off) and the symbolic logic of George Boole. + +Shannon had studied Boolean algebra in his mathematics courses at the University of Michigan, where he earned dual degrees in electrical engineering and mathematics in 1936. He developed his ideas during the summer of 1937 while working at Bell Telephone Laboratories[1]. + +## Key Insight + +The fundamental insight was elegant: electrical switches can represent logical values (1 = closed, 0 = open), and circuits of switches can compute logical functions. Boolean operations—AND, OR, NOT—correspond directly to series connections, parallel connections, and normally-closed relays. + +This meant that any logical function could be implemented in hardware, and Boolean algebra could be used to simplify circuit designs. Before Shannon, circuit design was largely an art; after Shannon, it became a science. + +## Significance + +The thesis transformed electrical engineering and computing: + +**Digital Circuit Design**: Shannon showed that the same mathematical techniques used in symbolic logic could optimize real circuits. Complex relay networks could be simplified using algebraic identities. + +**Foundation of Digital Computing**: The representation of logical true/false as electrical on/off became the basis of all digital computers. Every modern processor relies on principles Shannon articulated. + +**Hardware-Software Bridge**: By showing that logical operations could be implemented in circuits, Shannon connected abstract mathematics to physical machines—a connection central to computer science. + +## Reception + +Pioneering computer scientist Herman Goldstine described Shannon's thesis as "surely ... one of the most important master's theses ever written ... It helped to change digital circuit design from an art to a science"[2]. 
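The series/parallel correspondence Goldstine alludes to can be made concrete in a few lines of JavaScript (a toy model under assumed names, not code from the thesis): a switch is a boolean, switches in series compute AND, switches in parallel compute OR, and a normally-closed relay computes NOT.

```javascript
// Toy model of Shannon's correspondence between switching circuits
// and Boolean algebra. All function names here are illustrative.
const series = (...sw) => sw.every(Boolean);   // switches in series: AND
const parallel = (...sw) => sw.some(Boolean);  // switches in parallel: OR
const normallyClosed = (sw) => !sw;            // normally-closed relay: NOT

// XOR as a switch network: (a AND NOT b) in parallel with (NOT a AND b).
const xor = (a, b) =>
  parallel(series(a, normallyClosed(b)), series(normallyClosed(a), b));

console.log(xor(true, false)); // true  (exactly one switch thrown)
console.log(xor(true, true));  // false
```

Simplifying such a network with Boolean identities, rather than by trial and error, is exactly the shift "from an art to a science" that Goldstine describes.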
In 1985, psychologist Howard Gardner called it "possibly the most important, and also the most famous, master's thesis of the century"[3]. + +## Publication + +Shannon presented his work at the American Institute of Electrical Engineers (AIEE) Summer Conference in June 1938. The paper was published in the AIEE Transactions in December 1938 and won the Alfred Noble Prize, an award of the American engineering societies unrelated to the Nobel Prize despite the similar name. + +--- + +## Sources + +1. History of Information. ["Shannon's 'Symbolic Analysis of Relay and + Switching Circuits.'"](https://historyofinformation.com/detail.php?id=622) Details the context + and development of the thesis. +2. Wikipedia. ["A Symbolic Analysis of Relay and Switching + Circuits."](https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_and_Switching_Circuits) + Quotes Herman Goldstine on the thesis's significance. +3. Howard Gardner. "The Mind's New Science" (1985). Gardner's assessment + of the thesis as the most important of the century.