6 changes: 3 additions & 3 deletions scripts/optimize-images.js
@@ -7,9 +7,9 @@
  * with 85% JPEG quality to match consistent format.
  */

-const sharp = require('sharp');
-const fs = require('fs');
-const path = require('path');
+import sharp from 'sharp';
+import fs from 'fs';
+import path from 'path';

 const IMAGES_DIR = './src/assets/images/entities';
 const TARGET_WIDTH = 440;
Binary file added src/assets/images/entities/claude-shannon.jpg
70 changes: 70 additions & 0 deletions src/content/institutions/bell-labs.mdx
@@ -0,0 +1,70 @@
---
id: bell-labs
type: institution
name: Bell Labs
kind: laboratory
era: 1920s–present
location: Murray Hill, New Jersey, USA
domains:
- Computing
- Mathematics
- Telecommunications
- Electrical Engineering
edges: []
links:
- label: Wikipedia
url: https://en.wikipedia.org/wiki/Bell_Labs
- label: Official Site
url: https://www.bell-labs.com/
- label: Nokia Bell Labs
url: https://www.nokia.com/bell-labs/
---

Bell Telephone Laboratories (Bell Labs), founded in 1925, is one of the most prolific research laboratories in history. Originally the research arm of AT&T, it produced fundamental innovations in computing, communications, and physics that shaped the modern world.

## Nobel Prize Achievements

Bell Labs researchers have won nine Nobel Prizes in Physics, including:

- **1937**: Clinton Davisson for electron diffraction
- **1956**: William Shockley, John Bardeen, and Walter Brattain for the transistor
- **1978**: Arno Penzias and Robert Wilson for discovering cosmic microwave background radiation
- **1997**: Steven Chu for laser cooling of atoms
- **2009**: Willard Boyle and George Smith for the CCD sensor

## Computing Contributions

Bell Labs made foundational contributions to computing:

**Information Theory**: Claude Shannon developed information theory at Bell Labs, publishing "A Mathematical Theory of Communication" in 1948<sup><a href="#source-1">[1]</a></sup>. This work founded the mathematical study of communication and data transmission.

**The Transistor** (1947): Bardeen, Brattain, and Shockley invented the transistor, which replaced vacuum tubes and made modern electronics possible.

**Unix** (1969): Ken Thompson and Dennis Ritchie created Unix, which influenced virtually all modern operating systems.

**C Programming Language** (1972): Dennis Ritchie developed C, one of the most influential programming languages ever created.

**C++** (1979): Bjarne Stroustrup began developing C++, extending C with object-oriented features.

## Other Innovations

Bell Labs also produced:

- The laser (1958)
- Communication satellites
- Digital signal processing
- Error-correcting codes
- The photovoltaic cell

## Legacy

At its peak, Bell Labs employed 25,000 people. Though it has changed ownership multiple times (now part of Nokia), Bell Labs remains one of the most important research institutions in technology history.

---

## Sources

1. <span id="source-1"></span>IEEE. ["Claude Shannon."](https://www.itsoc.org/about/shannon)
Shannon's work at Bell Labs.
2. <span id="source-2"></span>Wikipedia. ["Bell Labs."](https://en.wikipedia.org/wiki/Bell_Labs)
History and achievements.
62 changes: 62 additions & 0 deletions src/content/institutions/massachusetts-institute-of-technology.mdx
@@ -0,0 +1,62 @@
---
id: massachusetts-institute-of-technology
type: institution
name: Massachusetts Institute of Technology
kind: university
era: 1860s–present
location: Cambridge, Massachusetts, USA
domains:
- Computing
- Mathematics
- Electrical Engineering
- Science
edges: []
links:
- label: Wikipedia
url: https://en.wikipedia.org/wiki/Massachusetts_Institute_of_Technology
- label: Official Site
url: https://www.mit.edu/
- label: CSAIL
url: https://www.csail.mit.edu/
---

The Massachusetts Institute of Technology (MIT), founded in 1861, is one of the world's most prestigious research universities. It has been central to the development of computing, producing foundational theories, influential researchers, and groundbreaking technologies.

## Computing Pioneers

MIT has produced and hosted numerous computing pioneers:

**Claude Shannon**: Wrote his revolutionary master's thesis at MIT (1937), showing that Boolean algebra could design digital circuits—the theoretical foundation of all modern computers<sup><a href="#source-1">[1]</a></sup>.

**Vannevar Bush**: Developed the differential analyzer (1931), an important analog computer that Shannon worked on. Later proposed the "Memex," a conceptual precursor to hypertext.

**Project MAC**: MIT's pioneering time-sharing research project (1963) developed Multics, which influenced Unix and modern operating systems.

## AI Laboratory

MIT's Artificial Intelligence Laboratory, founded by John McCarthy and Marvin Minsky in 1959, was one of the birthplaces of artificial intelligence research. The lab pioneered:

- Lisp programming language
- Natural language processing
- Computer vision
- Robotics

The AI Lab merged with the Laboratory for Computer Science in 2003 to form CSAIL (Computer Science and Artificial Intelligence Laboratory).

## Notable Contributions

- **TX-0 and TX-2**: Early transistorized computers at Lincoln Laboratory
- **Spacewar!**: One of the first video games, created at MIT in 1962
- **Project Athena**: Pioneering networked computing environment (1983)
- **World Wide Web Consortium (W3C)**: Co-hosted at MIT since 1994
- **One Laptop per Child**: MIT Media Lab initiative to provide affordable computers globally

---

## Sources

1. <span id="source-1"></span>MIT Museum. ["Claude Shannon."](https://mitmuseum.mit.edu/) Shannon's
thesis work at MIT.
2. <span id="source-2"></span>Wikipedia. ["MIT Computer Science and Artificial Intelligence
Laboratory."](https://en.wikipedia.org/wiki/MIT_Computer_Science_and_Artificial_Intelligence_Laboratory)
History of computing at MIT.
112 changes: 112 additions & 0 deletions src/content/people/claude-shannon.mdx
@@ -0,0 +1,112 @@
---
id: claude-shannon
type: person
name: Claude Shannon
title: Father of Information Theory
era: 1930s–1990s
domains:
- Computing
- Mathematics
- Information Theory
- Cryptography
- Electrical Engineering
edges:
- target: a-mathematical-theory-of-communication
kind: influence
label: created
year: 1948
- target: symbolic-analysis-of-relay-and-switching-circuits
kind: influence
label: created
year: 1937
- target: massachusetts-institute-of-technology
kind: affiliation
label: studied at
year: 1936
- target: bell-labs
kind: affiliation
label: worked at
year: 1941
signatureWorks:
- a-mathematical-theory-of-communication
- symbolic-analysis-of-relay-and-switching-circuits
whyYouCare:
- Founded information theory, the mathematical study of communication that underlies all digital technology
- Proved that Boolean algebra could design digital circuits—the theoretical foundation of every computer
- Introduced the "bit" as the fundamental unit of information, giving us the language of the digital age
- Established fundamental limits on data compression and error-free transmission that engineers still work with today
- His work on cryptography during WWII helped secure Allied communications
links:
- label: Wikipedia
url: https://en.wikipedia.org/wiki/Claude_Shannon
- label: IEEE Information Theory Society
url: https://www.itsoc.org/about/shannon
- label: Britannica
url: https://www.britannica.com/biography/Claude-Shannon
- label: Wikimedia Commons
url: https://commons.wikimedia.org/wiki/Category:Claude_Shannon
image:
file: ../../assets/images/entities/claude-shannon.jpg
source: https://commons.wikimedia.org/wiki/File:ClaudeShannon_MFO3807.jpg
license: CC BY-SA 2.0 DE
author: Konrad Jacobs (1963)
---

Claude Elwood Shannon (1916–2001) was an American mathematician, electrical engineer, and cryptographer known as the "father of information theory." His work laid the theoretical foundations for the digital age, establishing the mathematical framework for communication, computation, and data storage that underlies virtually all modern technology.

## Early Life and Education

Shannon was born on April 30, 1916, in Petoskey, Michigan. His father was a businessman and judge; his mother was a language teacher who became principal of Gaylord High School. Growing up in Gaylord, Shannon showed an early aptitude for mechanical and electrical tinkering, building model airplanes, a radio-controlled boat, and even a barbed-wire telegraph system to a friend's house half a mile away.

In 1932, Shannon entered the University of Michigan, where he earned dual degrees in electrical engineering and mathematics in 1936—a combination that would prove prophetic for his later work bridging both fields.

## The Most Important Master's Thesis

Shannon moved to MIT in 1936 to work as a research assistant to Vannevar Bush on the differential analyzer, the most advanced calculating machine of the time. While servicing the machine's relay circuits, Shannon recognized that the two-valued logic of switches corresponded to George Boole's symbolic logic.

His 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," demonstrated that Boolean algebra could be used to design and simplify electrical circuits<sup><a href="#source-1">[1]</a></sup>. This insight—that logical operations could be implemented in hardware—became the theoretical foundation of digital computing. Computer scientist Herman Goldstine called it "surely one of the most important master's theses ever written."
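The thesis's core move can be sketched in a few lines (an illustrative example of the technique, not one of Shannon's own circuits; the relay names are invented here): a series-parallel relay network is a Boolean expression, and algebraic simplification removes hardware.

```javascript
// A series-parallel relay network as a Boolean expression:
// (A AND B) OR (A AND NOT B), two parallel branches using four contacts.
const original = (A, B) => (A && B) || (A && !B);

// Boolean algebra factors out A: A·B + A·B' = A·(B + B') = A.
// The simplified circuit needs a single contact.
const simplified = (A, _B) => A;

// Exhaustively verify that the two circuits switch identically.
for (const A of [false, true]) {
  for (const B of [false, true]) {
    console.assert(original(A, B) === simplified(A, B));
  }
}
```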

## Bell Labs and Information Theory

After completing his PhD in mathematics at MIT in 1940, Shannon joined Bell Labs in 1941. During World War II, he worked on cryptography and fire-control systems, including the secure communication system used by Roosevelt and Churchill.

In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal<sup><a href="#source-2">[2]</a></sup>. This paper founded information theory by:

- **Defining information mathematically** through the concept of entropy
- **Introducing the bit** as the fundamental unit of information
- **Proving the source coding theorem**, establishing limits on data compression
- **Proving the noisy channel coding theorem**, showing that reliable communication over noisy channels is possible up to a fundamental limit
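To give the "bit" a concrete sense (a standard illustration, not a passage from the paper; the function name is ours): identifying one of N equally likely messages takes log₂ N bits.

```javascript
// Bits needed to single out one of N equally likely messages: log2(N).
const bitsFor = (n) => Math.log2(n);

console.log(bitsFor(2));   // a fair coin flip: 1 bit
console.log(bitsFor(256)); // one of 256 alternatives: 8 bits
```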

Historian James Gleick rated the paper as the most important development of 1948—more significant even than the transistor. Scientific American called it the "Magna Carta of the Information Age."

## The Playful Genius

Shannon was known for approaching research with curiosity, humor, and play. At Bell Labs, he famously rode a unicycle through the hallways while juggling. His playful inventions included:

- **Theseus** (1950): An electronic mouse that could navigate a maze and learn from experience, an early demonstration of machine learning
- **Chess-playing machines**: Pioneering work that helped establish artificial intelligence
- **The Ultimate Machine**: A box whose sole function was to turn itself off when switched on

Shannon also wrote papers on juggling, built juggling machines, and constructed THROBAC, a calculator that operated in Roman numerals.

## Later Career and Legacy

Shannon joined MIT's faculty in 1956 while maintaining ties to Bell Labs. He received numerous honors, including the National Medal of Science (1966), the Kyoto Prize, and the IEEE Medal of Honor.

Shannon developed Alzheimer's disease in his later years and died on February 24, 2001, at age 84. His achievements are often compared to those of Einstein, Newton, and Darwin.

Roboticist Rodney Brooks declared Shannon "the 20th century engineer who contributed the most to 21st century technologies." The AI language model Claude is widely understood to be named in his honor<sup><a href="#source-3">[3]</a></sup>.

---

## Sources

1. <span id="source-1"></span>Wikipedia. ["A Symbolic Analysis of Relay and Switching
Circuits."](https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_and_Switching_Circuits)
Details Shannon's master's thesis.
2. <span id="source-2"></span>Wikipedia. ["A Mathematical Theory of
Communication."](https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication) Overview
of Shannon's information theory paper.
3. <span id="source-3"></span>Quanta Magazine. ["How Claude Shannon Invented the
Future."](https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/)
Assessment of Shannon's lasting impact.
81 changes: 81 additions & 0 deletions src/content/works/a-mathematical-theory-of-communication.mdx
@@ -0,0 +1,81 @@
---
id: a-mathematical-theory-of-communication
type: work
name: A Mathematical Theory of Communication
kind: paper
era: 1940s
year: 1948
domains:
- Computing
- Mathematics
- Information Theory
- Telecommunications
edges: []
links:
- label: Wikipedia
url: https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication
- label: Original Paper (PDF)
url: https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
- label: Bell System Technical Journal
url: https://ieeexplore.ieee.org/document/6773024
---

"A Mathematical Theory of Communication" is a landmark paper by Claude Shannon published in the Bell System Technical Journal in July and October 1948. Often called the "Magna Carta of the Information Age," this paper founded the field of information theory and unified the understanding of all forms of communication.

## Background

Shannon wrote the paper while working at Bell Labs, where he had been since 1941. The telecommunications industry was grappling with fundamental questions: How much information could be transmitted through a channel? How could signals be protected from noise? Shannon's paper provided rigorous mathematical answers.

Remarkably, Shannon initially had no plans to publish the paper and did so only at the urging of colleagues at Bell Laboratories<sup><a href="#source-1">[1]</a></sup>.

## Key Contributions

### Information Entropy

Shannon introduced the concept of information entropy—a measure of the uncertainty or information content in a message. He showed that information could be quantified using the formula:

H = -Σ p(x) log₂ p(x)

This measure, now called Shannon entropy, became fundamental to both information theory and thermodynamics.
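A minimal sketch of the formula in JavaScript, the language of this repository's scripts (the function name is ours):

```javascript
// Shannon entropy in bits: H = -sum over x of p(x) * log2(p(x)).
// `dist` is an array of probabilities summing to 1; terms with p = 0
// contribute nothing (p * log2(p) tends to 0 as p tends to 0).
function entropy(dist) {
  return -dist
    .filter((p) => p > 0)
    .reduce((h, p) => h + p * Math.log2(p), 0);
}

console.log(entropy([0.5, 0.5])); // a fair coin: exactly 1 bit
console.log(entropy([0.9, 0.1])); // a biased coin: about 0.47 bits
```

The fair coin maximizes entropy for two outcomes; any bias makes the message more predictable and so less informative.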

### The Bit

The paper introduced and formalized the term "bit" (binary digit) as the fundamental unit of information<sup><a href="#source-2">[2]</a></sup>. Shannon credited John Tukey with coining the term, but it was Shannon who gave it precise mathematical meaning.

### Channel Capacity

Shannon proved that every communication channel has a maximum rate at which information can be transmitted reliably—the channel capacity. This theorem established fundamental limits that engineers had never known existed.
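As a concrete instance (a textbook consequence of the theory, assumed here for illustration; the function names are ours): a binary symmetric channel that flips each transmitted bit with probability p has capacity C = 1 - H₂(p) bits per use, where H₂ is the binary entropy function.

```javascript
// Binary entropy H2(p), and the capacity of a binary symmetric channel
// with crossover probability p: C = 1 - H2(p).
const h2 = (p) =>
  p === 0 || p === 1 ? 0 : -(p * Math.log2(p) + (1 - p) * Math.log2(1 - p));
const bscCapacity = (p) => 1 - h2(p);

console.log(bscCapacity(0));    // noiseless channel: 1 bit per use
console.log(bscCapacity(0.5));  // coin-flip noise: capacity 0
console.log(bscCapacity(0.11)); // roughly half a bit per use survives
```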

### Source Coding Theorem

Shannon demonstrated that data could be compressed to eliminate redundancy, up to a limit determined by the entropy of the source. This principle underlies all modern data compression.

### Noisy Channel Coding Theorem

Perhaps the most surprising result: Shannon proved that reliable communication is possible over noisy channels, as long as the transmission rate stays below the channel capacity. This theorem suggested that error-correcting codes could achieve arbitrarily low error rates—a result that seemed almost magical to engineers of the time.
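The crudest error-correcting scheme already hints at why (a standard textbook illustration, not from the paper as excerpted here; the function name is ours): repeat each bit three times and decode by majority vote, trading rate for reliability. Shannon's theorem says far better trade-offs exist.

```javascript
// Send each bit three times over a channel that flips bits with
// probability p; decode by majority vote. The decoded bit is wrong only
// if 2 or 3 copies flip: P(error) = 3p^2(1-p) + p^3, below p for p < 1/2.
const tripleErrorRate = (p) => 3 * p * p * (1 - p) + p ** 3;

console.log(tripleErrorRate(0.1)); // about 0.028, down from the raw 0.1
```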

## Impact

Historian James Gleick rated the paper as the most important development of 1948, ranking even the transistor second in the same period, and judged Shannon's work "even more profound and more fundamental" than the transistor<sup><a href="#source-3">[3]</a></sup>.

With tens of thousands of citations, it is among the most influential and most cited scientific papers of all time. It gave rise to:

- Modern telecommunications and data compression
- Error-correcting codes used in everything from CDs to space probes
- Cryptography and secure communications
- The theoretical foundations of the digital age

Scientific American called it the "Magna Carta of the Information Age."

---

## Sources

1. <span id="source-1"></span>Wikipedia. ["A Mathematical Theory of
Communication."](https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication) Notes that
Shannon published at colleagues' urging.
2. <span id="source-2"></span>IEEE Information Theory Society. ["Claude E.
Shannon."](https://www.itsoc.org/about/shannon) Documents Shannon's introduction of the bit.
3. <span id="source-3"></span>Quanta Magazine. ["How Claude Shannon Invented the
Future."](https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/)
James Gleick's assessment of the paper's importance.