- Claude Shannon (1916–2001): American mathematician and electrical engineer.
- Founded information theory with “A Mathematical Theory of Communication” (1948).
- Defined the bit as the fundamental unit of information.
- Introduced entropy as a measure of information content and uncertainty.
- Established channel capacity and proved the noisy-channel coding theorem, setting the theoretical ceiling for reliable digital communication.
- Connected thermodynamics and information theory, bridging physics and computation.
- His framework underlies every protocol that transmits, compresses, or encrypts data, including cybersecurity.
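The two central quantities above, entropy and channel capacity, can be illustrated with a short sketch. This is a minimal illustration, not from the source: `entropy` implements Shannon's H = −Σ pᵢ log₂ pᵢ, and `bsc_capacity` applies the standard closed form C = 1 − H(p) for a binary symmetric channel with crossover probability p (the function names are ours).

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)).

    Zero-probability outcomes contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover probability p: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin is maximally uncertain: exactly one bit per flip.
print(entropy([0.5, 0.5]))  # 1.0
# A channel that flips half its bits carries no information.
print(bsc_capacity(0.5))    # 0.0
```

A biased source has lower entropy than a fair one, which is exactly the slack that compression exploits; symmetrically, the noisier the channel, the lower the rate at which reliable communication is possible.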