the mathematical study of quantifying, storing, and communicating information
founded by Claude Shannon in 1948. quantifies information as surprise: the less probable a message, the more information it carries, formally the negative log of its probability
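a minimal sketch of self-information in python, assuming log base 2 (bits); the probabilities are illustrative:

```python
import math

def self_information(p: float) -> float:
    """bits of surprise carried by an event with probability p"""
    return -math.log2(p)

print(self_information(0.5))   # 1.0 bit: a fair coin flip
print(self_information(0.01))  # ~6.64 bits: a rarer message carries more information
```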
provides the theoretical foundation for several core cyber measures:
- syntropy — the negentropy of the cybergraph, measuring how much structure neurons have created
- focus distribution entropy — how evenly attention spreads across particles
- KL divergence — quantifies the difference between prior and posterior relevance distributions in cyber/epistemology (see the sketch after this list)
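a minimal sketch of KL divergence; the prior and posterior here are illustrative toy distributions, not cyber's actual relevance values:

```python
import math

def kl_divergence(p, q):
    """D(p || q) in bits; assumes both distributions share support"""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

prior     = [0.25, 0.25, 0.25, 0.25]  # relevance before new evidence arrives
posterior = [0.70, 0.10, 0.10, 0.10]  # relevance after evidence concentrates
print(kl_divergence(posterior, prior))  # ~0.64 bits of belief update
```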
key concepts: entropy, mutual information, channel capacity, source coding, rate-distortion
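entropy is the first of these and underlies the focus distribution measure above: it is maximal when attention spreads evenly across particles and falls as attention concentrates. a sketch with illustrative values:

```python
import math

def entropy(p) -> float:
    """Shannon entropy H(p) in bits: expected surprise over the distribution"""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25] * 4              # attention spread evenly across 4 particles
skewed  = [0.97, 0.01, 0.01, 0.01]

print(entropy(uniform))  # 2.0 bits: maximal for 4 outcomes
print(entropy(skewed))   # ~0.24 bits: attention concentrated on one particle
```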
information theory connects thermodynamics to computation. entropy in physics and entropy in communication are the same quantity measured in different units. this bridge grounds cyber's syntropy metric in physical law
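the bridge is the Boltzmann constant: Gibbs entropy S = k_B ln Ω has the same functional form as Shannon entropy, so converting bits to joules per kelvin is a multiplication by k_B ln 2 (Landauer's limit follows: erasing one bit dissipates at least k_B T ln 2 of energy). a worked conversion:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def bits_to_joules_per_kelvin(h_bits: float) -> float:
    """convert Shannon entropy in bits to thermodynamic entropy: S = k_B * ln(2) * H"""
    return K_B * math.log(2) * h_bits

print(bits_to_joules_per_kelvin(1.0))  # ~9.57e-24 J/K per bit
```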
Shannon's noisy-channel coding theorem guarantees reliable communication at any rate below channel capacity, the theoretical limit that shapes how neurons can compress knowledge into cyberlinks
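for the simplest noisy channel, the binary symmetric channel, capacity is C = 1 - H(p) bits per use, where p is the bit-flip probability. a sketch (the channel model is illustrative, not a claim about how cyberlinks are encoded):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) for a Bernoulli(p) source, in bits"""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """capacity of a binary symmetric channel: reliable rates exist below this"""
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))   # 1.0: noiseless channel
print(bsc_capacity(0.11))  # ~0.5: half the raw bit rate survives the noise
```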
see syntropy, information, focus, cyber/epistemology, computation