the mathematical study of quantifying, storing, and communicating information

founded by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication". defines information as surprise: the less probable a message, the more information it carries
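a minimal sketch of this surprise measure, self-information I(x) = -log2 p(x), in Python (the function name is illustrative, not from any library):

```python
import math

def surprisal_bits(p: float) -> float:
    """self-information of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

print(surprisal_bits(0.5))    # 1.0 bit: a fair coin flip
print(surprisal_bits(0.001))  # ~9.97 bits: a 1-in-1000 event is far more surprising
```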

provides the theoretical foundation for several core cyber measures

key concepts: entropy, mutual information, channel capacity, source coding, rate-distortion
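a small illustration of the first two concepts, using only the standard library (`entropy` and `mutual_information` are hypothetical helpers, not from any cyber codebase):

```python
import math

def entropy(probs) -> float:
    """shannon entropy H(X) = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# two perfectly correlated bits share exactly 1 bit of information
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # 1.0
```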

information theory connects thermodynamics to computation. entropy in physics and entropy in communication are the same quantity measured in different units. this bridge grounds cyber's syntropy metric in physical law
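the correspondence is a change of units: gibbs entropy S = -k_B Σ p ln p equals shannon entropy in bits times k_B ln(2). a sketch of the conversion, assuming that standard identity (the boltzmann constant is exact SI; the function name is illustrative):

```python
import math

K_B = 1.380649e-23  # boltzmann constant in J/K (exact in SI since 2019)

def bits_to_thermo_entropy(h_bits: float) -> float:
    """convert shannon entropy in bits to thermodynamic entropy in J/K:
    S = k_B * ln(2) * H."""
    return K_B * math.log(2) * h_bits

print(bits_to_thermo_entropy(1.0))  # ~9.57e-24 J/K for one bit of uncertainty
```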

Shannon's noisy-channel coding theorem guarantees that reliable communication is possible at any rate below channel capacity, and impossible above it: the theoretical limit that shapes how neurons can compress knowledge into cyberlinks
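a concrete instance of this limit: the binary symmetric channel, which flips each bit with probability p, has capacity C = 1 - H(p), where H is the binary entropy function. a minimal sketch (function names illustrative):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """capacity of a binary symmetric channel: C = 1 - H(p).
    reliable communication is achievable at any rate below C."""
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))   # 1.0  (noiseless channel: every bit counts)
print(bsc_capacity(0.11))  # ~0.5 (noise halves the usable rate)
```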

see syntropy, information, focus, cyber/epistemology, computation
