info/theory

the mathematical study of information: its quantification, storage, and communication. founded by Shannon in 1948, information theory provides the universal language for reasoning about signals, noise, compression, and channel capacity

core concepts

entropy — the measure of uncertainty in a random variable. H(X) = −Σ p(x) log p(x). the fundamental quantity: everything else derives from it
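
a minimal sketch in Python, assuming a discrete distribution given as a list of probabilities; log base 2 gives bits:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p(x) log p(x); base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin is one full bit of uncertainty
print(entropy([0.9, 0.1]))  # ~0.47: a biased coin is more predictable, so less
```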

channel capacity — the maximum rate at which information can be reliably transmitted through a noisy channel. Shannon's noisy-channel coding theorem proves that communication with arbitrarily small error probability is possible at any rate below capacity and impossible above it
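
for the binary symmetric channel, which flips each bit with probability p, capacity has the closed form C = 1 − H_b(p). a quick sketch with illustrative values:

```python
import math

def binary_entropy(p):
    """H_b(p): entropy of a coin that comes up heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: noiseless, one full bit per channel use
print(bsc_capacity(0.11))  # ~0.5: half a bit per use survives the noise
print(bsc_capacity(0.5))   # 0.0: pure noise, nothing gets through
```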

compression — removing redundancy. lossless compression can approach the entropy rate but never beat it. the crystal's irreducibility principle is an information-theoretic claim: no particle is compressible given the rest
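
an empirical check of the entropy floor, using Python's zlib as a stand-in compressor; any lossless compressor lands above the bound, never below:

```python
import math
import random
import zlib

random.seed(0)
n = 100_000
# memoryless source: 'a' with probability 0.9, 'b' with probability 0.1
data = ''.join(random.choices('ab', weights=[0.9, 0.1], k=n)).encode()

h = -(0.9 * math.log2(0.9) + 0.1 * math.log2(0.1))  # ~0.469 bits/symbol
print(f"entropy bound: {h * n:,.0f} bits")
print(f"zlib level 9 : {len(zlib.compress(data, 9)) * 8:,} bits")
```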

mutual information — how much knowing X tells you about Y. I(X;Y) = H(X) − H(X|Y). cross-domain bridges in the crystal are high-mutual-information pairs
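
computing it directly from a joint distribution, here a toy nested list joint[x][y], nothing protocol-specific:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution joint[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0: X determines Y
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0: independent
```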

Kullback-Leibler divergence — the information cost of using the wrong distribution: D(p‖q) = Σ p(x) log (p(x)/q(x)). cyberank divergence between human and machine neurons is measurable as KL divergence over focus distributions
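
a minimal sketch; two focus distributions would play the roles of p and q here, but the numbers below are made up:

```python
import math

def kl_divergence(p, q):
    """D(p||q) in bits, assuming q > 0 wherever p > 0: the expected
    extra code length from modeling source p with distribution q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# coding a 90/10 source as if it were 50/50 wastes ~0.53 bits per symbol
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```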

for cyber

the protocol is an information-theoretic system. particles are messages. cyberlinks are channels. bandwidth limiting enforces capacity constraints. focus is a relevance measure derived from the graph's information structure. the crystal's 5,040 particles target maximum coverage with minimum redundancy — an information-theoretic optimization problem

key results

  • source coding theorem: lossless compression cannot beat entropy
  • channel coding theorem: reliable communication up to capacity
  • rate-distortion theory: lossy compression tradeoffs
  • Landauer principle: erasing one bit costs at least kT ln 2 joules — unifying info and energo (see the sketch after this list)
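
a back-of-envelope check of the Landauer bound, at an assumed room temperature of 300 K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K, exact in the 2019 SI
T = 300.0           # assumed room temperature in K

# minimum heat dissipated to erase one bit
print(k_B * T * math.log(2))  # ~2.87e-21 J
```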

key figures

Claude Shannon, Ludwig Boltzmann, Norbert Wiener, Rolf Landauer
