entropy

A measure of the number of microscopic configurations consistent with a given macroscopic state; it quantifies disorder and missing information.

  • second law of thermodynamics: the total entropy of an isolated system never decreases
  • defines the arrow of time: the direction in which entropy increases
  • Boltzmann entropy: S = k_B ln W, where W is the number of accessible microstates
  • Shannon entropy: H = -Σ p_i log2 p_i, the information-theoretic analog; it measures uncertainty in bits and bridges physics and information theory (a Python sketch of both formulas follows this list)
  • maximum entropy at thermal equilibrium — see temperature and thermodynamics
  • black hole entropy (Bekenstein-Hawking): S = k_B c^3 A / (4 G ħ), proportional to the horizon area A, linking gravity, quantum mechanics, and information theory (a numerical sketch follows this list)
  • free energy: F = U - TS (Helmholtz), i.e. energy minus temperature times entropy; its decrease governs spontaneous processes (a sketch follows this list)
  • low-entropy initial conditions of the universe are a central puzzle in cosmology
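
The Boltzmann and Shannon formulas above are close relatives. A minimal Python sketch, with illustrative function names and an arbitrary W = 1024 chosen here for the example, shows that for W equiprobable microstates the two measures agree up to a factor of k_B ln 2:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(W: int) -> float:
    """Boltzmann entropy S = k_B ln W for W equally likely microstates, in J/K."""
    return K_B * math.log(W)

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum(p log2 p) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For W equiprobable microstates the two measures coincide up to units:
# H = log2(W) bits, and S = (k_B ln 2) * H.
W = 1024
H = shannon_entropy([1.0 / W] * W)   # 10.0 bits
S = boltzmann_entropy(W)             # ~9.57e-23 J/K
print(H, S, abs(S - K_B * math.log(2) * H) < 1e-30)
```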
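
For the Bekenstein-Hawking entry, a numerical sketch in the same spirit: the constants are standard SI/CODATA values, the function name is hypothetical, and the solar-mass example is purely illustrative:

```python
import math

G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0       # speed of light, m/s
HBAR = 1.054571817e-34  # reduced Planck constant, J s
K_B = 1.380649e-23      # Boltzmann constant, J/K
M_SUN = 1.989e30        # solar mass, kg (approximate)

def bekenstein_hawking_entropy(mass_kg: float) -> float:
    """S = k_B c^3 A / (4 G hbar) for a Schwarzschild black hole of the
    given mass, where A is the horizon area; result in J/K."""
    r_s = 2.0 * G * mass_kg / C**2    # Schwarzschild radius, m
    area = 4.0 * math.pi * r_s**2     # horizon area, m^2
    return K_B * C**3 * area / (4.0 * G * HBAR)

# A solar-mass black hole carries roughly 1e54 J/K (about 1e77 in units
# of k_B), enormously more than the same mass as ordinary matter.
print(f"{bekenstein_hawking_entropy(M_SUN):.3e} J/K")
```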
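
And for the free-energy entry, a sketch assuming hypothetical ΔU and ΔS values, showing how the sign of ΔF = ΔU - TΔS flips with temperature:

```python
def free_energy_change(dU: float, T: float, dS: float) -> float:
    """Helmholtz free-energy change ΔF = ΔU - TΔS (J); ΔF < 0 marks a
    spontaneous process at constant temperature and volume."""
    return dU - T * dS

# Hypothetical process: it releases energy (ΔU < 0) but also lowers entropy
# (ΔS < 0), so it stops being spontaneous once T is high enough.
dU, dS = -40e3, -50.0   # J and J/K, illustrative values only
for T in (300.0, 1000.0):
    dF = free_energy_change(dU, T, dS)
    verdict = "spontaneous" if dF < 0 else "non-spontaneous"
    print(f"T = {T:6.1f} K  ΔF = {dF/1e3:+6.1f} kJ  {verdict}")
```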