fundamental unit of information, a binary digit: 0 or 1

coined by John W. Tukey; Claude Shannon first used the term in print in his 1948 paper "A Mathematical Theory of Communication", which founded information theory

Shannon entropy: H = -Σ p_i log2(p_i), summed over all outcomes i; measured in bits when the logarithm is base 2
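
a minimal sketch of the entropy formula above in Python (the function name and example distributions are mine, for illustration):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0    -- a fair coin is exactly 1 bit
print(shannon_entropy([0.9, 0.1]))  # ~0.469 -- a biased coin carries less information
```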

8 bits = 1 byte, the standard addressable unit of computer memory

kilobit, megabit, gigabit, terabit: network speeds are quoted in bits per second with decimal prefixes (1 megabit = 10^6 bits), not in bytes
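
since speeds are quoted in bits but file sizes are usually in bytes, dividing by 8 converts between them; a quick sketch (the 100 Mbit/s figure is just an assumed example):

```python
link_mbps = 100                    # assumed link speed: 100 megabits per second
megabytes_per_sec = link_mbps / 8  # 8 bits per byte

print(megabytes_per_sec)  # 12.5 -- a "100 Mbps" link moves at most 12.5 MB/s
```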

a fair coin flip carries exactly 1 bit of information: H = -(0.5 log2 0.5 + 0.5 log2 0.5) = 1

quantum computing uses the qubit, whose state is a superposition of 0 and 1; measuring it yields a classical bit
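
in standard Dirac notation, the qubit state and the measurement probabilities (the Born rule) are:

```latex
% a qubit is a normalized superposition of the basis states |0> and |1>
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]
% measurement collapses the state to a classical bit
\[
  P(0) = |\alpha|^2, \qquad P(1) = |\beta|^2
\]
```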

data compression aims to reduce the number of bits needed to represent a message
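
a quick illustration with Python's standard zlib module (the message is a deliberately repetitive toy example):

```python
import zlib

message = b"abab" * 100          # 400 bytes = 3200 bits of highly repetitive data
packed = zlib.compress(message)  # DEFLATE exploits the repetition

print(len(message) * 8, "bits before")
print(len(packed) * 8, "bits after")  # far fewer bits for the same information
```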

all digital computation reduces to operations on bits: AND, OR, XOR, NOT
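
the four operations on 4-bit example values in Python (values chosen arbitrarily for illustration):

```python
a, b = 0b1100, 0b1010

print(format(a & b, "04b"))        # 1000 -- AND: 1 only where both inputs are 1
print(format(a | b, "04b"))        # 1110 -- OR: 1 where either input is 1
print(format(a ^ b, "04b"))        # 0110 -- XOR: 1 where the inputs differ
print(format(~a & 0b1111, "04b"))  # 0011 -- NOT: flip every bit (masked to 4 bits)
```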
