the inverse variance of a prediction error — a measure of how confident an agent is in a particular signal

in active inference: precision determines which prediction errors get amplified and which get suppressed. high precision = this signal is reliable, weight it heavily. low precision = this signal is noisy, down-weight it

attention in the Fristonian framework IS precision-weighting: attending to something means increasing the gain on prediction errors from that source
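a minimal sketch of this gain modulation, assuming a toy setup with two channels reporting the same latent quantity (the numbers and variable names are illustrative, not from any particular model):

```python
import numpy as np

# two sensory channels predicting / observing the same latent quantity
predictions = np.array([10.0, 10.0])
observations = np.array([12.0, 7.0])

# precision = inverse variance of each channel's noise;
# channel 0 is reliable (low variance), channel 1 is noisy
variances = np.array([0.5, 4.0])
precisions = 1.0 / variances

errors = observations - predictions

# precision-weighting: the reliable channel's error is amplified,
# the noisy channel's error is suppressed
weighted_errors = precisions * errors

# the belief update is a precision-weighted average of the errors,
# so the posterior is pulled mostly toward the reliable channel
posterior = predictions[0] + weighted_errors.sum() / precisions.sum()
```

the posterior lands near 11.44: much closer to the reliable channel's 12 than the noisy channel's 7, which is exactly the "weight it heavily / down-weight it" behavior described above.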

in cyber

precision maps to token staking in the cybergraph:

  • high stake on a cyberlink = high precision = the neuron is confident this connection is real
  • low stake = low precision = uncertain, tentative link
  • staking amplifies the signal in the tri-kernel computation — precisely the gain modulation that precision provides in brains
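a hypothetical sketch of stake-as-gain in a rank computation. this is a generic PageRank-style iteration, not cyber's actual tri-kernel implementation — the matrix, damping factor, and iteration count are all illustrative assumptions:

```python
import numpy as np

n = 3  # particles

# adjacency of cyberlinks, weighted by stake: heavy stake on 0->1,
# tentative low-stake link 0->2 (values are made up for illustration)
stake = np.array([
    [0.0, 5.0, 0.1],
    [0.0, 0.0, 2.0],
    [1.0, 0.0, 0.0],
])

# row-normalize so each particle distributes its influence
# in proportion to relative stake on its outgoing links
transition = stake / stake.sum(axis=1, keepdims=True)

# PageRank-style fixed point: high-stake links carry more rank
pi = np.full(n, 1.0 / n)
damping = 0.85
for _ in range(100):
    pi = (1 - damping) / n + damping * transition.T @ pi
```

nearly all of particle 0's outgoing rank flows along the high-stake 0->1 link rather than the tentative 0->2 link — stake acting as gain, the same role precision plays on prediction errors.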

this makes precision an economic signal: backing beliefs with value. gaming precision (staking heavily on false connections) is punished by slashing — skin in the game

the precision-attention equivalence

| predictive coding | cyber |
| --- | --- |
| increase precision on a sensory channel | stake more tokens on a particle or cyberlink |
| suppress low-precision errors | low-stake links contribute less to π |
| attention = selective precision | focus = stake-weighted attention distribution |
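the last row of the correspondence can be sketched directly: normalize stakes into an attention distribution and use it to weight incoming signals. the stake values and signals here are made-up illustrations, not protocol data:

```python
import numpy as np

# tokens staked on three cyberlinks (hypothetical values)
stakes = np.array([50.0, 5.0, 1.0])

# focus = stake-weighted attention distribution (sums to 1)
attention = stakes / stakes.sum()

# signals arriving from each linked particle
signals = np.array([0.9, 0.2, -0.5])

# the attended aggregate is dominated by the high-stake link,
# just as high-precision channels dominate a belief update
attended = attention @ signals
```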

see active inference for the framework. see free energy principle for the theory. see predictive coding for the neural architecture
