the ML word for learning — and where the analogy breaks
in ML, training is one-directional: data goes in, model weights come out, and once training ends, inference begins. in cyber, every cyberlink is a weight update to the cybergraph, so learning and inference run continuously, interleaved. the graph is the model, and millions of neurons train it at once
the word training captures the write operation but misses the observation loop that makes learning alive (see intelligence). see collective learning for the aggregate effect
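the "graph is the model" claim can be made concrete with a toy sketch. the class and method names below (Cybergraph, link, rank) are hypothetical illustrations, not the actual cybergraph protocol or its ranking algorithm; the point is only that the write (a cyberlink) and the read (a rank query) operate on the same live structure, with no separate training phase:

```python
from collections import defaultdict

class Cybergraph:
    """toy model: the graph itself is the model being trained."""

    def __init__(self):
        # adjacency map: source particle -> {target particle: link weight}
        self.edges = defaultdict(lambda: defaultdict(int))

    def link(self, src: str, dst: str) -> None:
        # "training": every cyberlink is an immediate weight update
        self.edges[src][dst] += 1

    def rank(self, particle: str) -> float:
        # "inference": a particle's share of total link weight,
        # computed against the current graph state
        total = sum(w for targets in self.edges.values()
                    for w in targets.values())
        if total == 0:
            return 0.0
        incoming = sum(targets.get(particle, 0)
                       for targets in self.edges.values())
        return incoming / total

g = Cybergraph()
g.link("query:ml", "answer:training")       # learn
print(g.rank("answer:training"))            # infer on the live graph -> 1.0
g.link("query:ml", "answer:inference")      # keep learning after inference
print(g.rank("answer:training"))            # same query, updated answer -> 0.5
```

note how the second rank call returns a different score with no retraining step in between: the write already was the weight update, which is the property the ML word "training" fails to capture.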