Discovering the Intelligence of Every Living Thing
By: Frank T. Vertosick Jr.
Intelligence is loosely defined as the general ability to store past experiences and use that knowledge to solve future problems.
The central idea is that all living systems (anything from a population of bacteria, to chemical reactions inside cells, to nerve cells in the brain) are intelligent and are able to learn, just not in exactly the same way or at the same pace.
A highly conceptual and speculative book, written by a neurosurgeon. As in “The User Illusion”, “Ubiquity” and “Linked”, network theory is put to good use, in this case to help explain biological phenomena.
The book does a great job of describing the similarities between different biological systems that display some type of intelligence, such as neurons in the brain, enzymes/chemical reactions in a cell, or lymphocytes in the immune system. The descriptions help shift your perspective away from intelligence as purely “brain intelligence” and show that all these networks work in much the same way, can be labeled similarly intelligent from a network perspective, and have a huge capacity to adapt and learn.
A very interesting critique of Dawkins and the “selfish gene”. DNA contains the blueprint of connection weights among proteins. In the context of the evolution of the living system, any learning may appear random: the blueprint, the DNA, is only ever altered through mutation, i.e. random copying errors. But that is not where the learning, the adaptation, takes place. Learning is implemented by way of competition among the parts of the system: parts compete for external inputs and are rewarded by a strengthening of connections (survival of the fittest: the fit get richer – also see “Linked”). The competition is therefore not among selfish genes (mutated versions of DNA), but among certain parts of the system (for instance, neurons in the brain) that act in a “selfish” manner as they compete for stronger connections. This is how the random process of mutation (at the DNA level) can lead to non-random, adaptive and complex outcomes (at the network level); why network learning emerges from selfish neurons (not genes) employing the learning rules of competition. All biological learning is a local fitness competition operating on the connection weights of the network parts.
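The “fit get richer” idea above can be sketched in code. This is my own toy illustration, not from the book: a few “parts” compete for each incoming stimulus, the best-matching part wins, and only the winner strengthens its connections to that stimulus. Nothing in the weight update is random; the adaptation lives entirely in the local competition.

```python
import random

random.seed(0)
N_PARTS, N_INPUTS = 3, 4
# Random initial connection weights for each part (hypothetical values).
weights = [[random.random() for _ in range(N_INPUTS)] for _ in range(N_PARTS)]

def respond(part, stimulus):
    """A part's response: how strongly its weights match the stimulus."""
    return sum(w * s for w, s in zip(weights[part], stimulus))

def compete_and_learn(stimulus, rate=0.1):
    """The fittest part wins the stimulus and strengthens its weights."""
    winner = max(range(N_PARTS), key=lambda p: respond(p, stimulus))
    for i, s in enumerate(stimulus):
        # Move the winner's weights toward the stimulus (bounded growth).
        weights[winner][i] += rate * (s - weights[winner][i])
    return winner

stimulus = [1.0, 0.0, 1.0, 0.0]
first_winner = compete_and_learn(stimulus)
# Repeated exposure: the same part keeps winning this stimulus,
# because each win strengthens exactly the connections that won it.
winners = [compete_and_learn(stimulus) for _ in range(20)]
```

The design choice worth noticing: the winner's response to this stimulus strictly increases after every win, while the losers stay put, so the competition locks in a stable specialization – a local, non-random outcome built from random starting weights.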
Inside the brain, random agitation of neurons caused by Brownian motion can be linked to internal thought (adjusting the connection weights in the absence of external inputs). This is kind of wild – a biological explanation for the process of associating, categorizing, adjusting weights. Perhaps this is a partial explanation of the origin of “random” thoughts and urges.
- No disruption, no learning.
- Disruption triggers adaptation: need for system to find a new balance.
- Process of adaptation: fitness competition among parts of the system.
- Results in learning: modified, stronger connections between the parts.
- Learning: the act of modifying connection weights in response to “experience”.
- Limits of reductionist analysis.
- Difficult to understand behavior of a system by analyzing its individual parts.
- Feedback loops and interaction make single, simple causes unlikely.
- Complex behavior emerges from even the simplest of ground rules.
- Helps to explain the conditions for living systems.
- Defines the different ways in which a system can be intelligent.
- Intelligent networks.
- Connected, non-linear input/output parts (for instance, neurons).
- Each part has at least two stable states (in the most simple case: on or off).
- Weight of the connections between parts determines the way information flows through a network.
- Learning: modifying these connection weights in response to “experience”.
- Learning rules.
- Determine how the weights change in response to environmental influences.
- Typically, Hebbian learning.
- Two parts are “on” at the same time -> connection strengthens.
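The Hebbian rule above (“fire together, wire together”) fits in a few lines. A minimal sketch of my own, not code from the book: the weight between two binary parts grows only in the steps where both are “on”, and is left alone otherwise.

```python
def hebbian_step(weight, a_on, b_on, rate=0.05):
    """Strengthen the connection only when both parts fire together."""
    if a_on and b_on:
        weight += rate * (1.0 - weight)  # bounded growth toward 1.0
    return weight

w = 0.1
# Hypothetical co-activity of parts A and B over five time steps.
history = [(1, 1), (1, 0), (1, 1), (0, 0), (1, 1)]
for a, b in history:
    w = hebbian_step(w, a, b)
# Three co-activations strengthened the link; the other steps left it unchanged.
```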
- Living systems act in a non-linear way.
- Behavior of network parts is guided by feedback and interaction between the parts.
- Emerging behavior of the network as a whole is complex and difficult to understand.
- Even though “ground rules” may be simple.
- In some situations, connections are pre-determined or fixed.
- Rather than modified in response to external stimuli.
- For instance, in genes or the brain certain connection weights are “fixed”.
- Certain synaptic connections are hardwired in the genes.
- Leads to instinctive behavior such as generic motor capabilities.
- In bacteria, the ability to resist antibiotics is hardwired.
- In such situations, we speak of genetic learning.
- Most traits are passed on from one generation to the next through more or less fixed genes.
- Any additional learning occurs through random mutations across generations (evolution).
- On the other hand, variable connection weights allow for more immediate, non-genetic, direct learning.
- Balance between genetic and non-genetic learning.
- Determined by factors such as replication speed and size of the network.
- Mammals: bad at genetic learning (slow reproduction) and good at non-genetic learning (large brains with many “hidden layers” of neurons that can alter their connection strengths).
- Community of bacteria: very good at genetic learning (through fast cycles of hypermutation) and bad at non-genetic learning.
- Random mutations (noise) drive evolution and learning.
- Managing the signal to noise ratio determines the learning rate.
- High noise:
- More random associations and more learning.
- High energy cost.
- Children have high levels of noise and creative thinking.
- As their thinking and behavioral patterns get more fixed, adults have lower noise levels.