The information theory of individuality

By: David Krakauer, Nils Bertschinger, Eckehard Olbrich, Jessica C. Flack, Nihat Ay

In: Theory in Biosciences

Date: 24 March, 2020

Introduction

  • There is little agreement (in biology) about what individuals are.
    • Few rigorous quantitative methods for their identification.
  • An information-theoretic approach to perceiving and defining individuals:
    • Individuals are aggregates that preserve a measure of temporal integrity.
    • They “propagate” information from their past into their futures.
    • Systems that are sufficient predictors of their own future.
  • Derivation of three principled and distinct forms of individuality:
    • An organismal, a colonial, and a driven form.
    • Depending on the degree of environmental dependence and inherited information.
  • This approach allows you to expand the scope of individuals …
    • Multi-scale, highly distributed, and without physical boundaries.
  • … and investigate how adaptive (learning) systems emerge during evolution.
    • Organizing principle is maximizing the reduction of environmental uncertainty.
    • Nested structures emerge that compress information into “slower” variables that lead to better predictions (“coarse-graining”).

Biology and the need to define individuality

  • From the perspective of physics and chemistry, biological life is surprising.
    • Physics and chemistry are universal.
    • Biology may exclusively be a property of Earth.
  • There is an asymmetry between the certainty of what you learn from working down into a system’s components (Reduction) …
    • When you break down a system into ever more elementary constituents, nothing is left unaccounted for.
    • As in: break down a biological system, there is no unexplained chemistry or physics left.
  • … versus the uncertainty of working up through aggregation (Emergence).
    • Difficult to predict properties of aggregates from knowledge of constituents.
    • As in: no physical or chemical theory can predict biology.
  • So, when you want to explore biological ideas, you can’t start from the first principles of physics and chemistry.
  • Biological science requires picking a level or unit of analysis (individual).
    • To be able to talk about metabolism, behavior, the genome.
    • Human perceptual bias for certain kinds of analysis and aggregation.
  • But what constitutes an individual?
    • How far up or down do you go?
    • What principles do you use to explain aggregate properties?
  • Ideal case:
    • Perceiving and defining an individual relies on minimal prior knowledge.

Previous approaches to define individuality

  • Principles of replication and shared genetic ancestry.
    • Individuals increase in relative frequency by exploiting a source of metabolic free energy.
    • Individuals respond adaptively to environment.
    • Individuals have tightly coordinated relationships (chemical, physiological, computational) among their parts.
  • Principles of members (individuals) and complement (environment).
    • Separation of self and non-self.
    • Individual as a temporal aggregate encoding a common past separable from the past of other aggregates.
    • Individual as a spatially bounded collection of metabolic reactions insulated by a membrane from reactions in the environment.
    • Individual as the unit of selection and evolutionary change.
  • These approaches struggle to explain individuality at multiple organizational levels.
    • Ants versus ant colonies (only some ants reproduce, the colony as a whole adapts).
    • Viruses (can replicate, adapt, have a persistent identity, but rely on environment for replication).

A different approach: how to separate the figure and the background

  • The background of an image carries as much information as the object, if not more.
    • The challenge is to separate the two.
    • Rather than assume that they are already distinct and independent.
  • Individuality can be continuous.
    • Some processes possess greater individuality than others.
  • Individuality can emerge at any level of organization.
    • Find fundamental, rather than derivative, properties of individuality.
  • Individuality can be nested.
    • Individuals are information hierarchies.
    • Their components estimate regularities in the (fast-moving) environment.
    • Hierarchy of components compresses time series data into “slower” variables.
    • If slower variables better predict future than fast underlying components, new levels of organization can emerge.
    • Use perceived regularities to tune collective strategies.
  • So, individuals are best thought of in terms of dynamical processes.
    • Not as stationary objects that leave information-theoretic traces.

Information theory approach

  • Individuals.
    • Are aggregates that propagate information from the past to the future.
    • Have temporal integrity.
    • Meaning: knowing their past reduces uncertainty about their future.
  • Entropy.
    • Initial framework: measures the energy lost from the total energy available for performing work (Clausius, 1860s).
    • Then: measures the potential disorder in a system (number of unobservable microstates consistent with observed macrostate) (Boltzmann, 1877).
  • Shannon entropy in information theory (a code sketch follows this list):
    • The maximum number of distinct states that can be transmitted from one point to another across a channel, in the face of noise.
    • A target word will be disordered during transmission in proportion to the noise in a channel.
    • Information is minimized when predictability is maximized.
      • High entropy = high information = many possible states = minimal predictability.
      • Low entropy = low information = outcome is known = maximal predictability.
      • [See “The User Illusion”: As the amount of entropy/disorder increases, more information is needed to describe a system (for instance, a sequence of 100 random numbers is more difficult to describe than a sequence of 100 zeroes). Information is directly linked with entropy, is similarly a measure of randomness/disorder and can be described as a measure of how surprised we are (there is more surprise in randomness than in order).].
    • Wide application:
      • Phone calls: increased entropy = fewer intact light pulses = bad reception.
      • DNA: increased entropy = mutations = altered phenotype.
  • If the information transmitted forward in time is close to maximal, evidence for individuality.
  • Defining properties:
    • Partitioning states into the system and its environment.
    • How does the current state (system or environment) determine the future state?
      • To what degree does the history of the system and/or the environment drive the future?
  • Three quantities, each corresponding to a type of individuality:
    • Colonial.
    • Organismal.
    • Environmental.
  • Quantified by measures in terms of (sketched in code after this list):
    • Shared information: shared by system and environment (e.g., adaptive information).
    • Unique information: unique to either the system or the environment (e.g., memory in each).
    • Regulatory information: depends in some complicated way (synergistically) on both the system and the environment.
  • Organismal individuality.
    • Maximizes shared information and unique information of the system.
    • Requires a large amount of private (unique) information for effective function.
    • Adaptation through shared information about the environment in which they live.
  • Colonial individuality.
    • Maximizes regulatory information and unique information of the environment.
    • Environmentally regulated aggregations.
    • Share only a small amount of information with the environment.
    • Adaptation through regulatory mechanisms and interaction with the environment.
  • Degree of environmental determination.
    • When minimized, the individual is not influenced by the environment and doesn’t adapt.
    • Unique environmental memory can be maintained by interacting with the system.
    • Environmental information can be encoded by inheritance of shared information (nature) or ongoing regulatory information interaction (nurture).
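
As a concrete illustration of the entropy bullets above, here is a minimal Python sketch (mine, not from the paper) of Shannon entropy, using the example from “The User Illusion”: a single certain outcome carries zero bits, while a uniform distribution over many states is maximally surprising.

    import math

    def shannon_entropy(probs):
        """Entropy in bits of a discrete distribution; zero-probability terms are skipped."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A sequence of 100 zeroes: one certain outcome, no surprise.
    print(shannon_entropy([1.0]))        # 0.0 bits
    # Each of 100 uniformly random digits: ten equally likely symbols, maximal surprise.
    print(shannon_entropy([0.1] * 10))   # ~3.32 bits per symbol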
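
And a hedged sketch of the decomposition itself. The paper splits the information that the present carries about the system’s next state into shared, unique, and synergistic (“regulatory”) parts; the exact partial-information measure the authors use is more subtle, so the simple minimum-mutual-information redundancy below is only a stand-in for “shared”.

    from collections import Counter
    from math import log2

    def mutual_info(pairs):
        """I(X;Y) in bits, estimated from a list of (x, y) samples."""
        n = len(pairs)
        pxy = Counter(pairs)
        px = Counter(x for x, _ in pairs)
        py = Counter(y for _, y in pairs)
        return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    def decompose(triples):
        """Split I(S_next; (S, E)) into shared / unique / synergistic parts.
        triples: list of (s_next, s, e) samples of system S and environment E."""
        i_s = mutual_info([(sn, s) for sn, s, _ in triples])        # system alone
        i_e = mutual_info([(sn, e) for sn, _, e in triples])        # environment alone
        i_se = mutual_info([(sn, (s, e)) for sn, s, e in triples])  # both jointly
        shared = min(i_s, i_e)  # crude redundancy stand-in, not the paper's measure
        return {
            "shared": shared,                      # adaptive information
            "unique_system": i_s - shared,         # system memory (organismal)
            "unique_env": i_e - shared,            # environmental memory
            "synergy": i_se - i_s - i_e + shared,  # regulatory (colonial)
        }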

How do adaptive systems emerge?

  • Organizing principle of adaptive systems: maximizing the reduction in environmental uncertainty (see the sketch after this list).
    • Regular environment (patterns).
    • Construction of a nested process (hierarchy).
      • Compression of fast, microscopic dynamics into slow variables (computation).
      • Slow variables become better predictors of the future than underlying fast movements of components (learning, prediction).
    • [Application of collective strategy.]
    • [Random variation in replication?]
  • [This perhaps explains the first stage of self-organization: why do connections form among parts to form a system – see “At Home in the Universe”. From there, systems (and order) potentially grow, as they self-organize and become sufficiently robust to replicate reliably and adapt when needed.]
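
A toy sketch of the slow-variable idea (the model and numbers are my assumptions, not the paper’s): many fast, noisy components share a slowly drifting common cause, and averaging them into a coarse-grained “slow” variable predicts a component’s next state better than any single fast component does.

    import random

    random.seed(1)
    N, T = 50, 2000                # number of fast components, time steps
    slow = 0.0                     # slowly drifting common cause
    err_fast = err_slow = 0.0

    for _ in range(T):
        slow += random.gauss(0, 0.01)                            # slow drift
        fast = [slow + random.gauss(0, 1.0) for _ in range(N)]   # fast, noisy parts
        coarse = sum(fast) / N                                   # coarse-grained slow variable
        nxt = slow + random.gauss(0, 1.0)                        # a component's next state
        err_fast += (nxt - fast[0]) ** 2    # predict from one fast component
        err_slow += (nxt - coarse) ** 2     # predict from the slow variable

    print("MSE from a fast component:", err_fast / T)   # ~2.0
    print("MSE from the slow variable:", err_slow / T)  # ~1.0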
