Why Information Grows: The Evolution of Order, from Atoms to Economies
By: Cesar Hidalgo
Published: 2015
Read: 2018
Summary:
The second law of thermodynamics dictates that entropy (disorder, randomness, chaos) should increase over time. Yet in the universe we see pockets of growing order, in the form of stable and increasingly complex biological, social and economic systems and mechanisms.
How and why does order grow despite the universe’s tendency towards disorder?
The author analyses how energy, matter and computation combine to produce emergent order, and how that order drives the accumulation of knowledge and knowhow in individuals and in complex social and economic networks.
Worth Reading:
The theoretical part is thorough, systematic and a great read. The practical application of the framework mostly results in describing things that are fairly obvious (the limits to an individual’s knowledge accumulation, the need for and cost of complex networks, etc.), and the implications for industrial structure are perhaps more interesting to macroeconomists.
I would have been more interested if the book had continued building out its theoretical framework.
Key Takeaways:
- Order needs constant energy (to emerge), the right type of matter (to last) and predictable computation (to grow).
- Knowledge and knowhow refer to the ability of a system to process information.
- Knowledge is the ability to predict the outcome based on an understanding of relationships / linkages.
- Knowhow involves the capacity to perform actions.
- Accumulated knowledge and knowhow are used to crystallize information in the form of new objects and products that augment society and drive growth and complexity.
- The ability of a system to accumulate knowledge and knowhow and grow information, to become more complex, is constrained by the computational capacity of the system (an economy) and the elements making up the system (humans, companies, networks).
- Limits to accumulating knowledge and knowhow:
  - In individuals: learning is experiential (we learn by doing) and social (we learn from others).
  - In networks: transaction costs and social bonds (trust).
- The diversity and complexity of a country’s industries are a strong predictor of its future growth.
Key Concepts:
Information is physical order. It is embodied in things (messages, arrangements of atoms): it is the particular state of something.
Information does not imply “meaning” in this framework (as the term does in colloquial use). Information only designates a particular state of something; meaning is derived by an agent who interprets that state through context, association and prior knowledge.
There are many states that things can be in, ranging from randomness (a gas) to a simple fixed state (ice) to complex order (DNA).
Information can then refer to the degree of order, the degree of complexity. Highly ordered states are rare, because there are many more ways for something to be random than ordered (think of the many possible random configurations of a Rubik’s cube vs. the single state where it is “solved”). Highly ordered states involve uncommon, correlated configurations that are difficult to arrive at. [But comfortable to remain in: see “The Vital Question”]
As time goes by, the universe moves from rare (ordered) configurations to common (more random) ones, the entropic march from order to disorder driven by chance.
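[Comment: to make the rarity of ordered states concrete, here is a quick count for the Rubik’s cube example above. This is the standard combinatorial breakdown, not something from the book:]

```python
from math import factorial

# Reachable configurations of a 3x3x3 Rubik's cube:
# 8! corner positions, 3^7 free corner orientations,
# 12!/2 edge positions (a parity constraint halves 12!),
# 2^11 free edge orientations.
configurations = factorial(8) * 3**7 * (factorial(12) // 2) * 2**11
print(f"reachable configurations: {configurations:,}")  # 43,252,003,274,489,856,000
print("solved configurations:    1")
```

[One ordered state against roughly 4.3 × 10^19 possible ones: a random shuffle essentially never lands on order.]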
[Comment: this book argues that entropy and information are opposites. Entropy is typically thought of as a measure of disorder, which is shorthand for saying that entropy measures the (average) number of states that a system can be in (molecules in a gas can be arranged in many different ways, so a gas is high in entropy; a solid piece of ice has essentially one possible arrangement and is therefore low in entropy). In nature, entropy increases over time, because there are many possible paths that lead away from order and far fewer that lead towards it, which is why information is rare. This seems to be the opposite of classic information theory, which I think states that information grows as entropy grows, because information is the minimum number of bits needed to describe the state of a thing, and it takes far more bits to describe a random pattern than an ordered one. Perhaps where this book talks about information (an increase in order), it would have been better to use a different label? Maybe something related to complexity?]
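[Comment continued: for reference, the two standard formulas being contrasted above; textbook notation, not the book’s:]

```latex
% Boltzmann: thermodynamic entropy grows with the number of
% microstates W a system can occupy
S = k_B \ln W

% Shannon: the expected number of bits needed to describe a source;
% maximal when all states are equally likely, i.e. most "random"
H(X) = -\sum_i p_i \log_2 p_i
```

[Both quantities grow with the number of accessible states, which is why calling growing order “information”, as the book does, reads as the inverse of Shannon’s usage.]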
So information [as defined here: an increase in order] is rare and difficult to construct.
So where does order, information [complexity] come from? The three main mechanisms are (i) it emerges from out-of-equilibrium systems, (ii) it lasts through the accumulation of information in solids and (iii) it grows due to the ability of matter to compute.
First, order emerges naturally in situations where a physical system that is out of equilibrium is consistently fed inputs/energy. The physical system must have many individual connected parts and have access to a constant flow of energy, allowing the system to self-organize into a steady state (think of a whirlpool that shows up as you drain a bath).
[Comment: The requirements are spelled out in more detail elsewhere: you need interacting parts, a certain scale, multiple potential states, continuous inputs, non-equilibrium – under the right conditions, the system will find a steady, optimal state on the border of chaos and order, where it can both adapt to changing conditions and maintain much of its state].
Your body is an out-of-equilibrium system. It needs a constant flow of energy to maintain itself in a steady state.
Second, order lasts (a whirlpool doesn’t last…) only when it is sticky, when it is captured in a solid (DNA does last). The solid matter needs to have a degree of structure – information can’t grow in solids that are too rigid or too fluid. It needs to be solid and rigid enough to retain information, but flexible enough to adapt when needed. Again, think of your own body, its cells, etc.
Third, order / information grows when it is processed. Matter processes information through computation – certain inputs lead to certain outputs (what happens in cells, inside organs, “life”, ecosystems). These input-output processes over time lead to the accumulation of knowledge and knowhow.
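[Comment: a toy illustration of matter computing, mine rather than the book’s: an elementary cellular automaton in which one fixed input-to-output rule, applied over and over, grows structure from a single cell:]

```python
# Rule 110: each cell's next state is a fixed function of its
# (left, centre, right) neighbourhood - bit 4l+2c+r of the number 110.
RULE = 110
cells = [0] * 31 + [1] + [0] * 31  # start from a single "on" cell
for _ in range(16):
    print("".join("#" if c else "." for c in cells))
    cells = [(RULE >> (4 * l + 2 * c + r)) & 1
             for l, c, r in zip([0] + cells[:-1], cells, cells[1:] + [0])]
```

[Sixteen applications of a trivial rule already produce a persistent, non-random pattern: computation turning repeated identical steps into accumulated structure.]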
Both knowledge and knowhow are capacities related to computation and refer to the ability of a system to process information. Knowledge is the ability to predict the outcome based on an understanding of relationships / linkages [I guess knowledge is one of the components necessary for intelligence / intelligent behavior: the ability to do the right thing at the right time]. Knowhow involves the capacity to perform actions.
The three mechanisms line up neatly with three requirements – energy for order to emerge, matter for it to last, and computation for it to grow – and all three are needed.
Humans use accumulated knowledge and knowhow to crystallize information in the form of new objects and products. These products augment individual capacities, share knowledge and knowhow among agents, and, through diversity and combination, drive increasing complexity. These knowledge products form the basis on which societies and economic systems are built.
The ability of a system to accumulate knowledge and knowhow and grow information, to become more complex, is constrained by the computational capacity of the system (an economy) and the elements making up the system (humans, companies, networks).
At the individual level, accumulating knowledge and knowhow is difficult and limited because learning is experiential (we learn by doing) and social (we learn from others).
At the network level (a firm, a network of firms, or a whole economy), the accumulation of knowledge and knowhow is limited by transaction costs – when it becomes more costly to do things internally than externally, networks outsource and stop learning. The easier it is to learn something, the cheaper the internal links within the network and the larger and more complex the network can become. As costs (transport, technology) come down over time, networks become larger and more complex.
Economic costs are not the only factors limiting the accumulation of knowledge and knowhow in networks. Social constructs such as “trust” help lower transaction costs and allow for larger, market-based networks (“low-trust” societies, in contrast, rely less on markets and more heavily on family-based networks). Social relationships often predate economic relationships, help form economic networks and allow such networks to adapt more easily (as the boundaries between organizations are more porous and information flows more freely – see Silicon Valley).
The interplay between the limitations of individuals and organizations to accumulate knowledge and knowhow determines how and where industries locate and grow. One can then analyze the resulting variety and complexity of a country’s industrial networks, which can be used as an indicator of future expected growth.
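[Comment: the notes don’t spell out how “variety and complexity” is measured. Hidalgo and Hausmann’s earlier “method of reflections” is the usual starting point, iterating between a country’s diversity and its products’ ubiquity; a minimal sketch under that assumption, variable names mine:]

```python
import numpy as np

def method_of_reflections(M, iterations=20):
    """Iterate country and product scores on a binary country x product
    matrix M, where M[c, p] = 1 if country c exports product p."""
    diversity = M.sum(axis=1).astype(float)  # products per country
    ubiquity = M.sum(axis=0).astype(float)   # countries per product
    k_c, k_p = diversity.copy(), ubiquity.copy()
    for _ in range(iterations):
        # a country's score becomes the average score of its products;
        # a product's score, the average score of the countries making it
        k_c, k_p = (M @ k_p) / diversity, (M.T @ k_c) / ubiquity
    return k_c, k_p

# toy example: 3 countries x 4 products
M = np.array([[1, 1, 1, 1],   # diversified country
              [1, 1, 0, 0],
              [0, 0, 1, 0]])  # country making one, rarer product
k_countries, k_products = method_of_reflections(M)
```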
Both biological networks (cells, organisms) and economic networks demonstrate the “nestedness” of information, with knowledge and knowhow accumulated and contained at discrete levels due to the limitations at each level of the system.