Introduction

There are several types of information quantities involved in pattern-formation processes, and they are typically of different orders of magnitude. A first piece of information, when one is presented with a specific chemical system, is the selection of molecules involved, both those present in the system and those allowed to flow over the system border. All this can loosely be referred to as genetic information, i.e., information on which components, and thus also which processes, will be part of the self-organising chemical system. The amount of this information is not very large, as is reflected, for example, by the size of genomes in living organisms, which are of the order of 10⁴ genes.

In biological systems, not all of the necessary information is genetically encoded; there is also compositional information in the transfer of chemicals and structures from the mother cell to the daughter cells. It has been proposed that this type of information may have played an important role in the origin of life (Segré et al., 2000).

Another type of information enters when a specific self-organising system starts to develop a pattern. The typical form of the pattern may be determined by the reaction scheme involved, but in many cases fluctuations in concentrations, or other disturbances, affect the exact pattern that is formed; an example is the difference in fingerprints between identical twins. One may view this as an information flow from the fluctuations to the actual pattern that is observed. This flow is of the same character as the flow from micro to macro found in chaotic systems (Shaw, 1981), and it can be characterised by the Lyapunov exponent, an important quantity in the analysis of chaotic dynamical systems. This perspective is taken up in the next chapter, which presents an information-theoretic view of low-dimensional chaotic systems.
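
As an illustrative sketch of this micro-to-macro amplification (not part of the original presentation), the following Python fragment estimates the Lyapunov exponent of the logistic map, a standard example of low-dimensional chaos; the function name and parameter choices are assumptions made for the example.

```python
import math

# Illustrative estimate of the Lyapunov exponent of the logistic map
# x -> r*x*(1-x). For r = 4 the exact value is ln 2, i.e. the map
# amplifies microscopic differences by one bit per iteration.
def lyapunov_logistic(r, x0=0.2, n_transient=1000, n_steps=100000):
    x = x0
    for _ in range(n_transient):       # discard the initial transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_steps):
        x = r * x * (1.0 - x)
        # Accumulate the log of the local stretching rate |f'(x)|; the
        # floor avoids log(0) in the rare case x hits 1/2 exactly.
        total += math.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
    return total / n_steps             # nats per iteration

print(lyapunov_logistic(4.0))          # approximately 0.693 = ln 2
```

A positive exponent means that information flows from the microscopic detail of the initial condition up to the observed macroscopic behaviour, at a rate of λ/ln 2 bits per iteration.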

The focus of the approach presented here is a third type of information quantity: the information capacity of free energy, or exergy, based on an information-theoretic formalism related to statistical mechanics and combined with geometric information theory. The starting point is the free energy of a chemical system, involving both the deviation from homogeneity, when a spatial pattern is present, and the deviation from equilibrium, when the system is stirred. This free energy is expressed as the total information of the system, which in our approach is decomposed into contributions from different positions and from different length scales. The connection with thermodynamics then allows us to view the inflow of free energy, due to the fact that the system is open to an inflow of fuel and an outflow of waste products, as an inflow of information capacity. This inflow allows information to accumulate in the system when a pattern is formed. Entropy production due to chemical reactions and diffusion leads to destruction of information, an information loss that can be balanced by the inflow of information capacity so that the chemical pattern is maintained. The following presentation builds on work previously published for closed chemical systems (Eriksson and Lindgren, 1987; Eriksson et al., 1987), later extended to open reaction-diffusion systems (Lindgren et al., 2004). For a detailed presentation of the information-theoretic background, see (Lindgren, 2008).
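
To make the first of these notions concrete, the following toy computation treats the deviation from homogeneity of a one-dimensional concentration profile as a relative entropy with respect to the homogeneous state. This is only a minimal sketch, assuming an ideal dilute system in units where k_BT is set to one; the full formalism of this chapter in addition decomposes the information over positions and length scales, and the function name pattern_information is hypothetical.

```python
import numpy as np

# Toy sketch: for an ideal dilute system, the free-energy excess of a
# spatial pattern over the homogeneous state with the same total amount
# of substance can be written as a relative entropy (in nats).
def pattern_information(c, dx):
    c = np.asarray(c, dtype=float)
    c_hom = c.mean()                   # homogeneous reference profile
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = np.where(c > 0.0, c * np.log(c / c_hom), 0.0)
    return integrand.sum() * dx        # discretised integral over space

x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
dx = x[1] - x[0]
flat = np.ones_like(x)                 # homogeneous state: no information
stripe = 1.0 + 0.8 * np.sin(3.0 * x)   # a striped pattern
print(pattern_information(flat, dx))   # ~ 0.0
print(pattern_information(stripe, dx)) # > 0: the pattern stores information
```

In an isolated system, diffusion monotonically erodes this quantity; the point developed in this chapter is that an open system can balance the loss by an inflow of information capacity.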

References:
- Eriksson, K.-E. and K. Lindgren (1987). "Structural information in self-organizing systems." Physica Scripta 35, 388-397.
- Eriksson, K.-E., K. Lindgren and B. Å. Månsson (1987). Structure, Context, Complexity, Organization. World Scientific, Singapore.
- Segré, D., D. Ben-Eli and D. Lancet (2000). "Compositional genomes: Prebiotic information transfer in mutually catalytic noncovalent assemblies." PNAS 97, 4112-4117.
- Shaw, R. (1981). "Strange attractors, chaotic behavior, and information flow." Zeitschrift für Naturforschung 36a, 80.