Much of biology assumes and utilizes a concept of individuality. In evolutionary theory, for example, populations are composed of Darwinian individuals that reproduce with differential success to create more individuals in the next generation. In fact, some of the most notable events in evolutionary history, like the transition from unicellular to multicellular organisms, are described as transitions in individuality. However, different fields of biology define individuals in different ways. In evolutionary theory, the definition is based on replicative units. In cell biology and organismal physiology, by contrast, individuality is defined in terms of properties like having a metabolism or having various internal control mechanisms that contribute to an agentic whole. In this case, the question, “what is a biological individual?” is addressed in terms of “what makes for a functioning unit?” Another perspective on individuality, common in the behavioral and cognitive sciences, is to identify the decision-making system, or the “agent”. For example, in the study of social insects, colonies are often assumed to operate as “superorganisms” because they behave intelligently at that level. When a colony of ants comes to consensus about which resources to extract or where to build a nest, we often describe this as the colony’s “choice”. Is the colony an individual? If we are looking for signs of agency, then an ant colony has a valid claim to individuality based on features like sharing a resource pool and division of labor. All in all, the art of finding individuals is in observing system-environment distinctions.

Image of an ant traversing a vast underground nest [1]
While philosophical discussions of individuality in biology are already very sophisticated, there is no quantitative framework with which to bring all of these use cases under a common interpretation. As long as we rely on derivative properties of a system rather than a common, fundamental definition of individuality, we run the risk of identifying only those individuals that are easiest to find. Furthermore, all of the definitions I have presented are spatial in that they point to physical attributes like a cell membrane or division of labor as evidence of individuality, even though each definition also implicitly requires that these ordered states persist in time. Individuality has a strong temporal component.
For this reason, a group of researchers from the Santa Fe Institute and the Max Planck Institute for Mathematics in the Sciences in Leipzig contend that information theory provides a more rigorous foundation for a unifying concept of individuality. In their paper, “The information theory of individuality”, they present a mathematical formalism for identifying “aggregates that ‘propagate’ information from the past to the future and have temporal integrity.” The information theoretic measures act like “lenses” that can be applied to data to find candidate individuals that maintain their information over time in distinction from their environment. Using information theory to define individuality has three important consequences for our assumptions: (1) individuality is a continuous variable, such that some systems can be more or less well-individuated than others; (2) individuality can emerge at any level of organization, such that no specific scale, like that of the replicating organism, is privileged; and (3) individuality is nested, such that the hierarchical property of life is retained.

First, I will describe the necessary information theoretic tools. Claude Shannon, in his famous paper “A Mathematical Theory of Communication,” defined the Shannon entropy function,

$$H(S) = -\sum_{i} P(s_i) \log_2 P(s_i),$$
where si are the possible outcomes of the random variable S and P(si) is the probability of state si being observed when a measurement is made. The function H(S) maps a random variable to an average measure of the “uncertainty” or “information” inherent in the possible outcomes. This can be confusing, so I’ll walk through the simple example of a coin flip, where s1 is Heads and s2 is Tails. Consider first a maximally biased coin with P(s1) = 1 and P(s2) = 0, such that the coin always lands on Heads. You would expect that no information is gained by observing the outcome of a flip, because there is no uncertainty at all inherent in the distribution. In this case,

$$H(S) = -\left(1 \cdot \log_2 1 + 0 \cdot \log_2 0\right) = 0,$$

taking 0 log2 0 = 0 by convention. Now consider an unbiased coin with P(s1) = 0.5 and P(s2) = 0.5, such that Heads and Tails occur with equal likelihood. In this case, you’d expect to receive the maximum information from observing the result of a coin flip, because there is the maximum amount of uncertainty in the outcome. Here,

$$H(S) = -\left(0.5 \log_2 0.5 + 0.5 \log_2 0.5\right) = 1.$$

Thus, we can say there is one “bit” of information inherent in observing the outcome of a fair coin flip.

The other information theoretic quantity we need is the “mutual information” between two random variables, I(S;R). This quantity measures the amount of information gained about one random variable by observing the other, thus capturing the mutual dependence between S and R. Imagine that S is a random variable for how rainy a given day will be and R is a random variable for how sunny. If you are told that it is raining heavily, then you would expect that it will not be a sunny day. We could say that S and R have some information in common, and knowing the state of one reduces the uncertainty in the other. I will not walk through the derivation or specifics of this function, but I will show its most convenient form,

$$I(S;R) = H(R) - H(R \mid S).$$

This says that the mutual information between random variables S and R is the difference between the entropy of R and the conditional entropy of R given S. Just as H(R) measures the uncertainty in R, the conditional entropy, H(R|S), measures the amount of uncertainty about R that remains if you know S. Hence, the mutual information is maximized when H(R|S) is minimized: if R can be perfectly predicted from S, then H(R|S) = 0 and I(S;R) is maximal.
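To make these quantities concrete, here is a minimal sketch in Python (my own, not from the paper) that reproduces the coin-flip entropies above and computes the mutual information of an invented rain/sun joint distribution; the specific probabilities are made up for illustration.

```python
# A minimal sketch of the entropy and mutual information calculations above,
# using only the Python standard library; the rain/sun probabilities are made up.
import math

def entropy(probs):
    """Shannon entropy H(S) = -sum_i P(s_i) log2 P(s_i), in bits (0*log0 -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))  # maximally biased coin -> 0.0 bits
print(entropy([0.5, 0.5]))  # fair coin             -> 1.0 bit

def mutual_information(joint):
    """I(S;R) = H(R) - H(R|S), from a dict mapping (s, r) pairs to probabilities."""
    p_s, p_r = {}, {}
    for (s, r), p in joint.items():          # marginal distributions
        p_s[s] = p_s.get(s, 0.0) + p
        p_r[r] = p_r.get(r, 0.0) + p
    h_r_given_s = 0.0                         # H(R|S) = sum_s P(s) * H(R | S=s)
    for s, ps in p_s.items():
        cond = [p / ps for (s2, _), p in joint.items() if s2 == s]
        h_r_given_s += ps * entropy(cond)
    return entropy(p_r.values()) - h_r_given_s

# Invented joint distribution: rainy days are rarely sunny
joint = {("rain", "sun"): 0.05, ("rain", "cloud"): 0.35,
         ("dry",  "sun"): 0.45, ("dry",  "cloud"): 0.15}
print(mutual_information(joint))  # ~0.3 bits: knowing S reduces uncertainty about R
```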
Now I can introduce the structure of their formalism. Consider a system and environment interacting in discrete time. In the figure below, you can see how the system S and environment E influence each other at each time step. The state of the system at one time depends on its prior state and on its interaction with the environment in the prior step. Likewise, through its behavior, the system influences the future state of the environment. We can represent the system and the environment at time n probabilistically as random variables Sn and En, respectively. Each possible outcome of these distributions corresponds to a particular state of the system or environment.

Causal diagram of system-environment interaction [2]
The authors use mutual information measures and their causal diagram to quantify the information flows between system and environment. The predictability of a future state Sn+1 based on the prior states of the system and environment is given by I(Sn,En;Sn+1). This quantity can be decomposed in two different ways to put more weight on either the system or the environment in determining the future states of the system:

$$I(S_n, E_n; S_{n+1}) = I(S_{n+1}; S_n) + I(S_{n+1}; E_n \mid S_n)$$

$$I(S_n, E_n; S_{n+1}) = I(S_{n+1}; E_n) + I(S_{n+1}; S_n \mid E_n)$$
The first decomposition, roughly speaking, is useful for describing a self-determined system. The quantity I(Sn+1;Sn) acts like a measure of “autonomy” because it is high when most of the information determining the system’s future state is contained in its previous state. The quantity I(Sn+1;En|Sn) is then the remaining information flowing into the system from the environment. Conversely, the second decomposition highlights environmentally driven systems. Its first term, I(Sn+1;En), measures the amount of information in the system’s future state that comes from the environment, and I(Sn+1;Sn|En) measures the remaining influence of the system’s prior state on its future state. The quantity I(Sn+1;Sn|En) is an alternative measure of “autonomy” if one assumes that the system’s functioning depends greatly on the environment.

The terms of these decompositions are used to define two different “lenses”, each best suited for finding a different “type” of individual. The first, Organismal Individuality, is given by I(Sn+1;Sn), and it works well for individuals like cells and multicellular organisms that have well-adapted internal dependencies and a bounded structure. In other words, this measure is used when the system-environment distinction is clear and the system itself uses predictions of environmental states to guide its behavior. Animals like us humans fall into this class because our future state depends mostly on our prior states, we exert strong control over our environments through our tools, and we are constantly anticipating the future in deciding how to behave.

Second, they define Colonial Individuality, measured by I(Sn+1;Sn|En). The authors describe this lens as picking up “environmentally regulated aggregates.” As mentioned above, colonial individuality is an alternative measure of autonomy because it measures the remaining influence of the system given the environment. For this reason, it is more sensitive to “individuals” that we have reason to believe depend mostly on ongoing interaction with the environment to regulate their state. Social insects are quintessential examples of this kind of individual because colonies behave in tight feedback with their environment: their behavior at one moment alters the environment, and in the next moment the colony responds to that alteration. During foraging, individual ants function like sensors of the colony searching for food and, upon finding some, lay down pheromones in the environment as a consensus mechanism that allows the colony to extract the resource efficiently. Informationally speaking, the most important factor determining foraging behavior is the state of the environment: pheromone trails act like a distributed memory that exists in the environment rather than in the system of ants themselves.

The goal of this research is to provide a unifying framework for individuality in biology and to uncover different kinds of “individuals” empirically. In the case of social insects more specifically, the degree of collectivity varies remarkably between species, from solitary behavior to eusociality and everything in between. Under this informational framework, each species might be measured to have different degrees of organismal and colonial individuality. With an informational definition of individuality, we might also more easily understand how individuality at different scales interacts. For example, to what degree are the members of a colony more individuated than the colony itself?
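To see how the two lenses behave, here is a minimal sketch (again my own, not the authors’ code) that builds an invented toy joint distribution over (Sn, En, Sn+1), in which a binary system mostly copies its own previous state but is occasionally nudged by the environment, and then computes both decompositions. The transition probabilities are assumptions chosen purely for illustration.

```python
# A toy sketch of the two decompositions; the dynamics below are invented
# for illustration and are not from Krakauer et al. (2020).
import math
from itertools import product

def H(probs):
    """Shannon entropy in bits, treating 0*log(0) as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def marginal(joint, keep):
    """Marginalize a joint distribution down to the variable indices in `keep`."""
    out = {}
    for state, p in joint.items():
        key = tuple(state[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def cond_entropy(joint, target, given):
    """H(target | given) in bits, with `target` and `given` as index tuples."""
    p_given = marginal(joint, given)
    p_both = marginal(joint, given + target)
    return -sum(p * math.log2(p / p_given[key[:len(given)]])
                for key, p in p_both.items() if p > 0)

# Invented dynamics: S_{n+1} copies S_n with prob. 0.8, otherwise copies E_n,
# and (S_n, E_n) are assumed uniform and independent.
joint = {}
for sn, en, snext in product([0, 1], repeat=3):
    p_next = 0.8 * (snext == sn) + 0.2 * (snext == en)
    joint[(sn, en, snext)] = 0.25 * p_next

S, E, S1 = (0,), (1,), (2,)                  # positions of S_n, E_n, S_{n+1}
h_s1 = H(marginal(joint, S1).values())

total       = h_s1 - cond_entropy(joint, S1, S + E)  # I(S_n, E_n ; S_{n+1})
organismal  = h_s1 - cond_entropy(joint, S1, S)      # I(S_{n+1} ; S_n)
env_given_s = cond_entropy(joint, S1, S) - cond_entropy(joint, S1, S + E)
env_driven  = h_s1 - cond_entropy(joint, S1, E)      # I(S_{n+1} ; E_n)
colonial    = cond_entropy(joint, S1, E) - cond_entropy(joint, S1, S + E)

print(f"total predictive information  : {total:.3f} bits")
print(f"organismal + env-given-system : {organismal:.3f} + {env_given_s:.3f}")
print(f"environment-driven + colonial : {env_driven:.3f} + {colonial:.3f}")
```

Both decompositions sum to the same total predictive information; what differs between the lenses is how that information is apportioned between the organismal term I(Sn+1;Sn) and the colonial term I(Sn+1;Sn|En).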
However, it should not be expected that this is the final word on the subject of individuality, because an informational framework also has drawbacks. One problem is that even abiotic aggregates like hurricanes will register some information propagation into the future. All that said, I believe this work has the potential to help us find characteristic scales of organization in biology from within cells to ecosystems. If society has any hope of existing in a sustainable relationship to the rest of nature, we must have sophisticated means with which to make system-environment distinctions and understand their interdependencies.
Further Reading
Booth, Austin Greeley. “Essays on Biological Individuality.” DASH Home, 2014, dash.harvard.edu/handle/1/13070056.
Cepelewicz, Jordana. “What Is an Individual? Biology Seeks Clues in Information Theory.” Quanta Magazine, www.quantamagazine.org/what-is-an-individual-biology-seeks-clues-in-information-theory-20200716/.
Krakauer, David, et al. “The Information Theory of Individuality.” Theory in Biosciences, vol. 139, no. 2, 2020, pp. 209–223, doi:10.1007/s12064-020-00313-7.
Shannon, C. E. “A Mathematical Theory of Communication.” Bell System Technical Journal, vol. 27, no. 3, 1948, pp. 379–423, doi:10.1002/j.1538-7305.1948.tb01338.x.
West, Stuart A., et al. “Major Evolutionary Transitions in Individuality.” PNAS, National Academy of Sciences, 18 Aug. 2015, www.pnas.org/content/112/33/10112.
Wilson, Robert A., and Matthew J. Barker. “Biological Individuals.” Stanford Encyclopedia of Philosophy, Stanford University, 21 June 2019, plato.stanford.edu/entries/biology-individual/.
Media Credits
[1] Photo by Alex Wild. https://www.alexanderwild.com/Ants/Taxonomic-List-of-Ant-Genera/Lasius/i-Rv6Jr2L/buy
[2] Image from Krakauer et al. (2020), Figure 1. The Information Theory of Individuality