Form and Information in Biology—An Evolutionary Perspective

Engin Bermek

Foundations of Science

In this paper, I adopt the view that the form which is embodied in matter gives it its essence and converts it into substance (Aristotle). I furthermore understand information as the transmissible state of the form. Living beings as substances can create order in their environment adapted to their needs. The environment in turn has the potential to change the form as well as the other Aristotelian causes: matter, efficiency/functionality, and goal/intention. Living beings can internalize these changes, propagate them through replication, or share them as information with others. Through this process, living beings have progressively acquired advanced abilities to process and generate form and information. This positive feedback loop of enhancement in form and information has become one of the main drivers of biological evolution. Based on these considerations, I will address the nature of form and information and the changes that they have undergone during biological evolution.

Read the full article at: link.springer.com

Structural Robustness and Vulnerability of Networks

Alice C. Schwarze, Jessica Jiang, Jonny Wray, Mason A. Porter

Networks are useful descriptions of the structure of many complex systems. Unsurprisingly, it is thus important to analyze the robustness of networks in many scientific disciplines. In applications in communication, logistics, finance, ecology, biomedicine, and many other fields, researchers have studied the robustness of networks to the removal of nodes, edges, or other subnetworks to identify and characterize robust network structures. A major challenge in the study of network robustness is that researchers have reported that different and seemingly contradictory network properties are correlated with a network’s robustness. Using a framework by Alderson and Doyle (2010), we categorize several notions of network robustness and we examine these ostensible contradictions. We survey studies of network robustness with a focus on (1) identifying robustness specifications in common use, (2) understanding when these specifications are appropriate, and (3) understanding the conditions under which one can expect different notions of robustness to yield similar results. With this review, we aim to give researchers an overview of the large, interdisciplinary body of work on network robustness and develop practical guidance for the design of computational experiments to study a network’s robustness.
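A standard computational experiment of the kind the survey discusses removes nodes one by one (here, in order of decreasing degree, a "targeted attack") and tracks the size of the largest connected component as a robustness measure. A minimal, self-contained sketch; the toy graph and function names are illustrative, not from the paper:

```python
from collections import deque

def largest_component_size(adj, removed):
    """Size of the largest connected component of the graph after
    deleting the nodes in `removed` (adj: node -> list of neighbors)."""
    seen = set()
    best = 0
    for start in adj:
        if start in removed or start in seen:
            continue
        # breadth-first search from an unvisited surviving node
        size = 0
        queue = deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

def targeted_attack(adj, steps):
    """Remove the `steps` highest-degree nodes and record the largest
    component size before and after each removal."""
    order = sorted(adj, key=lambda n: (-len(adj[n]), n))
    removed = set()
    sizes = [largest_component_size(adj, removed)]
    for node in order[:steps]:
        removed.add(node)
        sizes.append(largest_component_size(adj, removed))
    return sizes

# toy graph: a 4-spoke hub (node 0) with a chain 4-5-6-7 attached
adj = {
    0: [1, 2, 3, 4],
    1: [0], 2: [0], 3: [0],
    4: [0, 5], 5: [4, 6], 6: [5, 7], 7: [6],
}
print(targeted_attack(adj, 2))  # → [8, 4, 3]
```

Removing the hub fragments the graph immediately, which is the kind of structural vulnerability such experiments are designed to expose; replacing the degree-based order with random choices would give the corresponding "random failure" baseline.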

Read the full article at: arxiv.org

What is entropy? by John C. Baez

Once there was a thing called Twitter, where people exchanged short messages called ‘tweets’. While it had its flaws, I came to like it and eventually decided to teach a short course on entropy in the form of tweets. This little book is a slightly expanded version of that course.
It’s easy to wax poetic about entropy, but what is it? I claim it’s the amount of information we don’t know about a situation, which in principle we could learn. But how can we make this idea precise and quantitative? To focus the discussion I decided to tackle a specific puzzle: why does hydrogen gas at room temperature and pressure have an entropy corresponding to about 23 unknown bits of information per molecule? This gave me an excuse to explain these subjects:
• information
• Shannon entropy and Gibbs entropy
• the principle of maximum entropy
• the Boltzmann distribution
• temperature and coolness
• the relation between entropy, expected energy and temperature
• the equipartition theorem
• the partition function
• the relation between entropy, free energy and expected energy
• the entropy of a classical harmonic oscillator
• the entropy of a classical particle in a box
• the entropy of a classical ideal gas.
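As a toy illustration of a few items on this list (not from the book), the sketch below builds the Boltzmann distribution for a hypothetical three-level system through its partition function and then computes its Shannon/Gibbs entropy in bits. The energies and temperature are made up, and units are chosen so that k_B = 1:

```python
import math

def boltzmann(energies, T):
    """Boltzmann distribution p_i proportional to exp(-E_i / T),
    in units with k_B = 1. Returns (probabilities, partition function)."""
    weights = [math.exp(-E / T) for E in energies]
    Z = sum(weights)  # partition function normalizes the weights
    return [w / Z for w in weights], Z

def shannon_bits(p):
    """Shannon entropy in bits: -sum_i p_i log2 p_i."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# hypothetical three-level system with energies 0, 1, 2 at temperature T = 1
p, Z = boltzmann([0.0, 1.0, 2.0], T=1.0)
print(round(shannon_bits(p), 3))
```

Raising T flattens the distribution toward uniform, where the entropy reaches its maximum of log2(3) ≈ 1.585 bits, which is the maximum-entropy principle in miniature.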

Read the full book at: math.ucr.edu

Augmenting the availability of historical GDP per capita estimates through machine learning

Philipp Koch, Viktor Stojkoski, and César A. Hidalgo

PNAS 121 (39) e2402060121

The scarcity of historical GDP per capita data limits our ability to explore questions of long-term economic development. Here, we introduce a machine learning method using detailed data on famous biographies to estimate the historical GDP per capita of hundreds of regions in Europe and North America. Our model generates accurate out-of-sample estimates (R² = 90%) that quadruple the availability of historical GDP per capita data and correlate positively with proxies of economic output such as urbanization, body height, well-being, and church building activity. We use these estimates to reproduce the reversal of fortunes experienced by southern and northern Europe and the historical role played by Atlantic ports. These findings show that machine learning can effectively augment the historical availability of economic data.
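The headline statistic, out-of-sample R², is the coefficient of determination computed on held-out data the model never saw during training. A minimal sketch of the calculation with made-up numbers, not the paper's data or model:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean) ** 2 for y in y_true)   # variance around the mean
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))  # residuals
    return 1 - ss_res / ss_tot

# hypothetical held-out log GDP per capita values vs. model predictions
y_true = [7.0, 7.4, 8.1, 8.5, 9.2]
y_pred = [7.1, 7.3, 8.0, 8.6, 9.1]
print(round(r_squared(y_true, y_pred), 3))
```

An R² of 90% means the model's predictions account for 90% of the variance in the held-out GDP per capita values, relative to simply predicting the mean.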

Read the full article at: www.pnas.org

Typicality, entropy and the generalization of statistical mechanics

Bernat Corominas-Murtra, Rudolf Hanel & Petr Jizba

EPJ B Volume 97, article number 129, (2024)

When at equilibrium, large-scale systems obey conventional thermodynamics because they belong to microscopic configurations (or states) that are typical. Crucially, the typical states usually represent only a small fraction of the total number of possible states, and yet the characterization of the set of typical states—the typical set—alone is sufficient to describe the macroscopic behavior of a given system. Consequently, the concept of typicality and the associated Asymptotic Equipartition Property allow for a drastic reduction of the degrees of freedom needed for a system’s statistical description. The mathematical rationale for such a simplification in the description is due to the phenomenon of concentration of measure. The latter emerges for equilibrium configurations thanks to very strict constraints on the underlying dynamics, such as weakly interacting and (almost) independent system constituents. The question naturally arises as to whether the concentration of measure and related typicality considerations can be extended and applied to more general complex systems, and if so, what mathematical structure can be expected in the ensuing generalized thermodynamics. In this paper, we illustrate the relevance of the concept of typicality in the toy model context of the “thermalized” coin and show how this leads naturally to Shannon entropy. We also show an intriguing connection: The characterization of typical sets in terms of Rényi and Tsallis entropies naturally leads to the free energy and partition function, respectively, and makes their relationship explicit. Finally, we propose potential ways to generalize the concept of typicality to systems where the standard microscopic assumptions do not hold.
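The coin setting from the abstract can be demonstrated by brute force: for a biased coin, a sequence of flips is eps-typical when its per-flip surprisal is within eps of the Shannon entropy, and a small fraction of all 2^n sequences then carries almost all of the probability mass. A toy sketch, with illustrative parameters chosen by us rather than taken from the paper:

```python
import math
from itertools import product

def typical_mass(p_heads, n, eps):
    """Enumerate all 2^n sequences of n flips of a coin with
    P(heads) = p_heads. Return (fraction of sequences that are
    eps-typical, total probability mass they carry)."""
    # Shannon entropy per flip, in bits
    H = -(p_heads * math.log2(p_heads)
          + (1 - p_heads) * math.log2(1 - p_heads))
    count, mass = 0, 0.0
    for seq in product([0, 1], repeat=n):
        k = sum(seq)  # number of heads
        prob = p_heads ** k * (1 - p_heads) ** (n - k)
        # typical if per-flip surprisal is within eps of the entropy rate
        if abs(-math.log2(prob) / n - H) < eps:
            count += 1
            mass += prob
    return count / 2 ** n, mass

frac, mass = typical_mass(0.8, n=16, eps=0.3)
print(round(frac, 3), round(mass, 3))
```

For these parameters, roughly a tenth of the sequences carry almost 90% of the mass, and the single most probable sequence (all heads) is not typical: a small-scale glimpse of the concentration of measure that underlies the Asymptotic Equipartition Property.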

Read the full article at: link.springer.com