Month: April 2025

A Test for Life Versus Non-Life

Carl Zimmer

For generations, physicists have puzzled over life. Their theories about matter and energy have helped them understand how the universe produced galaxies and planets. But physicists have struggled to understand how lifeless chemical reactions give rise to the complexity stored in our cells.

In a new book, “Life as No One Knows It: The Physics of Life’s Emergence,” out on Aug. 6, Sara Walker, a physicist at Arizona State University, offers a theory that she and her colleagues believe can make sense of life. Assembly theory, as they call it, looks at everything in the universe in terms of how it was assembled from smaller parts. Life, the scientists argue, emerges when the universe hits on a way to make exceptionally intricate things.

The book arrives at an opportune time, as assembly theory has attracted both praise and criticism in recent months. Dr. Walker argues that the theory holds the potential to help identify life on other worlds. And it may allow scientists like her to create life from scratch.

“I actually think alien life will be discovered in the lab first,” Dr. Walker said in an interview.

Read the full article at: www.nytimes.com

Book Review of “Life as No One Knows It: The Physics of Life’s Emergence”

Hector Zenil

Sara Walker’s Life as No One Knows It arrives on the heels of extensive media coverage and promotional efforts that have catapulted it into bestseller status. I approached this book with a sense of anticipation, especially eager to explore her ideas on algorithmic probability and open-endedness, topics we briefly worked on together [1]. These areas of research are foundational to understanding life’s complexity and origins, and I had expected Walker’s book to delve into these subjects with depth and originality.

However, the book surprised me for other reasons, and unfortunately not in a positive way. Rather than presenting her own work, much of the book focuses on the ideas of Leroy (Lee) Cronin, a chemist whose assembly theory (AT) has met with significant skepticism and criticism in the scientific community. The central thesis of AT is that the ability of life to make numerous copies of itself, or to utilize multiple copies of the resources it requires, is the defining feature of living systems. This concept, quantified through an “assembly index,” proposes that life’s complexity can be reduced to the mere counting of such copies, an idea that has been considered, and refuted, many times before.
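The assembly-index idea itself is easy to make concrete with a toy computation. In AT the index is defined over molecular bond graphs, but for short strings a simplified analogue (my own illustration, not the theory's formal definition) is the minimum number of join operations needed to build a target, where every previously built fragment may be reused:

```python
def assembly_index(target: str) -> int:
    """Minimum number of join (concatenation) steps needed to build
    `target` from single characters, reusing any fragment already
    built. Brute-force search; fine only for short strings."""
    if len(target) <= 1:
        return 0
    best = len(target) - 1  # always achievable: append one character at a time

    def search(pool: frozenset, depth: int) -> None:
        nonlocal best
        if depth + 1 >= best:  # one more join cannot beat the best found so far
            return
        for a in pool:
            for b in pool:
                new = a + b
                if new == target:
                    best = depth + 1
                elif new in target and new not in pool:
                    # only contiguous substrings of the target can be
                    # useful intermediates under pure concatenation
                    search(pool | {new}, depth + 1)

    search(frozenset(target), 0)
    return best

# Repetitive objects are cheap to assemble; that reuse is what AT counts.
print(assembly_index("ABCD"))  # no reuse possible: 3 joins
print(assembly_index("ABAB"))  # AB, then AB+AB: 2 joins
```

Exact computation of such indices is combinatorially hard in general, which is part of why the measure, and what it does or does not capture about complexity, is contested.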

Cronin’s theory specifically has been disproven by multiple research groups [2,3,4], and the scientific merit of its approaches remains highly questionable. Walker, rather than scrutinizing or distancing herself from these ideas, devotes much of her book to promoting them without acknowledging the criticisms and counter-evidence.

Read the full article at: www.computingreviews.com

Tissue-like multicellular development triggered by mechanical compression in archaea

THEOPI RADOS, et al.

SCIENCE 3 Apr 2025 Vol 388, Issue 6742 pp. 109-115

The advent of clonal multicellularity is a critical evolutionary milestone, seen often in eukaryotes, rarely in bacteria, and only once in archaea. We show that uniaxial compression induces clonal multicellularity in haloarchaea, forming tissue-like structures. These archaeal tissues are mechanically and molecularly distinct from their unicellular lifestyle, mimicking several eukaryotic features. Archaeal tissues undergo a multinucleate stage followed by tubulin-independent cellularization, orchestrated by active membrane tension at a critical cell size. After cellularization, tissue junction elasticity becomes akin to that of animal tissues, giving rise to two cell types—peripheral (Per) and central scutoid (Scu) cells—with distinct actin and protein glycosylation polarity patterns. Our findings highlight the potential convergent evolution of a biophysical mechanism in the emergence of multicellular systems across domains of life.

Read the full article at: www.science.org

Escalation dynamics and the severity of wars

Aaron Clauset, Barbara F. Walter, Lars-Erik Cederman, Kristian Skrede Gleditsch

Although very large wars remain an enduring threat in global politics, we lack a clear understanding of how some wars become large and costly, while most do not. There are three possibilities: large conflicts start with and maintain intense fighting, they persist over a long duration, or they escalate in intensity over time. Using detailed within-conflict data on civil and interstate wars 1946–2008, we show that escalation dynamics — variations in fighting intensity within an armed conflict — play a fundamental role in producing large conflicts and are a generic feature of both civil and interstate wars. However, civil wars tend to deescalate when they become very large, limiting their overall severity, while interstate wars exhibit a persistent risk of continual escalation. A non-parametric model demonstrates that this distinction in escalation dynamics can explain the differences in the historical sizes of civil vs. interstate wars, and explain Richardson’s Law governing the frequency and severity of interstate conflicts over the past 200 years. Escalation dynamics also drive enormous uncertainty in forecasting the eventual sizes of both hypothetical and ongoing civil wars, indicating a need to better understand the causes of escalation and deescalation within conflicts. The close relationship between the size, and hence the cost, of an armed conflict and its potential for escalation has broad implications for theories of conflict onset or termination and for risk assessment in international relations.
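Richardson’s Law, referenced in the abstract, is the empirical observation that war severities follow a heavy-tailed power-law distribution. A minimal sketch of how such a tail exponent is typically estimated, using the continuous maximum-likelihood (Hill) estimator on synthetic data (the exponent 1.7 and the threshold below are illustrative values, not the paper’s fitted results):

```python
import math
import random

def powerlaw_sample(alpha: float, xmin: float, n: int, seed: int = 0) -> list:
    """Inverse-transform samples from p(x) ~ x^(-alpha) for x >= xmin."""
    rng = random.Random(seed)
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def mle_alpha(xs: list, xmin: float) -> float:
    """Continuous maximum-likelihood (Hill) estimate of the tail exponent."""
    tail = [x for x in xs if x >= xmin]
    return 1 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic "war severities": in a heavy tail the few largest events
# dominate the total toll, which is why escalation dynamics matter.
severities = powerlaw_sample(alpha=1.7, xmin=1000.0, n=50_000)
print(round(mle_alpha(severities, 1000.0), 2))
```

The heavy tail is also why the abstract reports enormous forecast uncertainty: small differences in escalation behavior translate into orders-of-magnitude differences in eventual size.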

Read the full article at: arxiv.org

Uncertainty quantification and posterior sampling for network reconstruction

Tiago P. Peixoto

Network reconstruction is the task of inferring the unseen interactions between elements of a system, based only on their behavior or dynamics. This inverse problem is in general ill-posed, and admits many solutions for the same observation. Nevertheless, the vast majority of statistical methods proposed for this task — formulated as the inference of a graphical generative model — can only produce a “point estimate,” i.e. a single network considered the most likely. In general, this can give only a limited characterization of the reconstruction, since uncertainties and competing answers cannot be conveyed, even if their probabilities are comparable, while being structurally different. In this work we present an efficient MCMC algorithm for sampling from posterior distributions of reconstructed networks, which is able to reveal the full population of answers for a given reconstruction problem, weighted according to their plausibilities. Our algorithm is general, since it does not rely on specific properties of particular generative models, and is specially suited for the inference of large and sparse networks, since in this case an iteration can be performed in time O(N log²N) for a network of N nodes, instead of O(N²), as would be the case for a more naive approach. We demonstrate the suitability of our method in providing uncertainties and consensus of solutions (which provably increases the reconstruction accuracy) in a variety of synthetic and empirical cases.
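The gap between a point estimate and posterior sampling can be seen on a deliberately tiny toy model (this is not Peixoto’s algorithm or model — just a generic Metropolis sampler over edges, under the assumed observation model that each measured entry matches the true edge with probability p_true, and each edge exists a priori with probability p_prior):

```python
import math
import random

def sample_edge_marginals(obs, p_true=0.9, p_prior=0.2,
                          steps=20_000, seed=1):
    """Metropolis sampler over directed adjacency matrices for a toy
    model: obs[i][j] equals the true edge value with probability p_true,
    and each edge is present a priori with probability p_prior.
    Returns the posterior marginal probability of every edge."""
    rng = random.Random(seed)
    n = len(obs)
    A = [[0] * n for _ in range(n)]       # current state of the chain
    counts = [[0] * n for _ in range(n)]  # steps spent with each edge present

    def log_post(i, j, a):
        # unnormalized log-posterior contribution of edge (i, j) in state a
        like = p_true if obs[i][j] == a else 1 - p_true
        prior = p_prior if a == 1 else 1 - p_prior
        return math.log(like) + math.log(prior)

    for _ in range(steps):
        # propose toggling one uniformly chosen off-diagonal edge
        i = rng.randrange(n)
        j = rng.randrange(n - 1)
        if j >= i:
            j += 1
        flipped = 1 - A[i][j]
        if math.log(rng.random()) < log_post(i, j, flipped) - log_post(i, j, A[i][j]):
            A[i][j] = flipped
        for u in range(n):
            for v in range(n):
                counts[u][v] += A[u][v]

    return [[c / steps for c in row] for row in counts]

# With one observed edge, the sampler reports it present only ~69% of
# the time; a single "most likely" network would hide that uncertainty.
marginals = sample_edge_marginals([[0, 1], [0, 0]])
```

In this toy model the edges are independent, so the marginals are analytically checkable; the point of the paper’s method is to do this at scale, for realistic generative models where edges are strongly coupled and no closed form exists.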

Read the full article at: arxiv.org