1. Mathematics as the Language of Nature
At the very beginning of modern science, in the 17th century, Galileo Galilei wrote in his book Il Saggiatore (The Assayer) [1]:
“Natural Philosophy is written in this great book that is constantly open before our eyes (I say the universe), but it cannot be understood without first learning to understand the language and to know the characters in which it is written. It is written in mathematical language, and the characters are triangles, circles, and other geometrical figures, without which it is impossible to humanly understand a single word; without these, it is a vain wandering in a dark labyrinth”.
Who wrote this book? Galileo was convinced that God was the author. In the subsequent centuries, other possibilities were proposed, but what remained almost universally accepted among scientists was that mathematics is the “natural” language of science. This conviction shaped the trajectory of scientific progress: different fields adopted mathematics to different degrees, from the a posteriori statistical evidence of biological phenomena to the abstract formalisms of theoretical physics, yet always under the same guiding principle that natural phenomena could be described once translated into a formal mathematical syntax. The key implication of this view was the rigorous separation between ‘syntax’ and ‘semantics’. Mathematics was treated as a neutral syntax, a ‘formal system of signs and rules’ capable of producing ‘logically consequent statements’ independent of meaning, while semantics, the meaning and function of natural phenomena, was seen as an independent reality. Nature became legible only after being recast into mathematical syntax, and meaning was something derived afterwards. This methodological stance was extraordinarily powerful, allowing prediction, generalization, and mechanistic explanation, but at the same time it constrained the way science conceived of the relationship between matter and meaning. It reinforced the idea that matter itself was mute, waiting for an external syntax to give it intelligibility [2].
2. Reduction of Semantics to Syntax in Science
The reduction of semantics to syntax became particularly pronounced in the first half of the 20th century. It was driven by two converging developments: the advent of automatic computation and the discovery of the genetic code. The computational paradigm transformed continuous mathematical entities into discrete sequences of ‘0’s and ‘1’s, reducing complex phenomena to binary operations. In parallel, the diversity of biological forms and functions was increasingly interpreted as the ‘readout’ of discrete sequences encoded in the four nucleotides of DNA. This perspective suggested a deterministic mapping from genotype to phenotype: the sequence of characters, in principle, entirely generated the observable spectrum of biological phenomena. Within this framework, the distinction between syntax and semantics appeared blurred: the DNA sequence (syntax) was assumed to contain all the instructions necessary to produce the features of life (semantics) [2], while the unfolding of these features could be explained entirely through mechanistic and stochastic principles such as mutation and natural selection. ‘Chance’, filtered by the survival-of-the-fittest principle, became the author of ‘the great book’, and living systems were perceived as the passive expressions of prewritten sequences.
Yet this reductionist view was challenged even before the completion of the Human Genome Project. It became increasingly evident that a deterministic one-to-one relation between genotype and phenotype does not exist; the same genotype can give rise to multiple phenotypes depending on environmental, epigenetic, and stochastic factors. While biologists and clinicians had long recognized this complexity, it was largely ignored by the dominant geno-centric paradigm [3]. The post-genomic era has therefore underscored the irreducibility of life to DNA sequences alone: phenotype emerges not solely from a static code but from the dynamic interplay between sequences, molecular networks, cellular contexts, and environmental inputs.
This historical shift in biology parallels developments in artificial intelligence (AI). In the late 20th century, AI research experienced a profound transition from the notion of intelligence as a strict symbolic manipulation of discrete rules, the so-called “Symbolic AI”, to the connectionist, sub-symbolic paradigm. Here, intelligence is no longer imposed externally through syntax, but emerges spontaneously from the interactions among numerous simple processing units. As Paul Smolensky somewhat prophetically noted [4],
“mental categories, and frames or schemata turn out to provide approximate descriptions of the coarse-grained behaviour of connectionist systems.”
Again, in Smolensky’s words [4], ‘Connectionist AI systems are large networks of extremely simple numerical processors, massively interconnected and running in parallel’. In other words, sufficiently complex networked systems can give rise to behaviors and properties that we associate with intelligence, without pre-specified symbolic rules.
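Smolensky’s observation, that structured behavior can emerge from many simple interconnected numerical units, can be made concrete with a toy Hopfield network, a classic connectionist model in which “memories” are stored as attractors of the network dynamics rather than as explicit symbolic rules. The minimal sketch below (in Python; the patterns and network size are arbitrary illustrative choices, not drawn from the text) shows a corrupted cue being completed to a stored pattern purely through the relaxation dynamics:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: weights are sums of outer products of stored patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Synchronous sign updates; the state relaxes toward a stored attractor."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Two stored "memories" (+1/-1 patterns over 8 units)
patterns = np.array([
    [1, 1, 1, 1, -1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1, 1, -1],
])
W = train_hopfield(patterns)

# A corrupted cue (two bits flipped) is completed to the nearest stored memory
cue = np.array([1, 1, -1, 1, -1, -1, 1, -1])
recovered = recall(W, cue.astype(float))
```

No rule for “pattern completion” is ever written down: the behavior emerges from the weight matrix and the update dynamics alone, which is precisely the sub-symbolic point.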
Thus, both in biology and in computation, we observe a fundamental departure from syntax-dominated thinking. The recognition that complex networks (whether of genes, proteins, natural or artificial neurons) can exhibit emergent, adaptive, and context-dependent behavior lays the foundation for the notion of an “Intelligence of Matter” [5,6]. Matter, organized in multi-scale networks, is no longer passive; it can process information, adapt, and, in a minimal sense, “learn” from experience.
3. The Geno-Centric Illusion and Its Unraveling
The multi-level organization of biological entities as “networks-of-networks” is self-evident. Proteins interact among themselves to give rise to organized metabolism, yet each protein (a single node in such a network) is itself a network of amino-acid residues, whose coordinated motions allow systemic behaviors such as allostery. The same principle extends across the entire spectrum of organizational scales: from cells to tissues, organs, and ecological systems [7]. Interactions at one level give rise to emergent behaviors at higher levels. This nested architecture suggests that manifestations of intelligence are not out of reach for matter itself.
Such a network perspective challenges the traditional view of information flow codified in the “central dogma” of molecular biology, i.e., a linear, unidirectional flux from DNA to RNA to proteins (and phenotypes). The rise of epigenetics and epitranscriptomics underscores that information in living systems is non-linear and context-dependent. Epigenetics, for instance, explores heritable changes in gene expression that occur without altering the underlying DNA sequence. Chemical modifications to the DNA or associated histone proteins act as instructions, turning genes “on” or “off” and controlling cellular behavior [8]. Importantly, these modifications are influenced by environmental and lifestyle factors [9], can persist through cell division, and in some cases are inherited across generations [10], providing a form of cellular memory.
The term “epigenetics” itself, coined by the developmental biologist Conrad Waddington in 1942, reflected this networked view as the “whole complex of developmental processes” that lie between “genotype and phenotype”. He described development as a concatenation of processes linked together, such that disturbances at early stages could cascade into widespread changes [11]. Interest in this field rose strongly only in the 21st century [12]. Modern epigenetics extends Waddington’s vision and elucidates how dynamic chemical tags, installed by “writers” (e.g., DNA methyltransferases, histone acetyltransferases), interpreted by “readers” (e.g., bromodomain-containing proteins), and removed by “erasers” (e.g., histone deacetylases, demethylases), regulate gene expression in an adaptive, context-sensitive manner [13]. Transgenerational inheritance of these marks illustrates that cellular systems not only respond to environmental stimuli but also integrate these experiences into their configuration, effectively “remembering” past events [14]. Epigenetic modifications and the associated memory are central to the development of multicellular organisms [15]; it is therefore not surprising that the aberrant cellular memory originating from the deregulation and disruption of epigenetic modification processes is linked to the development of various diseases, including cancer [16].
In 2012, a related dynamic phenomenon of post-transcriptional chemical modification of nucleic acids was termed epitranscriptomics (also known as RNA epigenetics) [17]. Similar to the epigenetic modifications of DNA, epitranscriptomic modifications are controlled by the interplay of specific RNA-modifying enzymes and RNA-interacting proteins (‘writers’, e.g., the METTL3/METTL14 complex for m6A methylation; ‘erasers’, e.g., the fat mass and obesity-associated protein (FTO); and ‘readers’, such as the YTH-domain proteins) [18]. These modifications are triggered by environmental cues and can be inherited across generations [19]; they therefore represent another form of cellular memory.
Both epigenetics and epitranscriptomics involve a fine-tuned set of interactions among different molecular players in order for long-lasting adaptive phenomena to emerge, but even single protein molecules exhibit behaviors consistent with minimal intelligence [18]. Such molecular intelligence, long observed in allostery, signal transduction, and enzyme regulation, is now being harnessed in the nascent field of Intelligent Soft Matter (ISM). ISM sits at the intersection of materials science, physics, and cognitive science, aiming to create materials with life-like capabilities: perception, learning, memory, and adaptive behavior [20,21]. Drawing inspiration from biological systems, these materials use the intrinsic flexibility and responsiveness of soft matter to perform functions akin to cognitive processes [19]. By studying and engineering matter that can “compute” through its own dynamics, ISM exemplifies how intelligence may emerge not only in living systems but also in non-living, physical substrates when they reach sufficient complexity to support multiple equilibrium states.
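As a cartoon of this last point, consider the simplest possible “material” with multiple equilibrium states: an overdamped degree of freedom relaxing in a double-well potential V(x) = (x² − 1)²/4. The sketch below is illustrative Python, not a model taken from the ISM literature; the potential, drive, and time scales are all assumptions made for the example. It shows how a transient stimulus is “remembered” by which stable state the system’s own dynamics settle into:

```python
import numpy as np

def relax(x, steps=2000, dt=0.01, drive=0.0):
    """Overdamped dynamics dx/dt = -dV/dx + drive for V(x) = (x^2 - 1)^2 / 4.

    With drive = 0 there are two stable equilibria (x = -1 and x = +1)
    separated by an unstable one at x = 0: a one-bit physical memory.
    """
    for _ in range(steps):
        x += dt * (x - x**3 + drive)   # Euler step of the relaxation dynamics
    return x

# Start at the unstable equilibrium, apply a brief positive stimulus...
x = 0.0
x = relax(x, steps=200, drive=0.5)   # transient input pushes the state into one basin
x = relax(x, drive=0.0)              # stimulus removed: the dynamics take over
# ...and the "material" remembers the stimulus by holding the x = +1 state.
```

After the stimulus is removed, the state is held near x = +1 by the relaxation dynamics alone; a transient negative drive would instead leave it near x = −1. The stored bit is a property of the dynamics, not of any externally imposed code.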
4. Intelligence Emerging from Networks of Matter
The concept of “epigenetic information flux” can be considered a minimal form of intelligence. The system does not merely react passively to environmental perturbations; it integrates the “experience” of a stimulus into its configuration, creating a form of cellular memory. For example, when cells encounter stress, chemical modifications to DNA or histones are established, retained, and sometimes propagated, which allows the cell to respond more effectively to similar future events. This adaptive memory is a functional hallmark of intelligence: the system learns from past events to modulate future behavior. Similarly, the epitranscriptomic information flux contributes to minimal intelligence. Post-transcriptional modifications of RNA, introduced in response to environmental cues, can persist within the cell and, in some cases, be inherited across generations. These modifications influence gene expression dynamically and contextually, forming a network of information that captures temporal and environmental history.
Beyond nucleic acids, single proteins also exhibit this minimal intelligence. Proteins often contain a combination of structured domains and Intrinsically Disordered Regions (IDRs), which together provide a balance between stability and flexibility. This structural duality enables proteins to respond adaptively to molecular cues, maintain functional identity while remaining sensitive to perturbations, and transmit information across molecular networks [22]. Allosteric regulation exemplifies this principle: binding at one site induces conformational changes that propagate through the molecule, modulating activity at distant sites in a coordinated manner. In essence, the protein itself “computes” information about its environment and history.
The shift from viewing matter as a passive substrate to recognizing its capacity for minimal intelligence transforms our understanding of computation in natural sciences. Computation is no longer confined to a ‘formal list of signs’ that is, per se, devoid of any meaning. It emerges from the dynamic interactions within networks of molecules, cells, or materials. Matter, in this view, is not silent: it processes, stores, and responds to information, exhibiting adaptive behaviors that parallel learning and memory.
5. Ecosystems as Computational Reservoirs
This emergent perspective becomes even more striking at the level of ecosystems [23]. Recent studies suggest that ecosystems process information in ways that resemble computation, using intrinsic dynamics to encode, transform, and respond to temporal data. In this context, the concept of Reservoir Computing (RC) provides a powerful framework. Unlike conventional artificial neural networks, which require extensive training of internal weights, RC harnesses the innate, transient dynamics of a complex medium, the “reservoir”, to map input signals into rich internal states [23]. These states can then be read out to produce meaningful outputs, effectively encoding the history and context of environmental perturbations.
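This readout-over-dynamics scheme can be sketched with an echo state network, the standard RC architecture: a fixed random recurrent “reservoir” is driven by an input stream, and only a linear readout is trained, here by ridge regression, to recall the input from a few steps earlier (a minimal memory task). All sizes and scalings below are illustrative assumptions, not parameters from reference [23]:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, delay = 100, 2000, 5                 # reservoir size, time steps, recall lag

# Fixed random reservoir: its internal weights are never trained.
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9 (echo state property)

u = rng.uniform(-1, 1, size=T)             # random input stream
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):                         # drive the reservoir with the input
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Train only the linear readout (ridge regression) to output u(t - delay),
# discarding an initial washout period of 200 steps.
X, y = states[200:], u[200 - delay:T - delay]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

nrmse = np.sqrt(np.mean((X @ W_out - y) ** 2)) / np.std(y)  # small => good recall
```

The reservoir itself is untouched by learning: the memory of past inputs lives in its transient dynamics, and training only decides how that memory is read out, the same division of labor the text ascribes to environmental reservoirs.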
In ecosystems, RC manifests as Environmental Reservoir Computing (ERC). Here, the collective interactions among organisms, abiotic factors, and chemical cycles act as a natural reservoir, integrating information over time and responding adaptively to external stimuli. The intrinsic dynamics of the reservoir allow the system to retain memory, anticipate recurring perturbations, and optimize responses without centralized control or pre-specified algorithms. In this sense, the ecosystem itself performs computation, using its material and organizational properties to generate meaningful representations of its environment.
This principle represents a profound inversion of the traditional reductionist paradigm. In conventional science, semantics is derived from syntax: meaning is imposed upon matter through formal models, codes, or equations. In RC and ERC, semantics dominates syntax: the intrinsic organization and dynamics of the system generate adaptive, context-sensitive behavior, while formal representations serve only as an interpretive layer. Computation emerges not from abstract symbols but from self-organizing processes far from equilibrium, where the material properties of the system and networked interactions encode and process information.
From the perspective of scientific methodology, ERC and RC are distant from Galileo’s “great book” in form, yet they are aligned in spirit: both seek quantitative, intelligible representations of nature. The difference is conceptual: rather than imposing an external formal language onto matter, we now recognize that matter itself embodies computational and adaptive capacities. By studying these innate properties, we approach a deeper understanding of the natural language of matter, discovering that intelligence is not an abstract imposition but a property emerging from complex organization and dynamic interaction.
6. Conclusions: Toward a New Style of Doing Science
In this trajectory, the notion of an “Intelligence of Matter” ceases to be metaphorical and becomes an epistemic category for science itself. From the scale of single proteins that exhibit allosteric adaptability, to epigenetic networks encoding cellular memory, to ecosystems performing reservoir-like computation, we observe a unifying principle: matter, when organized in complex networks, expresses adaptive behaviors that cannot be reduced to syntax alone. These behaviors include memory, anticipation, context sensitivity, and plasticity, all features we traditionally ascribe to intelligence. This view reorients the scientific enterprise. If Galileo’s great book was written in mathematical symbols, the emerging perspective suggests that these symbols are not imposed from the outside but are already inscribed within the dynamics of matter. Mathematics here becomes not merely a tool of description but a lens through which we attempt to translate the computational and semantic capacities of physical and biological systems. In other words, the task of science shifts from reducing semantics to syntax, toward decoding the intrinsic semantics of matter itself.
Such a shift carries profound implications. First, it challenges the lingering reductionist dogma in biochemistry, genetics, neuroscience, and materials science, and encourages us to consider information processing as an emergent property of organization rather than a pre-programmed sequence of instructions. Second, it invites the development of new methodologies that bridge formal models with material intelligence: intelligent soft matter, adaptive biomolecular networks, or ecological computing may become laboratories where the cognitive capacities of matter are experimentally harnessed. Finally, it opens a philosophical horizon in which “intelligence” is no longer the prerogative of human-designed machines or biological nervous systems but a general property of organized matter far from equilibrium.
The rise of the intelligence-of-matter perspective may indeed represent not just a new field but a new style of doing science. It requires us to move beyond the symbolic manipulation of detached models and to engage with matter as an active participant in knowledge production. If successful, this style will not only enrich our understanding of life, cognition, and complexity but may also redefine the epistemological foundations of science, bringing us closer (paradoxically!) to Galileo’s vision: to read the great book of nature, now understood as written not only in symbols, but in the very dynamics of matter itself.
Author Contributions
Conceptualization, T.T., V.N.U., and A.G.; methodology, T.T., V.N.U., and A.G.; validation, T.T., V.N.U., and A.G.; data analysis, T.T., V.N.U., and A.G.; investigation, T.T., V.N.U., and A.G.; data curation, T.T., V.N.U., and A.G.; writing—original draft preparation, T.T., V.N.U., and A.G.; writing—review and editing, T.T., V.N.U., and A.G. All authors have read and agreed to the published version of the manuscript.
Funding
TT acknowledges the Indian National Science Academy (INSA), New Delhi, India, for the INSA Associate Fellowship.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Galilei, G. Il Saggiatore (Appresso Giacomo Mascardi, Roma, 1623).
- Longo, G. The Systemic Unity in Mathematics and Science: Beyond Techno-Science Myths. Systems 13, 136 (2025). [CrossRef]
- Noble, D. A theory of biological relativity: no privileged level of causation. Interface Focus 2, 55–64 (2012). [CrossRef]
- Smolensky, P. Connectionist AI, symbolic AI, and the brain. Artif Intell Rev 1, 95–109 (1987). [CrossRef]
- Shang, Y. et al. Ultra-lightweight compositionally complex alloys with large ambient-temperature hydrogen storage capacity. Materials Today 67, 113–126 (2023). [CrossRef]
- Kaspar, C., Ravoo, B. J., van der Wiel, W. G., Wegner, S. V. & Pernice, W. H. P. The rise of intelligent matter. Nature 594, 345–355 (2021). [CrossRef]
- Uversky, V. N. & Giuliani, A. Networks of Networks: An Essay on Multi-Level Biological Organization. Front Genet 12, 706260 (2021). [CrossRef]
- Dupont, C., Armant, D. R. & Brenner, C. A. Epigenetics: definition, mechanisms and clinical perspective. Semin Reprod Med 27, 351–357 (2009). [CrossRef]
- Deans, C. & Maggert, K. A. What do you mean, ‘epigenetic’? Genetics 199, 887–896 (2015).
- Lacal, I. & Ventura, R. Epigenetic Inheritance: Concepts, Mechanisms and Perspectives. Front Mol Neurosci 11, 292 (2018). [CrossRef]
- Waddington, C. H. The epigenotype. 1942. Int J Epidemiol 41, 10–13 (2012).
- Deichmann, U. Epigenetics: The origins and evolution of a fashionable topic. Dev Biol 416, 249–254 (2016). [CrossRef]
- Biswas, S. & Rao, C. M. Epigenetic tools (The Writers, The Readers and The Erasers) and their implications in cancer therapy. Eur J Pharmacol 837, 8–24 (2018). [CrossRef]
- Bove, G., Del Gaudio, N. & Altucci, L. Epitranscriptomics and epigenetics: two sides of the same coin? Clin Epigenetics 16, 121 (2024). [CrossRef]
- Henikoff, S. & Greally, J. M. Epigenetics, cellular memory and gene regulation. Curr Biol 26, R644-648 (2016). [CrossRef]
- Moosavi, A. & Motevalizadeh Ardekani, A. Role of Epigenetics in Biology and Human Diseases. Iran Biomed J 20, 246–258 (2016).
- Saletore, Y. et al. The birth of the Epitranscriptome: deciphering the function of RNA modifications. Genome Biol 13, 175 (2012). [CrossRef]
- Esteve-Puig, R., Bueno-Costa, A. & Esteller, M. Writers, readers and erasers of RNA modifications in cancer. Cancer Lett 474, 127–137 (2020). [CrossRef]
- Liebers, R., Rassoulzadegan, M. & Lyko, F. Epigenetic regulation by heritable RNA. PLoS Genet 10, e1004296 (2014). [CrossRef]
- Liu, K., Tebyetekerwa, M., Ji, D. & Ramakrishna, S. Intelligent Materials. Matter 3, 590–593 (2020).
- Baulin, V. A. et al. Intelligent soft matter: towards embodied intelligence. Soft Matter 21, 4129–4145 (2025). [CrossRef]
- Tripathi, T., Uversky, V. N. & Giuliani, A. ‘Intelligent’ proteins. Cell Mol Life Sci 82, 239 (2025).
- Chiolerio, A., Konkoli, Z. & Adamatzky, A. Ecosystem-based reservoir computing. Hypothesis paper. Biosystems 255, 105525 (2025). [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).