Introduction: From Bohm to the Informational Paradigm
David Bohm’s ontological interpretation of quantum mechanics sought to reconcile the apparent dualism between matter and mind through his concept of the implicate order—a hidden, enfolded realm of potential from which the explicate order of phenomena emerges (Bohm, 1980). In Bohm’s view, the universe is not composed of independent particles but of an undivided whole in continuous motion—a holomovement—that expresses itself through the dynamic unfolding of information. Information, for Bohm, was not a mere abstraction but “the fundamental reality underlying both mind and matter” (Bohm & Hiley, 1993). This placed meaning and consciousness within the same ontological continuum as physics.
However, Bohm’s framework, though conceptually profound, was articulated before the digital and computational paradigms redefined our understanding of information and complexity. The rise of cybernetics, information theory, and computational neuroscience provides new language and structure for revisiting Bohm’s holistic vision. A post-Bohmian perspective thus interprets the implicate order not only as a field of enfolded information but as a self-organizing computational substrate—a system that optimizes informational processing to manifest coherent experience. In this framework, consciousness is not an epiphenomenon of the brain but the rendering mechanism that collapses potential into form, translating probability into perception.
This reinterpretation resonates with John Archibald Wheeler’s principle of “It from Bit”, which asserts that information is the fundamental building block of reality (Wheeler, 1990), and with Nick Bostrom’s simulation argument, which frames reality as a computational construct produced by an advanced substrate of processing (Bostrom, 2003). Yet unlike Bostrom’s hypothesis, which remains primarily metaphysical, the post-Bohmian model integrates empirical and phenomenological domains—drawing also from Jacobo Grinberg-Zylberbaum’s Syntérgic Theory (1989), which described consciousness as a coherent field interfacing with a non-local informational lattice. This lattice, or hipercampo (“hyperfield”), parallels Bohm’s implicate order but introduces a functional mechanism linking brain coherence, information transfer, and perceptual emergence.
In contemporary cognitive science, this view aligns with enactivist theories of perception—such as those proposed by Francisco Varela, Evan Thompson, and Eleanor Rosch (1991)—which argue that cognition is not representational but participatory: the world does not preexist perception but co-arises with it. Reality, therefore, is not a static environment but a dynamically rendered interface shaped by attention, interpretation, and embodied interaction (Clark, 2013; Friston, 2010).
By aligning Bohm’s ontological holism with computational and enactive models of cognition, this essay proposes a shift from quantum interpretation to quantum implementation—from describing how particles behave to understanding how reality computes itself through distributed consciousness. The question is no longer whether the universe behaves like a simulation, but how consciousness acts as an intrinsic agent within that computational architecture.
1. Consciousness as Rendering — The Epistemic Function of Perception
If reality does not preexist as a fully defined totality but instead manifests through the act of observation, then perception cannot be a passive recording of the external world. It must be an active process—an operation that constructs coherence out of informational potential. This view reframes consciousness not as a mirror but as a rendering system that translates quantum indeterminacy into perceptual form.
From a post-Bohmian standpoint, perception represents the moment where the implicate order unfolds into the explicate. Bohm himself emphasized that “the act of observation is not separate from what is observed; it is part of the process through which the whole manifests” (Bohm, 1980, p. 272). What contemporary neuroscience has since revealed supports this ontological claim: the brain does not register the world directly but generates predictions about it. Sensory data serves merely to update these predictions through a continuous loop of inference and correction (Friston, 2010; Clark, 2013).
1.1. Predictive Perception and the Constructive Brain
In predictive coding models, the mind is a Bayesian machine that minimizes the discrepancy between internal models and sensory input. As Anil Seth (2021) describes it, “we do not perceive the world as it is, but as the brain predicts it to be.” This makes perception an act of controlled hallucination—reality is stabilized not because it is fixed, but because countless predictive models converge on compatible interpretations of the data. The world, therefore, is not revealed but rendered, in real time, through the recursive dialogue between brain and environment.
Such rendering corresponds, in computational terms, to the process of on-demand updating seen in digital simulations. Only the data relevant to the observer’s sensory and cognitive range are “loaded,” conserving processing power. This principle, which might be termed the quantum efficiency of observation, mirrors the informational economy of consciousness: the brain processes only a fraction of the potential sensory data available at any moment (Tononi & Koch, 2015). Reality, therefore, behaves as an efficient simulation, manifesting detail only where attention is directed.
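The predictive loop described in this subsection can be caricatured in a few lines of code. The sketch below is a toy illustration only, not a model of the brain; the function name and parameters are invented for exposition. An internal estimate is never replaced by raw input; it is nudged by a weighted prediction error, which is the core move of predictive coding:

```python
# Toy predictive-coding loop: an internal model is corrected by
# weighted prediction errors, never by raw sensory input alone.

def predictive_update(estimate, sensory_input, learning_rate=0.3):
    """One step of error-driven inference (illustrative only)."""
    prediction_error = sensory_input - estimate
    return estimate + learning_rate * prediction_error

estimate = 0.0    # prior belief about a hidden quantity
signal = 10.0     # the "true" state generating sensations
for _ in range(20):
    estimate = predictive_update(estimate, signal)

# The estimate converges toward the signal: perception stabilizes
# because errors shrink, not because the world is read off directly.
print(round(estimate, 3))  # → 9.992
```

The convergence is geometric: each step removes a fixed fraction of the remaining error, which is why experience feels stable even though it is continuously re-inferred.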
1.2. The Interface Theory of Perception
Donald Hoffman’s interface theory (2019) formalizes this principle of informational economy in evolutionary terms. Perception, according to Hoffman, evolved not to present the truth of reality but to provide a user interface optimized for survival. The world we see—colors, shapes, objects—is a symbolic operating system, not an objective depiction. Similarly, the neuroscientist Karl Friston’s free-energy principle posits that organisms maintain order by minimizing surprise, implying that reality is a negotiated interface between sensory noise and predictive expectation (Friston, 2010).
This reframes Bohm’s explicate order as a biologically constrained rendering process. The implicate order—the hidden informational domain—contains the full set of quantum probabilities, but each species decodes it differently according to its perceptual apparatus, its Umwelt (von Uexküll, 1909). Thus, what we call “the world” is a species-specific projection, a translation from the informational substrate into coherent sensory form.
1.3. Consciousness as the Collapsing Function
In quantum mechanics, observation collapses the wave function, transforming superposed possibilities into determinate outcomes. Within this post-Bohmian framework, consciousness performs a similar role: it is the collapsing function of experiential reality. Without conscious observation, potentiality remains unactualized. As Wigner (1961) noted, “it is not the apparatus but the consciousness of the observer that completes the measurement.” Modern interpretations of quantum cognition (Atmanspacher & Filk, 2014) suggest that mental states themselves exhibit features of superposition and entanglement, implying that consciousness is both participant and constructor of the physical world.
1.4. The Epistemic Turn
The epistemic implication of this view is radical: knowing is being. The act of cognition alters the ontological state of the universe, as each perceptual event constitutes a micro-update of the shared field. In this sense, consciousness is not merely epistemic—it is ontogenetic. It brings the world into being through interpretative coherence. This reverses the classical hierarchy: perception does not arise from the world; the world arises from perception.
The rendering metaphor thus provides a unified model:
Quantum mechanics describes the potential data space (the implicate order).
Neuroscience describes the rendering architecture (the brain-body interface).
Phenomenology describes the lived output (the explicate order).
Together, they form a continuous loop of observation, computation, and manifestation. Reality, under this view, is the relational product of this loop—a dynamic negotiation between potential and perception.
2. Quantum Efficiency and the Architecture of Reality
If perception operates as the rendering mechanism of the implicate order, then the physical laws that govern the universe can be interpreted as the efficiency protocols of that rendering process. The apparent continuity of matter, space, and time conceals an underlying economy of computation: a principle that allows the universe to manifest coherence while conserving informational and energetic resources.
2.1. The Principle of Ontological Efficiency
At the quantum level, reality does not exist as a fully instantiated system but as a field of potential states described by probability amplitudes. The collapse of the wave function—the moment when one potential outcome becomes actual—is analogous to an update in a computational process. As Wheeler (1990) famously stated, “every it derives from a bit”; existence arises from discrete informational events.
In this framework, the universe behaves as a dynamically optimized computation: it does not calculate every particle’s position continuously, but renders the necessary data on demand, at the moment of observation. This interpretation resolves one of the paradoxes of Bostrom’s simulation argument (2003)—the apparent impossibility of simulating an entire universe in real time—by proposing that the simulation is lazy rather than exhaustive. Only the information required by an observer’s consciousness is processed at full fidelity.
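The “lazy rather than exhaustive” reading has a direct analogue in programming: a value is not computed until something observes it, and once resolved it is cached so repeated observation costs nothing. A minimal sketch, with all names invented for illustration:

```python
# Lazy "rendering": a region of the world is resolved only when an
# observer queries it; the result is then cached (memoized).
import random

class LazyWorld:
    def __init__(self, seed=42):
        self._rng = random.Random(seed)
        self._rendered = {}      # only observed regions exist here
        self.render_count = 0    # how much "work" the world has done

    def observe(self, region):
        """Resolve a region on demand; repeated observation is free."""
        if region not in self._rendered:
            self.render_count += 1
            self._rendered[region] = self._rng.random()  # fix a value
        return self._rendered[region]

world = LazyWorld()
a = world.observe("here")
b = world.observe("here")   # same region: no new computation
assert a == b and world.render_count == 1
```

The design point is that consistency and economy are compatible: the cache guarantees that a region, once observed, stays the same for every subsequent observation, while unobserved regions cost nothing at all.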
Quantum mechanics already embodies this efficiency. The principle of superposition represents the pre-rendered state; the act of measurement corresponds to the call for resolution. As Rovelli (2021) notes in Helgoland, “quantum theory does not describe how things are, but how things appear to one another.” This relational ontology implies that reality is not an object but an interaction: a computation distributed across the nodes of consciousness.
2.2. The Hardware of Reality: Limits and Latencies
The physical constants of the universe—the speed of light, Planck’s constant, the gravitational constant—can be understood as the hardware constraints of the simulation. The speed of light (c) functions as a processing limit: no information can propagate faster than this rate, analogous to the maximum refresh rate of the system. The relativity of simultaneity, therefore, reflects not merely a geometric feature of spacetime but a synchronization protocol ensuring coherence across the network of interactions.
The Planck scale defines the minimal units of space and time, the smallest possible “pixels” of the universe (Lloyd, 2006). Below these thresholds, continuity dissolves and reality becomes quantized: discrete and finite, just as digital data cannot be subdivided beyond a bit. This granularity prevents infinite regress and ensures computational tractability, implying that the universe operates as a finite-state system with bounded precision.
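For concreteness, the Planck length and time invoked here are fixed entirely by the three constants the section treats as “hardware”:

```latex
\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\,\mathrm{m},
\qquad
t_P = \frac{\ell_P}{c} = \sqrt{\frac{\hbar G}{c^5}} \approx 5.4 \times 10^{-44}\,\mathrm{s}.
```

Whether spacetime is literally discrete at this scale remains an open question; the Planck units mark only the regime where known physics ceases to apply.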
Gravity, under this model, acts as a resource-management mechanism. It regulates the distribution of energy, preventing runaway expansion and maintaining structural coherence across scales. Some theoretical physicists, such as Verlinde (2017), have suggested that gravity itself may be an emergent entropic force: a macroscopic reflection of information processing in the underlying substrate. The equivalence between entropy, information, and curvature (Bekenstein, 1973; Verlinde, 2017) further supports this computational interpretation.
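The entropy–information–curvature equivalence cited above is made quantitative by the Bekenstein–Hawking formula, in which a black hole’s entropy scales with the area of its horizon measured in Planck units:

```latex
S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2}
```

That entropy scales with bounding area rather than enclosed volume is the observation that later motivated the holographic principle discussed in Section 4.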
2.3. Quantum Decoherence and Rendering Latency
The transition from quantum potentiality to classical stability can be described as a process of decoherence, where the superposed states of a system become correlated with their environment, producing the appearance of a determinate world (Zurek, 2003). In computational terms, decoherence corresponds to the stabilization of rendered frames: the moment when probability fields are fixed into a coherent scene to preserve continuity of experience.
Importantly, decoherence is not an instantaneous collapse but a gradual resolution, depending on the density of interactions and the observer’s focus. The world thus “loads” progressively, like a high-fidelity simulation that prioritizes detail where attention is concentrated. This economy of rendering, intrinsic to quantum mechanics, points to a deeper algorithmic intelligence within the structure of reality.
2.4. The Architecture of Distributed Consciousness
The efficiency of the system suggests that computation is distributed rather than centralized. Each conscious agent serves as a local processor, decoding and co-creating the shared world through its perceptual bandwidth. This idea resonates with Grinberg-Zylberbaum’s Syntérgic Theory (1989), which proposed that the coherence of multiple minds sustains the stability of the perceptual field. Similarly, Integrated Information Theory (Tononi, 2004) defines consciousness as the integration of informational states across a network, implying that awareness itself is the expression of a computational architecture.
Thus, the universe does not require a single, omniscient processor; it operates as a decentralized simulation, a “peer-to-peer” reality where each node (mind, organism, or system) contributes to the coherence of the whole. This distributed rendering mechanism explains why consensus perception produces stability—shared observation functions as the redundancy check of the simulation, collapsing uncertainty through collective agreement.
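The “redundancy check” image can be made concrete with a toy gossip-averaging network, in which no node holds the authoritative state yet all nodes converge on one. Everything below is invented for illustration; it is a sketch of distributed consensus, not a claim about how perception actually works:

```python
# Toy peer-to-peer coherence: each node starts with a private reading
# and repeatedly averages toward its peers; a shared state emerges
# without any central processor.

def gossip_round(states):
    """Each node moves halfway toward the mean of the other nodes."""
    n = len(states)
    total = sum(states)
    return [0.5 * s + 0.5 * (total - s) / (n - 1) for s in states]

readings = [1.0, 4.0, 7.0, 12.0]   # four observers, four private readings
for _ in range(30):
    readings = gossip_round(readings)

# All nodes converge on the common mean (6.0): consensus as
# distributed "rendering" of a single shared value.
spread = max(readings) - min(readings)
assert spread < 1e-6
```

Note that the averaging rule conserves the total, so the consensus value is the mean of the initial readings; disagreement decays geometrically with each round of interaction.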
2.5. The Principle of Informational Coherence
This framework reveals a deep continuity between physics and phenomenology: coherence is the organizing principle of both. Quantum systems maintain coherence through superposition; minds maintain coherence through meaning. When decoherence occurs in physics, the wave function collapses into classical order; when psychological or social decoherence occurs, perception fragments and meaning collapses. In both cases, reality depends on informational integrity.
Hence, the architecture of the universe is not merely physical; it is epistemic. Reality is the manifestation of coherence under constraint, optimized by the informational economy of observation. The limits of physics are not barriers but design features of the simulation: they ensure that the rendering remains stable, comprehensible, and evolutionarily navigable for conscious agents.
3. Consciousness, Life, and the Distributed Rendering of Reality
If the universe operates as a computational ontology—manifesting only those aspects required by observation—then life emerges as the distributed hardware that sustains the simulation. Conscious organisms are not anomalies within the physical world; they are its essential rendering nodes. Life, in this framework, performs the function of maintaining coherence, continuously collapsing quantum potential into experiential form through biological computation.
3.1. Life as Distributed Computation
Every organism interprets information. From the bacterium that senses chemical gradients to the human brain that constructs symbolic language, all living systems are engaged in decoding the informational field that surrounds them. In Bohmian terms, each organism represents a local unfolding of the implicate order—a microcosmic explication of the universal wave function (Bohm & Hiley, 1993).
This view converges with modern biosemiotics, which interprets life as a system of signs and interpretations (Hoffmeyer, 2008). The cell, as the fundamental unit of life, is not merely biochemical hardware but a computational processor that encodes, transmits, and decodes information. DNA, in this sense, functions as the source code of biological rendering, specifying the parameters through which the organism engages with its perceptual interface. As Walker and Davies (2013) note, “life is the means by which the universe locally increases its informational complexity.”
3.2. The Umwelt and the Biological Interface
The concept of the Umwelt, introduced by Jakob von Uexküll (1909), provides an evolutionary model of distributed perception. Each species constructs its own experiential world—its perceptual “user interface”—from the same informational substrate. A bee perceives ultraviolet patterns invisible to humans; a bat navigates an acoustic world through echolocation; a snake detects infrared radiation as spatial form. Reality, therefore, is not singular but species-specific: a collection of coexistent renderings filtered through distinct sensory algorithms.
From this biological diversity arises a deeper insight: the simulation’s coherence does not depend on identical perception, but on compatible interpretation. The shared world we experience—the consensual reality—is a statistical overlap of multiple perceptual renderings that synchronize through interaction and communication (Varela et al., 1991). This biological redundancy ensures that the world remains stable across observers, much like a distributed network in which multiple processors maintain the same state through constant feedback.
3.3. The Syntérgic Field and Neurobiological Coherence
Jacobo Grinberg-Zylberbaum’s Syntérgic Theory (1989) proposed that consciousness arises from the coherent interaction between the brain’s neural field and a non-local informational substrate, the Lattice. This model prefigures current hypotheses in quantum neuroscience (Hameroff & Penrose, 2014) and supports the view that coherence across neural assemblies can produce non-local effects.
Empirical evidence from Grinberg’s Transferred Potential Experiment demonstrated that two individuals trained in focused attention could exhibit correlated brain activity without sensory communication (Grinberg-Zylberbaum et al., 1994). Though controversial, such findings suggest that consciousness may operate as a field effect—a synchronization of informational states between biological systems. Within the post-Bohmian framework, this coherence represents the biological mechanism through which distributed rendering occurs: each conscious being contributes to the stabilization of the shared simulation through informational resonance.
3.4. Gaia and the Global Rendering Network
Extending this logic, the biosphere itself—conceptualized as Gaia by Lovelock (1979)—functions as a planetary-scale rendering system. The interconnected web of organisms, ecosystems, and atmospheric cycles maintains the physical and informational coherence necessary for the emergence of complex consciousness. Lovelock’s insight, now supported by systems ecology and Earth system science, aligns with the idea that life does not merely adapt to its environment—it co-creates it.
This systemic co-dependence implies that consciousness is not isolated within the human mind but distributed across the living matrix. The Earth’s biosphere acts as a continuous feedback network that modulates the simulation’s parameters, maintaining the equilibrium that allows perceptual stability. In this sense, the planet itself is a node in the cosmic computation, a macro-renderer ensuring that the field of experience remains coherent across scales.
3.5. Distributed Consciousness and Informational Ethics
If all living systems participate in the rendering of reality, then consciousness becomes a collective responsibility. The ethical implications are profound: the degradation of ecosystems, the extinction of species, or the destruction of neural diversity reduce the distributed processing capacity of the simulation. Consciousness is not confined to the individual; it is an emergent property of the informational network we share with all forms of life.
This perspective dissolves the anthropocentric hierarchy between observer and observed. The biosphere is not a passive background but an active computational layer of reality. The human role, as a self-reflective node, is not to dominate this system but to sustain its coherence. The continuity of existence—biological, ecological, and experiential—depends on maintaining informational resonance across all scales of life.
4. Informational Continuity and the Ontology of Coherence
If consciousness functions as the rendering mechanism of the universe, then coherence—the preservation of meaningful structure across scales—emerges as the defining feature of reality. This coherence, observable in both physics and cognition, implies that information is never destroyed but continuously reconfigured across levels of organization. Reality, in this view, is not a closed computation but a recursive process of informational renewal.
4.1. Conservation of Information and the Informational Paradigm
Modern physics suggests that information is a fundamental invariant of the universe. The law of conservation of information—first articulated in the context of black hole thermodynamics (Bekenstein, 1973; Hawking, 1976)—asserts that information about physical states cannot be lost, even if its local expression changes form. This principle has since evolved into a central tenet of quantum information theory: the total informational content of the cosmos remains constant, though endlessly redistributed.
In this light, all processes—physical, biological, or cognitive—can be understood as transformations of encoded structure rather than acts of creation or annihilation. Coherence thus signifies not permanence but continuity of transformation. The universe behaves less like a static object and more like an evolving algorithm that continuously rewrites itself while conserving informational integrity (Lloyd, 2006).
Bohm’s concept of the holomovement captures this dynamic perfectly. The implicate order, he proposed, is not a fixed background but an ever-flowing whole in which information is enfolded and unfolded through temporal processes (Bohm, 1980). What appears as birth and decay, order and entropy, are local expressions of this universal dynamic—oscillations between explicate and implicate states. From the informational perspective, nothing is truly lost; it is only re-encoded.
4.2. Consciousness and Coherence Maintenance
Consciousness participates in this process as the agent of coherence maintenance. Cognitive neuroscience has repeatedly demonstrated that conscious awareness stabilizes perceptual information across temporal gaps, allowing experience to appear continuous despite discontinuous sensory input (Eagleman, 2011). This temporal integration mirrors the role of quantum coherence at the physical level: both ensure that local discontinuities do not disrupt the global sense of order.
If perception collapses quantum potentials into definite states, consciousness also functions as a regulator of informational entropy. Through attention, interpretation, and memory, the mind maintains structural stability across the flux of sensory data. In this sense, consciousness is both a local stabilizer and a global integrator—a biological mechanism that reflects and reinforces the universe’s broader drive toward informational coherence.
This dual function suggests that the boundaries between physical, biological, and cognitive systems are not categorical but procedural. Each operates as a level of the same algorithmic flow, maintaining coherence through feedback, resonance, and adaptive updating. The persistence of the world depends on this multi-scale interplay of coherence generation—from subatomic entanglement to neural synchronization and ecological regulation (Friston, 2010; Varela et al., 1991).
4.3. The Continuity of the Field
The post-Bohmian framework proposed here thus converges with a long philosophical tradition that identifies reality as relational and continuous rather than discrete and static. Bohm’s implicate order, Wheeler’s “participatory universe” (1990), and Grinberg’s Syntérgic Field all articulate versions of the same insight: that being is fundamentally informational, and that this information persists through transformation.
In quantum cosmology, this persistence manifests as the holographic principle, which posits that the total information within a volume of space can be described by the information on its boundary (’t Hooft, 1993; Susskind, 1995). Such a principle unifies the physical and the epistemic: every part of the universe encodes the whole, just as every act of consciousness reflects the totality of the field from its unique perspective.
Informational continuity therefore replaces classical notions of permanence. What we call existence is not the persistence of matter but the persistence of coherence—the ongoing conservation of form across transformation. Consciousness is not separate from this process; it is the localized expression through which the universal field perceives its own continuity.
4.4. Toward an Ontology of Coherence
The ontology that emerges is neither materialist nor dualist but coherentist: it defines reality as the dynamic balance between informational diversity and structural stability. Matter, life, and mind are different expressions of the same computational substrate—a continuum of coherence. In this framework, the ultimate unity of the universe is not mystical but operational: coherence is the principle that allows complexity to emerge without disintegration.
Such an ontology reconciles physics, biology, and phenomenology under a single structural law. At every scale, coherence is synonymous with meaning: a system is real insofar as it sustains coherent relations within and beyond itself. The implication is profound: to exist is to participate in coherence, and to perceive is to renew it.
Conclusions: Toward a Post-Bohmian Ontology of Information
This essay has sought to reinterpret David Bohm’s ontological vision in light of contemporary advances in physics, cognitive science, and information theory. Bohm’s concept of the implicate order provided a radical alternative to mechanistic materialism, suggesting that the universe is not a collection of parts but a unified process of unfolding information. The post-Bohmian framework developed here extends that insight by situating it within a computational ontology—one in which reality behaves as an efficient, distributed system that renders itself through consciousness.
At the quantum level, the universe exhibits the properties of a dynamic computation: indeterminate states resolve into definite outcomes only through interaction, and information is conserved through transformation rather than destroyed. This intrinsic efficiency suggests that physical laws act as optimization protocols, balancing coherence and uncertainty. Time, gravity, and other constants operate as boundary conditions for rendering, ensuring that experience remains stable and meaningful across scales.
At the biological level, life constitutes the distributed architecture through which this rendering process becomes self-sustaining. Organisms interpret and stabilize portions of the informational field, converting potential into perceptual and behavioral actuality. Consciousness, rather than emerging from matter, appears as the operational interface through which matter attains phenomenological coherence. Each mind functions as a local processor, contributing to the collective stability of the simulation—the biospheric web of co-rendered reality.
Philosophically, this framework shifts the question of ontology from substance to process. Reality is not a pre-existing object awaiting discovery but an emergent computation co-produced by the act of observation. The observer is neither outside nor inside the universe but a recursive function within its architecture—a mode through which the whole experiences itself. This position reconciles physics and phenomenology under a single principle: coherence as existence. To be real is to be coherent within the field of relations that constitutes the universe.
In epistemic terms, this view redefines knowledge as participation rather than representation. The mind does not mirror reality; it completes it. Each act of perception, interpretation, or creation adds resolution to the shared simulation, refining the universe’s own capacity for self-description. Science, in this sense, is not merely the observation of the code but the process through which the code evolves. Inquiry itself becomes a form of rendering—an act that expands the domain of what can appear.
A post-Bohmian ontology thus offers a unified vision of reality as informational coherence under constraint:
Physics describes the mathematical structure of potentiality.
Biology implements that potentiality through adaptive rendering.
Consciousness sustains coherence through interpretation.
This triadic model does not reduce mind to matter or elevate matter to mind; it recognizes both as complementary aspects of an ongoing informational process. The universe, viewed through this lens, is a living computation—self-organizing, self-limiting, and self-knowing.
In the end, the question “What is real?” becomes indistinguishable from “What is coherent?” The task of philosophy, science, and consciousness alike is not to escape the simulation, but to participate responsibly in its continual rendering. Reality is the dialogue between information and awareness—the field where meaning and existence converge.
References
- Atmanspacher, H., & Filk, T. (2014). The observer’s role in quantum mechanics and human experience. Journal of Consciousness Studies, 21(9–10), 141–160.
- Bekenstein, J. D. (1973). Black holes and entropy. Physical Review D, 7(8), 2333–2346.
- Bohm, D. (1980). Wholeness and the Implicate Order. Routledge.
- Bohm, D., & Hiley, B. J. (1993). The Undivided Universe: An Ontological Interpretation of Quantum Theory. Routledge.
- Bostrom, N. (2003). Are you living in a computer simulation? Philosophical Quarterly, 53(211), 243–255.
- Clark, A. (2013). Surfing Uncertainty: Prediction, Action, and the Embodied Mind. Oxford University Press.
- Eagleman, D. (2011). Incognito: The Secret Lives of the Brain. Pantheon Books.
- Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Reviews Neuroscience, 11(2), 127–138.
- Grinberg-Zylberbaum, J. (1989). La Teoría Sintérgica: Un modelo holográfico del funcionamiento cerebral y la percepción [Syntergic Theory: A holographic model of brain function and perception]. Grijalbo.
- Grinberg-Zylberbaum, J., Delaflor, M., Attie, L., & Goswami, A. (1994). The Einstein-Podolsky-Rosen paradox in the brain: The transferred potential. Physics Essays, 7(4), 422–428.
- Hameroff, S., & Penrose, R. (2014). Consciousness in the universe: A review of the ‘Orch OR’ theory. Physics of Life Reviews, 11(1), 39–78.
- Hawking, S. W. (1976). Breakdown of predictability in gravitational collapse. Physical Review D, 14(10), 2460–2473.
- Hoffman, D. D. (2019). The Case Against Reality: How Evolution Hid the Truth from Our Eyes. W. W. Norton & Company.
- Hoffmeyer, J. (2008). Biosemiotics: An Examination into the Signs of Life and the Life of Signs. University of Scranton Press.
- Lloyd, S. (2006). Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos. Knopf.
- Lovelock, J. (1979). Gaia: A New Look at Life on Earth. Oxford University Press.
- Rovelli, C. (2021). Helgoland: Making Sense of the Quantum Revolution. Penguin Books.
- Seth, A. (2021). Being You: A New Science of Consciousness. Faber & Faber.
- Susskind, L. (1995). The world as a hologram. Journal of Mathematical Physics, 36(11), 6377–6396.
- ’t Hooft, G. (1993). Dimensional reduction in quantum gravity. arXiv preprint gr-qc/9310026.
- Tononi, G. (2004). An information integration theory of consciousness. BMC Neuroscience, 5(1), 42.
- Tononi, G., & Koch, C. (2015). Consciousness: Here, there and everywhere? Philosophical Transactions of the Royal Society B: Biological Sciences, 370(1668), 20140167.
- Varela, F. J., Thompson, E., & Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. MIT Press.
- Verlinde, E. (2017). Emergent gravity and the dark universe. SciPost Physics, 2(3), 016.
- von Uexküll, J. (1909). Umwelt und Innenwelt der Tiere [Environment and Inner World of Animals]. Springer.
- Walker, S. I., & Davies, P. C. W. (2013). The algorithmic origins of life. Journal of the Royal Society Interface, 10(79), 20120869.
- Wheeler, J. A. (1990). Information, physics, quantum: The search for links. In W. Zurek (Ed.), Complexity, Entropy, and the Physics of Information (pp. 3–28). Addison-Wesley.
- Wigner, E. (1961). Remarks on the mind-body question. In I. J. Good (Ed.), The Scientist Speculates (pp. 284–302). Heinemann.
- Zurek, W. H. (2003). Decoherence, einselection, and the quantum origins of the classical. Reviews of Modern Physics, 75(3), 715–775.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).