Preprint (not peer-reviewed)
The Second Law of Infodynamics: A Thermocontextual Reformulation

Submitted: 21 October 2024. Posted: 22 October 2024.

Abstract
Vopson and Lepadatu recently proposed the Second Law of Infodynamics. The law states that while total entropy increases, information entropy declines over time. They state that the law has applications over a wide range of disciplines, but they leave many key questions unanswered. This article analyzes and reformulates the law based on the thermocontextual interpretation (TCI). The TCI generalizes Hamiltonian mechanics by defining states and transitions thermocontextually, with respect to an ambient-temperature reference state. The TCI partitions energy into ambient heat, which is thermally random and unknowable, and exergy, which is knowable and usable. The TCI is further generalized here to account for a reference observer’s actual knowledge. This enables partitioning exergy into accessible exergy, which is known and accessible for use, and configurational energy, which is knowable but unknown and inaccessible. The TCI is firmly based on empirically validated postulates. The Second Law of thermodynamics and its information-based analogue, MaxEnt, are logically derived corollaries. Another corollary is a reformulated Second Law of Infodynamics. It states that an external agent seeks to increase its access to exergy by narrowing its information gap with a potential exergy source. The principle is key to the origin of self-replicating chemicals and life.

1. Introduction

Melvin Vopson and S. Lepadatu introduced the Second Law of Infodynamics in an article they published in 2022 [1]. They resolved a system’s total entropy into physical entropy and information entropy. The system’s total entropy is constant or increases over time, in compliance with the Second Law of thermodynamics, but, according to the Second Law of Infodynamics, information entropy remains constant or declines over time. Vopson and Lepadatu state that the Second Law of Infodynamics has important applications to diverse fields, including genetics, evolutionary biology, virology, computing, big data, physics, and cosmology.
A system’s total statistical entropy is defined by $\sigma = -\sum_{i=1}^{N} P_i \ln(P_i)$, where $P_i$ is the probability of the system being in the mechanical microstate $i$, and $N$ is the number of available microstates. Physical entropy is the Gibbs entropy of thermodynamics, which is summed over thermalized microstates. Thermalized microstates are random, and they therefore have no information content. Information entropy, in contrast, is summed over information-bearing microstates. Information-bearing microstates are not random; they are just unknown. They reveal their information content by measurement and observation. Information entropy is the entropy of statistical mechanics.
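As a minimal numerical sketch of this definition (Python; the probability values are illustrative and not from [1]):

```python
import math

def statistical_entropy(probs):
    """Unitless statistical entropy: sigma = -sum(P_i * ln P_i)."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

# Four equally likely microstates: sigma = ln(4) ~ 1.386
print(statistical_entropy([0.25, 0.25, 0.25, 0.25]))
# A sharply peaked distribution carries far less uncertainty (~0.168).
print(statistical_entropy([0.97, 0.01, 0.01, 0.01]))
```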
Information entropy is closely related to Shannon entropy, as defined by Claude Shannon in his seminal article on information theory [2]. Shannon entropy describes the average level of surprise or uncertainty inherent to a variable's possible outcomes upon its measurement and observation [3]. The surprise factor is a measure of the information gap between a system’s actual information content and an observer’s expectations.
Vopson and Lepadatu provide two examples to illustrate the spontaneous decline in information entropy. The first example describes the decay of a magnetically stored digital record with definite, but unknown, content. Magnetic relaxation reduces the storage medium’s information content, and over time, this reduces an observer’s surprise factor upon reading it.
The second example describes the change in the information entropy of SARS-CoV-2 recorded in its RNA between January 2020 and October 2021. The virus’s RNA comprises four bases (adenine, cytosine, guanine, and uracil), with a total connected length of n=29,903 bases. Entropy is summed over the $N = \frac{n!}{n_A!\, n_C!\, n_G!\, n_U!}$ distinct linear configurations of the four bases. Based on equal numbers of each base and Stirling’s approximation, the maximum theoretical information entropy is $\sigma = -n \log_2(\tfrac{1}{4}) = 59{,}806$ bits. Figure 3 in [1] shows a decline in the RNA’s information entropy over this period, from 40,568 to 40,553 bits.
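The quoted maximum can be checked directly; a short sketch under the equal-base-frequency assumption:

```python
import math

n = 29_903                        # RNA length in bases, from [1]
p = 0.25                          # equal base frequencies assumed
max_entropy_bits = -n * math.log2(p)
print(max_entropy_bits)           # 59806.0 bits, i.e. 2 bits per base
```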
The decline in information entropy in the first example is a direct consequence of thermalization and randomization of the magnetic moments of the storage medium. Thermal equilibration is based on the Second Law of thermodynamics. However, the Hamiltonian conceptual framework (HCF), on which the interpretations of classical and quantum mechanics are based, does not formally recognize entropy or irreversibility as fundamental properties. It cannot formally recognize either the Second Law of thermodynamics or the Second Law of Infodynamics as fundamental laws of physics.
The decline in information entropy in the SARS virus example is not immediately expected. Unbiased random mutations by themselves would be expected to increase a system’s information entropy. This raises a deeper question about the meaning of entropy. E. T. Jaynes [4] considered objective randomness to be incompatible with mechanics. He concluded that information entropy is based on an observer’s incomplete knowledge and that its assignment of probabilities represented the observer’s subjective expectations and bias. To eliminate subjective observer bias, he asserted that initial probabilities should be based on zero prior information. With zero information, all configurations have equal-probability expectations. This is the statistical mechanical interpretation of entropy, and it is the information entropy of Vopson and Lepadatu.
One alternative to Jaynes’s procedure for assigning initial probabilities is to base expectations on a system’s most recent measurement. If zero measurement error and no mutations are assumed, the initial entropy would then be zero. This represents a perfectly precise description, but it almost certainly would not be accurate. A more realistic statistical description would be to assign a positive entropy to reflect measurement error and random changes. Another alternative would be to calculate the Gibbs entropy based on the configurations’ energies and the Boltzmann-Gibbs probability distribution function [5].
The evolution of a virus in some sense does represent a gain in information, reflecting its evolving ability to evade antibodies and to manipulate its target genomes for its own reproduction. A principle describing nature’s tendency to acquire and act on information is essential to understand the origin and evolution of life. However, statistical entropy is subjectively based on an observer’s prior knowledge or on arbitrary assumptions. It is ill-defined, and it is not suitable to define a physical law.
This paper reformulates the Second Law of Infodynamics within the framework of the thermocontextual interpretation (TCI). The TCI was previously introduced as an alternative to Hamiltonian mechanics [6]. The TCI defines a system’s state thermocontextually with respect to a reference state in equilibrium with the ambient surroundings at a positive ambient temperature. It recognizes entropy, exergy, thermal randomness, and irreversibility as physical properties of states and transitions. The TCI provides a unified foundation for mechanics and thermodynamics, and it resolves some of the fundamental questions of physics regarding time, causality, and quantum entanglement [6].
The TCI is updated here to describe generalized transitions, which can result from changes in the system, in the system’s environment, or both. It contextually describes transitions with respect to a fixed reference observer and measurement framework. The TCI is firmly based on empirically validated postulates. The Second Law of thermodynamics, its information-based analogue MaxEnt [4,7], and an objectively defined reformulation of the Second Law of Infodynamics are all logically derived corollaries of the TCI’s postulates.

2. The Two Entropies

Rudolf Clausius introduced entropy in 1850 and defined it as the ratio of heat (q) and absolute temperature (T):
$$dS \equiv \frac{dq}{T}. \qquad (1)$$
Clausius then proceeded to formulate the Second Law of thermodynamics in terms of the irreversible increase in thermodynamic entropy. His formulation was in the spirit of Sadi Carnot’s conjecture, which Carnot based on his work on steam engines. Carnot had concluded that the actual output of work is less than a system’s idealized potential for work due to irreversible losses. The Second Law states that work potential is irreversibly dissipated to heat of the ambient surroundings. It further states that ambient heat has zero potential for work on the ambient surroundings.
Shortly after Clausius defined thermodynamic entropy, Ludwig Boltzmann defined entropy statistically within the Hamiltonian framework of classical mechanics. He defined entropy by the number of mechanical microstate configurations consistent with a system’s thermodynamic macrostate description. In the idealized dissipation-free world of Hamiltonian mechanics, however, all information and work potential are conserved. There is no distinction between past and future, and there is no arrow of time. The HCF defines a system’s physical state by perfect measurement in the absence of thermal noise, and it is the foundation for the interpretations of classical, quantum, and relativistic mechanics.
Following Boltzmann’s initial formulation of statistical entropy, Willard Gibbs applied it to thermally equilibrated systems and defined what is now known as Gibbs entropy:
$$S_{Gibbs} \equiv -k_B \sum_{i=1}^{N} P_i \ln P_i, \qquad (2)$$
where $k_B$ is Boltzmann’s constant, $N$ is the number of energy states, and $P_i$ is the probability that a measurement reveals a microstate with energy $E_i$. Gibbs entropy is equal to the Third-Law entropy of thermodynamics, where the Third-Law entropy is defined with respect to absolute zero:
$$S_{Gibbs} = S_{3rdL} \equiv \int_{0\,\mathrm{K}}^{T_{sys}} \frac{dq}{T} = \int_{0\,\mathrm{K}}^{T_{sys}} \frac{C_v\, dT}{T}. \qquad (3)$$
$C_v = \left( \partial q / \partial T \right)_v$ is the volumetric heat capacity and a property of state. The Gibbs entropy and Third-Law entropy are expressed in units of energy per kelvin, and they are denoted by the letter “S.”
The Third-Law entropy (3) is a thermodynamic property of state. Gibbs entropy, in contrast, expresses the statistical distribution of energy measurements. The equality of Gibbs and Third-Law entropies is the basis for the Boltzmann-Gibbs distribution function [5]. The Boltzmann-Gibbs distribution defines the probability Pi of measuring a microstate with energy Ei by:
$$P_i = e^{-E_i / k_B T} / Z, \qquad \text{with} \quad \sum_{i=1}^{N} P_i = 1. \qquad (4)$$
T is the equilibrium temperature of thermalization, and Z is a normalization factor for probabilities to sum to one. The higher a state’s energy is, the lower is its probability of being measured.
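A small sketch of equation (4) (Python; the three-level energies and the temperature are hypothetical):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_gibbs(energies, T):
    """Boltzmann-Gibbs probabilities P_i = exp(-E_i / k_B T) / Z, eq. (4)."""
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)                       # normalization factor
    return [w / Z for w in weights]

# Hypothetical three-level system at T = 300 K.
probs = boltzmann_gibbs([0.0, 1e-21, 2e-21], 300.0)
print(probs, sum(probs))   # higher E_i -> lower P_i; probabilities sum to 1
```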
The Second Law of thermodynamics addresses the irreversible increase of entropy, but increasing entropy has taken on two fundamentally different meanings: dissipation and dispersion. Thermodynamics describes dissipation as the irreversible increase in thermodynamic entropy by heat flow or frictional forces. Dispersion, in contrast, describes the spontaneous increase in an observer’s uncertainty of a system’s microstate configuration. Classical mechanics attributes the statistics of dispersion to classical chaos and to an observer’s subjective uncertainty of the actual microstate.
Classical mechanics is known to break down at low temperatures and small scales, and quantum mechanics defines the quantum microstate by a pure wavefunction. Extending the classical concept of a microstate as being completely defined by perfect measurement, quantum mechanics defines measurement tomographically, as an ensemble of all possible measurements [8]. The statistical distribution of individual measurements constitutes a single tomographic measurement and a well-defined measurable description of a pure quantum state, with zero uncertainty and zero statistical mechanical entropy.
Quantum mechanics describes wavefunction collapse as the transition from a pure superposed microstate configuration to a mixed macrostate. A mixed macrostate is defined by the statistical distribution of measurable microstate configurations following wavefunction collapse. The statistics of a mixed macrostate’s measurement results is expressed by the von Neumann entropy. However, the intrinsic determinism of the Schrödinger wavefunction does not formally accommodate randomness or wavefunction collapse. Quantum mechanics does not address whether quantum randomness is physical, whether it reflects hidden variables, or whether it is triggered by observation. Interpretations of quantum randomness are actively debated [9] and there is no consensus interpretation.
Claude Shannon significantly extended the application of statistical entropy with his publication A Mathematical Theory of Communication in 1948 [2]. He introduced Shannon entropy to describe a message receiver’s uncertainty of a statement’s precise meaning. Shannon entropy is mathematically identical, up to a constant multiplier, to Gibbs entropy and to the von Neumann entropy. All three statistical entropies are specific cases of a generalized statistical entropy, which can be defined by:
$$\sigma \equiv -\sum_{i=1}^{N} P_i \ln P_i. \qquad (5)$$
N is the number of resolvable microstate configurations, and Pi is the probability that a measurement will reveal microstate i. Statistical entropy is unitless, and we denote statistical entropy by sigma (σ). As defined, statistical entropy only describes the statistics and uncertainty of observation results. It does not offer a physical interpretation of the probabilities.
We can also express thermodynamic entropy as a unitless statistical entropy, which we denote as thermal entropy and define by:
$$\sigma_q \equiv S_{Gibbs} / k_B. \qquad (6)$$
Thermal entropy is based on Boltzmann-Gibbs probabilities (4), which describe the statistical distribution of observations for thermalized microstates.
Thermodynamics and statistical mechanics both describe spontaneous increases in entropy, but they differ fundamentally in their interpretation of entropy. Thermodynamics interprets entropy as the thermal randomness of thermalized energy states. Mechanics, in contrast, does not recognize objective randomness. It interprets the increase in informational entropy as an observer’s increased uncertainty of the actual microstate configuration, which is definite and evolves deterministically. E.T. Jaynes referred to the increase in information entropy as MaxEnt [4].
The Hamiltonian conceptual framework is the universally accepted framework for interpreting mechanics—classical, quantum, and relativistic. It regards thermodynamics as an incomplete description, and entropy and irreversibility as subjective properties of perception, rather than as objective properties of physical states. But at the same time, it recognizes the Second Law of thermodynamics and MaxEnt as two descriptions of a thoroughly validated empirical law. As pointed out by Takahiro Sagawa, however, there is no a priori relationship between thermodynamic entropy and informational entropy [10]. Thermodynamic and information entropies are two entirely different entropies, and their increases reflect two distinct empirically validated laws.

3. The Thermocontextual State

Crecraft introduced the thermocontextual interpretation (TCI) as an alternative to the Hamiltonian conceptual framework (HCF) in “Time and Causality: A Thermocontextual Perspective” [6]. The article defined system time as a complex property of state spanning irreversible thermodynamic time and reversible mechanical time, and it recognized relativistic time, as measured by an external clock, as a property of the ambient surroundings. The article provided a simple explanation for Bell-type experiments [11], which document nonlocal superluminal correlations of measurements on spatially separated entangled particles. The TCI provides a commonsense explanation of the experimental results without the need for splitting universes, hidden variables, superluminal action, or superdeterminism, which are typically invoked to explain the experimental results.

3.1. TCI Postulates of State

TCI’s description of states is based on the laws of physics, including thermodynamics’ First Law of energy conservation, and the following additional postulates:
Postulate One: 
(0th Law of thermodynamics.) Temperature is a measurable property of state.
Postulate Two: 
(based on the 3rd Law of thermodynamics.) A system’s surroundings are always at a positive temperature.
Postulate Three: 
No hidden variables—a system’s state properties are observable by perfect measurement and observation.
Postulates One and Two are directly based on the Zeroth and Third Laws of thermodynamics. The Zeroth Law states that temperature is a measurable property, and the Third Law states that absolute zero temperature can never be reached. Postulates One and Two establish that the minimum temperature of a system’s surroundings, with which the system can potentially interact, is positive. This minimum temperature defines the system’s positive ambient temperature. Postulates One and Two are empirically well established.
Postulate Three explicitly rejects an assumption for which there can be no empirical evidence. It states that there are no hidden variables. This means that there exists, in principle, a perfect observer that can perfectly measure a system’s complete state with zero uncertainty.
It is important to note, however, that Postulate Three does not imply that a system always exists as a state. The logically equivalent contrapositive of Postulate Three is that if a system’s state is not completely observable, then there is no state. This describes a system in transition between states. A state is static, and it exists only for a kinetically frozen metastable system or a system in equilibrium with its fixed ambient surroundings. A non-equilibrium system can spontaneously transition toward a more stable state; while that transition is underway, the system is between states and does not exist as a state.
The statistical distribution of measurements for a thermally equilibrated system is given by the empirically validated Boltzmann-Gibbs distribution function (4). With no hidden variables, the probabilities cannot be deterministically tied to unobservable state properties. Consequently, measurements are not just statistical, they are intrinsically random. This is expressed by Corollary 3-1:
Corollary 3-1: 
Intrinsically statistical measurements of a thermally equilibrated macrostate reflect intrinsic thermal randomness.
Postulates One to Three provide the framework for defining a system’s objectively defined thermocontextual state, as described in the following sections. The postulates apply equally well to classical, quantum, and thermodynamic states.

3.2. Thermocontextual Properties of State

Thermocontextual properties of state are defined relative to a system’s ambient reference state (ARS) [6,12]. The ARS is defined as an equilibrium state at the ambient temperature and pressure of the surroundings. The ambient temperature and pressure are defined by the minimum temperature and pressure of the surroundings, with which the system can interact. Ambient surroundings are defined as idealized equilibrium surroundings at the ambient temperature and pressure.
When a technician inserts a thermometer into an ice bath to measure its temperature, the thermometer is part of the external measurement setup and physical surroundings. The probe’s temperature defines the observer’s reference temperature. Since the ambient temperature is defined as the minimum external temperature with which the system can interact, the ambient temperature cannot exceed the external probe’s reference temperature ($T_a \le T_{ref}$). Similarly, the probe’s temperature cannot exceed the system temperature ($T_{ref} \le T_{sys}$); otherwise, the probe’s greater thermal randomness would preclude the perfect measurement and observation codified by Postulate Three. This leads to the following relationships:
$$T_a \le T_{ref} \le T_{sys}. \qquad (7)$$
Likewise, for pressure, we have:
$$P_a \le P_{ref} \le P_{sys}. \qquad (8)$$
The TCI resolves a system’s total energy, Etot, into system energy and ambient-state energy:
$$E_{tot} = E + E_{as}, \qquad E_{as} > 0. \qquad (9)$$
System energy, E, is defined relative to the ARS, with ambient-state energy, Eas, defined with respect to absolute zero temperature. The ambient-state energy is the minimum-energy equilibrium state for the system as it exists with respect to its ambient surroundings. The ambient temperature and ambient-state energy are objective thermocontextual properties of state, and they are always positive.
The TCI partitions system energy E into exergy X and entropic energy Q:
$$E = X + Q. \qquad (10)$$
Exergy and entropic energy are defined by perfect measurements of work and ambient heat as a system reversibly transitions to its ARS (Figure 1). Exergy equals a system’s work and entropic energy equals ambient heat, as reversibly recorded by an external measurement device at the ambient temperature. The measurement device is a macroscopic classical device, and its record of results is independent of observation or any specific observer.
System energy can include external energy, comprising kinetic and potential energies with respect to an external reference frame. A system’s internal energy, U, is defined by subtracting external energy from its system energy:
$$U \equiv E - E_{ext}. \qquad (11)$$
System energy can also include thermal energy (heat). Thermal energy is defined with respect to the ARS, and is given by:
$$q = \int_{T_a}^{T} C_v\, dT. \qquad (12)$$
A system’s internal energy is increased by the addition of work and heat:
$$dU = dw_a + dq_a. \qquad (13)$$
$dw_a$ and $dq_a$ are increments of work and heat added to the system from the ambient surroundings.
Just as equation (10) parses the system energy into exergy and entropic energy, we can parse internal energy into internal exergy and entropic energy:
$$U = X_U + Q, \qquad (14)$$
where internal exergy $X_U$ equals $X - E_{ext}$. We can then resolve internal exergy into mechanical exergy and thermal exergy:
$$X_U = X_m + X_q. \qquad (15)$$
Thermal exergy Xq is the idealized work potential of heat, and mechanical exergy Xm is the system’s work potential after thermally equilibrating with the ambient surroundings, at which point the system energy has no thermal energy. Mechanical exergy, such as the potential energy of a raised weight, is one hundred percent accessible for work. Mechanical energy is the energy of mechanics, and it is quantified by, and reversibly interchangeable with, work.
From elementary thermodynamics of the Carnot cycle and (12), thermal exergy is given by:
$$dX_q = \frac{(T - T_a)}{T}\, dq = \frac{(T - T_a)}{T}\, C_v\, dT, \qquad (16)$$
where $dq$ is an increment of added thermal energy, $T$ is its temperature of thermalization, and $T_a$ is the fixed ambient temperature at which work is measured.
Combining (14)–(16), we get:
$$dU = dX_m + dQ + dq - \frac{T_a}{T}\, dq, \qquad (17)$$
and subtracting (13) from (17), we get:
$$dQ = \frac{T_a}{T}\, dq, \qquad (18)$$
where we used the equality between work and mechanical exergy. The TCI then defines entropy with respect to the ARS at the ambient temperature by:
$$S_{TC} \equiv \int_{T_a}^{T} \frac{dq}{T} = \int_{T_a}^{T} \frac{dQ}{T_a} = \frac{Q}{T_a}. \qquad (19)$$
Thermocontextual entropy is a generalization of the Third-Law entropy, which is defined relative to a hypothetical “ambient” reference at absolute zero temperature. The equalities in (19) are based on (18), and they relate entropic energy to entropy. It immediately follows from (19) and (3) that thermocontextual entropy and thermodynamic entropies differ only by their zero points, and that changes in the entropies are identical:
$$\Delta S_{TC} = \Delta S_{Gibbs} = \Delta S_{3rdL}. \qquad (20)$$
From (14) and (19), we get:
$$U = X_u + T_a S, \qquad (21)$$
where we drop the TC subscript for entropy.
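For a constant heat capacity, the integrals in (12), (16), and (19) have closed forms, and the partition of thermal energy into exergy and entropic energy can be verified numerically. A minimal sketch with hypothetical values:

```python
import math

# Constant C_v assumed, so (12), (16), and (19) integrate to closed forms:
#   q    = C_v (T - T_a)                       thermal energy, eq. (12)
#   X_q  = C_v (T - T_a - T_a ln(T / T_a))     thermal exergy, eq. (16)
#   S_TC = C_v ln(T / T_a)                     entropy, eq. (19)
C_v, T_a, T = 10.0, 300.0, 400.0   # hypothetical values (J/K, K, K)

q = C_v * (T - T_a)
X_q = C_v * (T - T_a - T_a * math.log(T / T_a))
S_TC = C_v * math.log(T / T_a)
Q = T_a * S_TC                     # entropic energy, Q = T_a * S_TC

print(q, X_q + Q)   # both 1000.0 J: thermal energy = exergy + entropic energy
```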
Equation (21) is the thermocontextual generalization of the thermodynamic relationship:
$$U = F + TS, \qquad (22)$$
where U is internal energy and F is Helmholtz free energy. Exergy and free energy are closely related, but whereas TS and free energy are defined only for isothermal systems at the system temperature T, exergy and entropic energy are defined with respect to a fixed ambient temperature, Ta, and they are easily definable for non-isothermal systems.
Thermal energy, exergy, entropic energy, and entropy are all thermocontextual properties of state, defined with respect to the fixed ambient reference state, which specifies the ambient temperature and zero values for energy and entropy. They are well defined, and unlike thermodynamic properties, they are readily applicable to non-isothermal and far-from-equilibrium systems. Thermocontextual properties of state are summarized in Appendix A.

3.3. Energy States

A system’s energy state specifies the system’s exergy and entropic energy with respect to the ARS in equilibrium with the ambient surroundings. There are two requirements for an energy state:
  1. The system is isolated from exchanges with the surroundings. Isolation means that it is fixed in energy and composition.
  2. The system is non-dissipative. This means that there is no irreversible dissipation of exergy to ambient heat. It is thermodynamically reversible.
Requirements 1 and 2 fix the energy state’s composition, exergy, and entropic energy, but they do not fix the physical system’s microstate configuration. Multiple degenerate microstate potentialities can share the same energy state.
Figure 2 illustrates three classifications of energy states, based on whether their entropic energy content is positive, zero, or negative. Each energy state contains an ambient-gas cylinder and a mechanical “battery,” such as a spring or weight, to store mechanical exergy. Each state is prepared with ambient work $w_a$ equal to $n k_B T_a \ln(2)$, which is the work needed to reversibly compress an ideal n-particle ambient gas to one half of its initial volume.
The thermal energy state (a) is created by the work of reversibly pumping ambient heat up-temperature into the gas cylinder, increasing its temperature and entropic energy. The applied work equals the gas’s thermal exergy, and the pumped ambient heat equals its positive entropic energy. The thermal exergy can be accessed for work by rejecting the entropic energy back to the ambient surroundings via a reversible heat engine.
The mechanical energy state (b) stores its energy as mechanical exergy in its battery, with zero entropic energy and zero randomness. The mechanical state energy has no entropic energy, and it is immediately accessible for external work, independent of the ambient temperature.
The configurational energy state (c) is reversibly created by the work of isothermally compressing the gas to half volume. The gas’s temperature does not change during isothermal compression, and the gas’s energy is therefore fixed. The gas’s gain in exergy from the work of compression is offset by an equal rejection of ambient heat. The exergy can be accessed for work by reversibly running the piston in reverse.
Each system’s energy state in Figure 2 is defined by the exchanges of work and ambient heat with an ambient measurement device as the system reversibly transitions back to the ARS. The three energy states in Figure 2 have the same exergy, equal to the applied work. They differ in whether their energy is mechanical-state energy Em, with Q=0, thermal-state energy Eq, with Q>0, or configurational-state energy Ec, with Q<0.
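A small bookkeeping sketch of the three classes in X–Q space (Python; the particle count is hypothetical, and the thermal state’s entropic energy is set equal to $w_a$ purely for illustration):

```python
import math

k_B = 1.380649e-23                 # J/K
n, T_a = 1e20, 300.0               # hypothetical particle count and ambient temperature
w_a = n * k_B * T_a * math.log(2)  # work of preparation, w_a = n k_B T_a ln(2)

# 2D energy E = (X, Q) for the three classes of Figure 2:
states = {
    "thermal (Q > 0)":         (w_a, +w_a),  # pumped ambient heat (value illustrative)
    "mechanical (Q = 0)":      (w_a, 0.0),   # exergy stored in the battery
    "configurational (Q < 0)": (w_a, -w_a),  # isothermal compression expels heat w_a
}
for name, (X, Q) in states.items():
    print(f"{name:24s} X = {X:.3e} J, Q = {Q:+.3e} J, E = X + Q = {X + Q:+.3e} J")
```

The configurational case reproduces the observation in the text that isothermal compression leaves the gas’s energy unchanged: the work input is exactly offset by the expelled ambient heat, so E = X + Q = 0.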
The energy states in Figure 2 are plotted in Figure 3 in X–Q space. The arrows in Figure 3 show the preparation of the states, starting from the zero-energy ambient state and applying work and adding or removing ambient heat. Measurement is simply the reverse transition to the ambient reference state, with a reversible record of the changes.
Figure 3. TCI Energy States and their Preparations in X−Q space. (a) The thermal state’s energy, Eq, has exergy equal to its work of preparation and positive entropic energy, Q, equal to the ambient heat reversibly pumped from the ambient surroundings. (b) The mechanical state’s energy, Em, has exergy equal to its work of preparation, wa, and zero entropic energy. (c) The configurational state’s energy, Ec, has exergy equal to its measurable work, wa, and negative entropic energy, equal to the ambient heat expelled during isothermal compression. (d) Measurements of the energy states are defined by the changes in measurable work and ambient heat as the systems reversibly transition to the ARS.
Figure 3 highlights the fact that an energy state’s energy is not a simple scalar. It is two dimensional, spanning exergy and entropic energy. We denote 2D energy by boldfaced E:
$$\mathbf{E} = (X, Q). \qquad (23)$$
The magnitude of 2D energy is equal to the simple sum of its scalar components, E=X+Q, as was given in (10). The classification of energy states based on entropic energy is summarized in Table 1.
The TCI objectively defines exergy, entropic energy, and energy states based on reversible measurements of work and heat with respect to the ARS. We note that the HCF, with its assumption of an absolute-zero ambient surroundings, does not recognize heat or entropic energy as fundamental, and it only recognizes mechanical energy states.

3.4. Microstates, Macrostates, and Information

Up to this point, the TCI has only addressed energy states. A system is more than just its energy state, however. The configuration of a system’s parts defines the system’s microstate, which is generally just one of a vast number of possibilities. Whereas the energy state is defined by perfect reversible measurement results (Figure 1), a system’s specific microstate configuration is described by its information. Measurement results are classical, and they are independent of observation, but information, by definition, is based on observations [2].
The TCI defines the thermocontextual microstate by the configuration of a system’s irreducible resolvable parts. A system’s resolution is limited by its thermal randomness at the ambient temperature, and it changes with changing ambient temperature. A perfect external observer is defined with perfect resolution at the ambient temperature and by complete information on the thermocontextual microstate. If the system’s ambient temperature changes, the thermocontextual microstate, the perfect observer’s resolution and its information necessarily change along with it.
Figure 4 (left side) illustrates the relationships among a system’s physical state, its energy state, its thermocontextual microstate, and its complete thermocontextual state description. The thermocontextual state description describes the energy state by reversible measurement with respect to the ARS, and it defines the system’s microstate by a perfect observer. The thermocontextual state description completely defines the system’s physical state.
The right side shows the same system’s macrostate-model description. The macrostate-model describes the energy-state with respect to a fixed reference temperature, Tref, and the system’s microstate by a fixed reference observer. By convention, we define the observer’s fixed reference by the system’s ambient temperature, and we define its fixed information by perfect ambient measurement. As initially defined, the fixed reference observer is a perfect observer. Its energy-state measurement and thermocontextual microstate description therefore define a complete macrostate model.
If the ambient temperature declines, however, the reference temperature, with respect to which the system is measured, is higher than the new ambient temperature, and measurement does not completely define the system’s energy state. In addition, if the system’s microstate randomly changes, then the fixed reference observer’s information is no longer complete. In either case, the macrostate model’s description is incomplete.
We can describe the macrostate-model’s incomplete microstate description statistically as a probability distribution, by:
$$P_{MS} = \left[ P_{ms,1}, P_{ms,2}, \ldots, P_{ms,N_{obs}} \right]. \qquad (24)$$
The macrostate is based on the reference observer’s knowledge, and $N_{obs}$ is the number of resolvable microstates. The $P_{ms,i}$ are the Bayesian probabilities [4,13], describing the reference observer’s expectations that the system’s observable microstate is $ms_i$. The macrostate model is a superposition of the reference observer’s microstate expectations.
We define configurational entropy as a macrostate property describing the imprecision of the observer’s macrostate description. It is defined by:
$$\sigma_c \equiv -\sum_{i=1}^{N_{obs}} P_i \ln P_i, \qquad (25)$$
where the sum is over the system’s observable microstates. The Pi’s are the macrostate model’s Bayesian probabilities, which are assigned by the fixed-reference observer based on its fixed information. The configurational entropy is a macrostate property explicitly based on the reference observer’s incomplete information.
To illustrate and contrast the macrostate and the physical state that it describes, we consider a fair and random coin flip. The macrostate model transitions from a single initial known state of either heads or tails, with N=1 and PMS=[1], to a mixed macrostate following the coin flip, with N=2 and PMS=[½,½]. The macrostate model following the coin flip describes equal and superposed expectations for heads or tails configurations, and its configurational entropy increases from zero to ln(2).
Following the coin flip, the coin has a definite but unknown microstate, and its physical state is statistically described as PPS=[½,½]. It is equal to the macrostate’s probability distribution, PMS. However, whereas the macrostate model’s probabilities are expectation-based Bayesian probabilities, the physical state’s unknown microstate’s probabilities are frequentist probabilities [14]. Frequentist probabilities are defined by the statistics of repeated measurements of a random transition.
The coin’s physical state may be unknown, but it exists as a single definite microstate. Its actual microstate configuration is represented as:
$$P_{msa} = \left[ P_{ms,a} = 1,\; P_{ms,\,i \ne a} = 0 \right], \qquad (26)$$
where msa is the physical state’s actual microstate configuration and ‘ms,a’ indicates the actual but unknown microstate of heads or tails.
The Kullback–Leibler divergence (DKL) [15] was introduced to provide a measure of the information gap between two probability distributions. The DKL divergence is defined by:
$$D_{KL}(P_1 \,\|\, P_2) \equiv \sum_{i=1}^{N} P_{1,i} \ln\!\left( \frac{P_{1,i}}{P_{2,i}} \right). \qquad (27)$$
The DKL information gap describes the relative entropy of state 1 based on the macrostate model 2. We apply (27) to the information gap between the reference observer’s macrostate model, PMS (24), as P2 and the system’s observable but unknown microstate, Pmsa (26), as P1. Substituting (26) and (24) for P1 and P2, the DKL divergence collapses to:
$$D_{KL}(P_{msa} \,\|\, P_{MS}) = \ln\!\left( \frac{1}{P_{ms,a}} \right). \qquad (28)$$
Pms,a is the reference observer’s Bayesian expectation probability for the actual observable, but unknown, microstate ‘a’. The DKL information gap ranges from zero for Pms,a equal to one, representing complete knowledge of the actual observable microstate configuration ‘a’, to infinity for Pms,a equal to zero, representing complete misinformation.
Equation (28) shows that as Pms,a approaches one hundred percent, the DKL information gap approaches zero, and vice versa. As Pms,a approaches one, the macrostate model’s entropy (25) also approaches zero, representing the reference observer’s complete description of the observable microstate. These relationships are summarized by:
$$D_{KL} \rightarrow 0 \;\Leftrightarrow\; P_{ms,a} \rightarrow 1 \;\Rightarrow\; \sigma_c \rightarrow 0. \qquad (29)$$
A zero DKL implies a zero configurational entropy, but a zero configurational entropy does not imply a zero-information gap. A macrostate entropy of zero could have Pms,j equal to one for some microstate j not corresponding to the actual microstate, and Pms,a would equal zero, representing an infinite information gap. Configurational entropy describes a macrostate model’s precision, but precision does not mean accuracy. The DKL information gap, not configurational entropy, describes a macrostate model’s accuracy and a reference observer’s uncertainty of a system’s actual microstate configuration.
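The distinction between precision and accuracy can be made concrete with the coin-flip example. A sketch (Python; the near-zero probability stands in for a confidently wrong model, since (27) diverges at exactly zero):

```python
import math

def d_kl(p1, p2):
    """Kullback-Leibler divergence D_KL(P1 || P2), eq. (27)."""
    return sum(a * math.log(a / b) for a, b in zip(p1, p2) if a > 0.0)

actual = [1.0, 0.0]   # definite but unknown microstate: heads, eq. (26)

print(d_kl(actual, [1.0, 0.0]))   # 0.0   : precise and accurate (complete knowledge)
print(d_kl(actual, [0.5, 0.5]))   # ln(2) : zero information; uniform prior, cf. eq. (38)
print(d_kl(actual, [1e-12, 1.0 - 1e-12]))   # ~27.6: precise but wrong
```

The third case has near-zero configurational entropy (25) but a huge information gap (28), illustrating that configurational entropy measures precision while the DKL divergence measures accuracy.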

4. Transitions

We now shift from focusing on a system’s state to describing the transitions from an unstable state to a more stable state. The universe is in constant transition, and it is constantly dissipating and utilizing exergy for work.
A transition is described by transactional properties (Appendix A). From (6), (19)–(21), and the conservation of energy, a transition’s decline in exergy is described by:
$$\check{X} = \hat{Q} + \hat{w}_a = k_B T_a \hat{\sigma}_q + \hat{w}_a. \qquad (30)$$
$\check{X}$ is the transactional decline in a system’s exergy per transition; $\hat{w}_a$ is the utilization of exergy for external work on the ambient surroundings; and $\hat{\sigma}_q$ and $\hat{Q}$ are the irreversible productions of thermal entropy and ambient heat by dissipation. Equation (30) describes the per-transition changes in the energy state for a system with a fixed ambient surroundings. A state transition can involve more than just internal energy-state changes, however. It can involve changes in its ambient surroundings or changes in the system’s microstate configuration. A change in either is a change in the system’s thermocontextual state. We need to generalize (30) to describe general transitions, and we need to update TCI’s original Postulates Four and Five, which were limited to energy-state transitions for systems with fixed ambient surroundings [12].

4.1. Updated Postulates of Transitions

Before updating TCI’s Postulates Four and Five, we need to introduce two new macrostate properties (Appendix A). Configurational energy is defined as the portion of a system’s exergy that is inaccessible to the reference observer for work. A system’s exergy could be inaccessible either due to incomplete information on its observable microstates, or due to an elevated reference temperature of measurement, $T_{ref} > T_a$, or both. Configurational energy and its mechanical and thermal components are defined by:
$$C = C_m + C_q \equiv k_B T_{ref} D_{KL} + \int_{T_a}^{T_{ref}} \frac{T - T_a}{T}\, C_v\, dT \qquad (31)$$
$$C = k_B T_{ref} D_{KL} + C_v \left( T_{ref} - T_a - T_a \ln T_{ref} + T_a \ln T_a \right). \qquad (32)$$
$C_m$ describes the mechanical exergy that is not available to the observer due to incomplete information on the thermocontextual microstate, as described by its DKL information gap. $C_q$ describes the thermal exergy that is not available to the observer because its reference temperature of measurement is greater than the ambient temperature.
The balance of a system’s exergy is its accessibility. Accessibility is the portion of a system’s exergy that is accessible to the fixed reference observer for work at the reference temperature. Accessibility is defined by:
$$A \equiv X - C = (X_m - C_m) + (X_q - C_q) \qquad (33)$$
$$A = \left( X_m - k_B T_{ref} D_{KL} \right) + X_q - C_v \left( T_{ref} - T_a - T_a \ln T_{ref} + T_a \ln T_a \right) \qquad (34)$$
$$A = \left( X_m - k_B T_{ref} D_{KL} \right) + C_v \left( T - T_{ref} - T_a \ln T + T_a \ln T_{ref} \right), \qquad (35)$$
where exergy, accessibility, and configurational energy are all broken into their mechanical and thermal components, and thermal exergy $X_q$ is expanded by integration of (16) from the ambient temperature to the system temperature. The partitioning of exergy into accessibility and configurational energy is illustrated in Figure 5.
If the observer is at the ambient temperature, and if, in addition, the system’s actual configuration is precisely known, then the DKL information gap is zero, and from (34), the internal exergy is entirely accessible ($A = X_m + X_q = X$). If the observer’s reference temperature is greater than the ambient temperature, or if the reference observer’s information is incomplete, then the internal exergy is only partially accessible for work ($A < X$).
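A sketch of equations (31)–(35) for a constant heat capacity (Python; all numerical values are hypothetical):

```python
import math

k_B = 1.380649e-23  # J/K

def accessibility(X_m, D_KL, C_v, T, T_ref, T_a):
    """Accessible exergy A, eq. (35), assuming constant C_v."""
    C_m = k_B * T_ref * D_KL   # mechanical shortfall from the information gap, eq. (31)
    A_q = C_v * (T - T_ref - T_a * math.log(T) + T_a * math.log(T_ref))
    return (X_m - C_m) + A_q

C_v, T, T_a, X_m = 10.0, 400.0, 300.0, 50.0
X_q = C_v * (T - T_a - T_a * math.log(T / T_a))   # thermal exergy, integrated eq. (16)

# With T_ref = T_a and no information gap, all exergy is accessible: A = X.
print(accessibility(X_m, 0.0, C_v, T, T_a, T_a), X_m + X_q)   # equal values
# Raising T_ref above T_a (or opening an information gap) reduces A below X.
print(accessibility(X_m, 0.0, C_v, T, 350.0, T_a))
```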
We next define utilization as a transactional property, by the sum of external and internal work per transition:
$$\hat{\upsilon} \equiv \hat{w}_{ref} + \hat{A}_{int}. \qquad (36)$$
Utility ($\upsilon$) was introduced in [12], but it is recast here as transactional utilization ($\hat{\upsilon}$). Utilization is the sum of $\hat{w}_{ref}$, the external work (and accessible energy output) to the fixed reference, plus $\hat{A}_{int}$, the internal work of increasing the system’s accessible exergy per transition.
With macrostate and transactional properties defined, we can now introduce TCI’s updated Postulates Four and Five for general transitions. They update TCI’s original Postulates Four and Five by replacing exergy with accessibility. This extends their application to include changes in a system’s physical surroundings, or declines in the reference observer’s information gap as its knowledge of the system increases.
Updated Postulate Four: 
(Stability of states—The Minimum Accessibility Principle): A state with positive accessible exergy is unstable and has a potential to spontaneously transition to states with lower accessible exergy.
The updated Postulate Four defines the potential to spontaneously transition based on accessibility. A state has a spontaneous potential to transition to a state of lower accessible exergy and higher stability, as defined by a fixed reference observer and measurement framework. Maximum stability is defined by equilibrium with the ambient surroundings, with zero exergy and minimum accessibility. A non-equilibrium system can persist as a metastable state as long as it remains kinetically locked, but its potential to approach equilibrium remains, and this potential defines the thermodynamic arrow of time.
Updated Postulate Five: 
(Stability of transitions—The Maximum Utilization Principle): A transition seeks to maximize its production of work and accessibility.
Whereas Postulate Four addresses the stability of states, Postulate Five addresses the stability of transitions. The updated Postulate Five states that the transition with the highest utilization is the most stable.

4.2. Transitions, Dissipation, and Dispersion

From equations (30)–(35), we can express a general system’s loss in accessibility by:
$$\check{A} = k_B T_a \left( \hat{\sigma}_q + \hat{D}_{KL} \right) + \hat{w}_{ref}. \qquad (37)$$
$\check{A}$ is the decline in accessible exergy due to an increase in thermal entropy (dissipation), an increase in the DKL information gap (dispersion), and work output to the fixed reference. Equation (37) generalizes (30) for application to changes in the system, the ambient surroundings, or the reference observer’s knowledge.
The relationship between accessibility and the DKL information gap can be clarified by considering the configurational energy state from Figure 2(c). The reversible work to create the energy state in 2(c) could just as well have compressed the gas from the left or the right, as shown by the two configurations in Figure 6. The number of thermocontextual microstate configurations resolvable by the system’s perfect observer is N=2. The compressed gases in each of the two configurations share the same energy state, and they are degenerate. They have the same exergy, equal to the work $w_a = n k_B T_a \ln(2)$ of reversibly compressing an n-particle ambient gas to one-half volume.
Figure 6 illustrates three basic types of quasistatic transitions for the compression of the ambient gas to one-half volume, as described by a fixed ambient observer. A quasistatic transition proceeds at a negligible rate, so that dissipation due to friction can be neglected. The transitions all have the same reversible exchanges of work and ambient heat, but they differ in their exergy and information changes, as shown by the differences in $\hat{\sigma}_q$ and $\hat{D}_{KL}$ in the figure’s lower table. The upper table shows the exergy, accessibility, entropy, and DKL information gap for the system following each transition.
Transitions L and R utilize the work of input to reversibly and deterministically compress the gas from the left or right side to produce a definite configuration. The L and R configurations are known, and their DKL information gap is zero. There is zero loss of information, and this is the definition of determinism. The work of compression is reversibly stored and fully accessible, and so $X = A = w_a$, and from (6) and (19), the thermal entropy is $\sigma_q = Q / k_B T_a = -n \ln(2)$. These results are summarized in the upper table.
There is no production of thermal entropy, no loss of information, and no increase in the DKL information gap. Zero production of thermal entropy means no dissipation, and this is the definition of reversibility. The quasistatic transition from ambient state to L or R is reversible and deterministic, and this defines an equilibrium transition, as summarized in the lower table in Figure 6.
Transition M takes the work of input and produces a mixed macrostate, MLR, with a definite but unknown microstate configuration. The ambient observer’s information on the ambient gas is initially complete, but following the gas’s compression, it is incomplete. Transition M is statistical, and it produces configurational entropy $\hat{\sigma}_c = \ln(2)$, based on (25) and two equal-probability configurations. The production of configurational entropy and the contrapositive of equation (29) indicate an increase in the DKL information gap, and from (37), a loss of accessible exergy. The loss of accessibility is related to the loss of information; without knowing the configuration of the compressed gas, its exergy cannot be accessed for work with perfect efficiency and certainty. From Postulate Four, the loss of accessibility means that the process of producing configurational entropy and increasing the information gap is spontaneous.
The mixed macrostate’s definite configuration is unknown, but by Postulate Three, it is not hidden, and it is knowable to a perfect observer. The transition therefore preserves exergy. There is no irreversible production of thermal entropy, dissipation, or thermalization, and the transition is therefore reversible. The M transition to the mixed macrostate is statistical and spontaneous, but it is also reversible, as summarized in the lower table in Figure 6.
Transition Q applies the work of input to the zero-entropy ARS, which transitions to a thermalized configuration, QLR. The L and R configurations exist as a “pure” thermalized microstate configuration. The transition is from a single ambient configuration to a single thermalized configuration, and this makes it deterministic. Determinism means that there is no increase in uncertainty or DKL information gap. Thermalization, however, means that the transition produces thermal entropy, and it is irreversible.
The transition from the zero-entropy ambient microstate to the thermalized microstate QLR in Figure 6 reflects the irreversible production of thermal entropy ($\hat{\sigma}_q = \ln(2)$). From (37), the production of thermal entropy results in the irreversible loss of exergy and a loss of accessible exergy. With a loss of accessibility, the transition Q is deterministic, irreversible, and spontaneous, as summarized in the lower table in Figure 6.
The transactional properties of the three quasistatic transitions are summarized in the lower table in Figure 6. The quasistatic L and R transitions are reversible and deterministic; they are equilibrium transitions. The M and Q transitions are both spontaneous, with a loss of accessible exergy. For the M transition, the loss of accessibility is due to the loss of information by dispersion, and for the Q transition, it is due to the production of thermal entropy and the dissipation of exergy. An immediate consequence of (37) and Postulate Four is that a system has a spontaneous potential to dissipate exergy, to disperse information, or both.
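The classification in Figure 6’s lower table can be expressed compactly; a sketch using the per-transition values given in the text:

```python
import math

ln2 = math.log(2)

# Per-transition thermal-entropy production and information-gap change for the
# three quasistatic compressions, as described in the text (Figure 6, lower table).
transitions = {
    "L/R": {"sigma_q": 0.0, "D_KL": 0.0},   # equilibrium: no dissipation, no dispersion
    "M":   {"sigma_q": 0.0, "D_KL": ln2},   # mixed macrostate: dispersion only
    "Q":   {"sigma_q": ln2, "D_KL": 0.0},   # thermalized: dissipation only
}

for name, t in transitions.items():
    reversible    = (t["sigma_q"] == 0.0)              # no thermal-entropy production
    deterministic = (t["D_KL"] == 0.0)                 # no increase in the information gap
    spontaneous   = (t["sigma_q"] + t["D_KL"]) > 0.0   # loses accessibility, eq. (37)
    print(f"{name}: reversible={reversible}, deterministic={deterministic}, "
          f"spontaneous={spontaneous}")
```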
The dissipation of exergy by irreversible entropy production is precisely the subject of the Second Law of thermodynamics. The Second Law and TCI’s original Postulate Four (minimum exergy), are both about Q-type transitions and the irreversible dissipation of exergy. In contrast, the spontaneous dispersion of information is precisely the subject of MaxEnt [4,7]. MaxEnt is about M-type transitions to mixed macrostates, and the spontaneous and statistical, but reversible, dispersion of microstate configurations. Whereas dissipation produces thermal entropy, dispersion increases the DKL information gap.
An M-type transition increases a system’s information gap with respect to a fixed reference observer, but a system can only change until it reaches equilibrium with its ambient surroundings. At equilibrium, the mixed macrostate can exist in any one of multiple possible configurations, but they all have zero exergy, and the fixed reference observer has zero information on the system’s actual microstate. With zero information, the reference observer assigns all microstates equal unbiased expectations of 1/N, and the DKL information gap (27) immediately reduces to:
$$D_{KL}\!\left( P_{msa} \,\middle\|\, \left[ \tfrac{1}{N}, \tfrac{1}{N}, \ldots, \tfrac{1}{N} \right] \right) = \ln N = \sigma_{max}. \qquad (38)$$
At equilibrium, the ambient reference observer has zero information, and the DKL information gap reduces to the maximum theoretical configurational entropy, regardless of which microstate configuration (26) the physical state actually transitioned to.
The implications of Postulate Four are expressed by three corollaries:
Corollary 4-1: 
The Second Law of Thermodynamics—thermodynamic entropy is irreversibly produced and can never decline.
Corollary 4-2: 
The Minimum Exergy Principle (TCI’s original Postulate Four)—given a fixed ambient temperature, exergy is irreversibly dissipated to ambient heat.
Corollary 4-3: 
MaxEnt—a non-dissipative system with fixed ambient surroundings has a spontaneous potential for dispersion to increase the DKL information gap and to drive configurational entropy to its maximum.
Corollaries 4-1 and 4-2 describe the irreversible dissipation of Q-type transitions. Corollary 4-1 (the Second Law) states that thermodynamic entropy irreversibly increases, and Corollary 4-2 states that exergy is minimized. Corollary 4-3 describes M-type transitions, which are statistical and spontaneous but reversible, in principle. It states that spontaneous dispersion drives an increase in the DKL information gap, but only up to the point of equilibrium and maximum configurational entropy.

4.3. Efficiency and Refinement

Postulate Four addresses the stability of states. In this section, we apply Postulate Five to address the stability of transitions and processes, with both fixed and changing environments. Postulate Five states that the most stable transition has the highest utilization. Equation (37) showed the accessible exergy decline per transition is related to: (1) work output, (2) thermal entropy production (dissipation), (3) ambient temperature, and (4) the increasing DKL information gap (dispersion). We introduce four corollaries of Postulate Five addressing the stability of transitions based on each of these factors.
Corollaries 5-1 and 5-2 apply to stationary dissipative processes with fixed ambient temperature and information gap. Stationary does not mean static, however, and a stationary dissipative system is not a state. A stationary system’s time-averaged properties of state are constant, but they can fluctuate around fixed averages, and dissipative systems have positive flows of components and energy.
Crecraft [12] referred to stationary dissipative systems as homeostates. They are also referred to as nonequilibrium stationary states (NESS) by Ribó and Hochberg [16,17], and as flow networks by Robert Niven et al. [18,19]. Flow networks, NESS, and homeostates all describe a stationary dissipative system as a network of transitions. Figure 7 illustrates a homeostate as a network of directed component links connecting irreversible transition nodes. Components have well-defined state properties, but as they transition, their exergy is partially dissipated to heat and partially output as work or exergy to the environment or to other nodes within the network.
A dissipative system’s net efficiency is equal to the ratio of external work output to exergy input, and it cannot exceed one hundred percent. For a network of dissipative nodes, however, the system’s total work includes the internal work on other network nodes. A stationary system’s total efficiency is defined by the ratio of total work to exergy input. Autocatalytic loops can recycle energy internally and yield total work efficiencies exceeding one hundred percent [12]. In the limit of zero dissipation, the total efficiency of the dissipative network in Figure 7 approaches two hundred percent.
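To see how total efficiency can exceed one hundred percent, consider a minimal sketch (Python). This is a generic two-node chain, not the specific network of Figure 7: internal work counts toward total work, and a dissipation fraction f is assumed at each node:

```python
def total_efficiency(f):
    """Total work (internal + external) per unit exergy input for a
    two-node chain with dissipation fraction f at each node."""
    w_in = 1.0
    internal_work = w_in * (1.0 - f)            # node A's work on node B
    external_work = internal_work * (1.0 - f)   # node B's work on the environment
    return (internal_work + external_work) / w_in

for f in (0.5, 0.1, 0.01, 0.0):
    print(f, total_efficiency(f))   # approaches 2.0 (200%) as dissipation -> 0
```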
The non-linearity of far-from-equilibrium systems allows for multiple homeostates and dissipative processes. Each path in Figure 8 represents a different process and a distinct homeostate that can exist for a given stationary environment. The dissipative processes and homeostates can have different network structures, flow rates, and efficiencies. Figure 8 also shows the observation and measurement of the homeostate transitions by a reference observer and its measurement device at the model’s reference temperature.
Each path in Figure 8 represents a stationary dissipative process and homeostate with high-exergy inputs and low-exergy wastes. Each homeostate path generally has multiple transitions and intermediate states. Internal work is the work by nodes on components within the dissipative network (Figure 7). Internal work is based on component flows and accessibility increases that are observable by the reference observer. Postulate Five calls for a stationary dissipative system to increase its utilization by increasing its internal and external work per unit of transition.
A system can increase its utilization in two ways: (1) by growth (or replication), or (2) by increasing efficiency. When resources are available, growth happens. Fires spread, and bacteria multiply. This is expressed by Corollary 5-1:
Corollary 5-1: 
The Growth Principle. A dissipative system expands to its stationary environment’s carrying capacity.
As a system expands, it increases its rates of exergy input, dissipation, and work. A larger system has a higher rate of work compared to a smaller system, and a larger stationary system is therefore always more stable, creating a spontaneous drive for growth.
There is always a limit to sustainable growth, however, due to finite resources. A mature ecosystem, for example, is a stationary homeostate in which its various species and the environment are in dynamic balance. The ecosystem is at the environment’s carrying capacity, and it cannot expand. The system can continue to increase its work rate and evolve, however, by reducing its dissipation and increasing its total efficiency. This is expressed as Corollary 5-2:
Corollary 5-2: 
The Maximum Efficiency Principle (TCI’s original Postulate Five). A dissipative system with a stationary environment spontaneously transitions to a stationary process of higher total efficiency.
The next corollary is thermal refinement. It applies to a fixed-temperature, non-dissipating (fixed mechanical exergy), and non-dispersive (fixed DKL) system with changing ambient surroundings. Differentiating (34), the change in accessibility with ambient temperature is given by:
$$\frac{dA}{dT_a} = C_v \ln\!\left( \frac{T_{ref}}{T} \right). \qquad (39)$$
If the system and its ambient surroundings are initially in thermal equilibrium, then the initial ambient temperature and system temperature, $T$, are equal. From (7), the fixed reference temperature and system temperature are also equal, and from (39), $dA/dT_a$ equals zero. For a system initially in thermal equilibrium, a change in the ambient temperature has no effect on accessibility, and from Postulate Five, there is no potential for its change.
If the system and ambient temperature are not initially equilibrated, then from (7), the fixed reference temperature and the ambient temperature are less than the system temperature, and $dA/dT_a$ is less than zero. If a system is not thermally equilibrated with its ambient surroundings, then from (39), a decline in the ambient temperature increases the system’s accessibility. From Postulate Five, the ambient temperature therefore has the potential to spontaneously decline.
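Equation (39) can be checked against (35) by finite differences; a sketch with hypothetical values:

```python
import math

C_v, T, T_ref, X_m = 10.0, 400.0, 350.0, 0.0

def A(T_a):
    """Accessibility, eq. (35), as a function of the ambient temperature."""
    return X_m + C_v * (T - T_ref - T_a * math.log(T) + T_a * math.log(T_ref))

T_a, h = 300.0, 1e-4
numeric  = (A(T_a + h) - A(T_a - h)) / (2 * h)    # central finite difference
analytic = C_v * math.log(T_ref / T)              # eq. (39)
print(numeric, analytic)   # both ~ -1.335 J/K: lowering T_a raises A when T > T_ref
```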
The TCI recognizes thermal refinement as a special-case transition that increases a system’s accessibility by lowering the system’s ambient temperature. Thermal refinement is a special case of Postulate Five, as expressed by:
Corollary 5-3: Thermal Refinement. For a non-equilibrium system with positive thermal energy, its ambient temperature has a spontaneous potential to decline, resulting in an increase in a fixed reference observer's access to the system's thermal exergy.
Corollary 5-3 provides a drive for a dissipative system to discharge waste heat to a lower-temperature environment, thereby increasing the accessibility of a system’s thermal energy for work.
Thermal refinement does more than extract exergy from randomized entropic energy, however; it derandomizes a system’s thermalized microstate potentialities. Interaction with the newly cooled surroundings triggers random selection of a thermalized microstate potentiality and instantiates it as a definite and observable microstate configuration with accessible exergy.
The fourth and final corollary is configurational refinement of a fixed energy state with zero dissipation, a fixed ambient temperature, and fixed thermal entropy. As noted in §3.3, an energy state can have any of multiple different microstate possibilities, all consistent with the system's physical and boundary constraints. The reference observer's knowledge of the energy state's macrostate is expressed as a probability distribution over the available microstate potentialities (24). Each probability expresses the observer's expectation that a given microstate potentiality will be instantiated upon measurement.
The reference observer's macrostate model changes as the observer gains information by observation (or loses it by memory loss). As shown by (37), for a non-dissipative system with $\hat{\sigma}_q = 0$, a narrowing DKL information gap (28) provides an external observer or agent greater access to the system's exergy for work. This is expressed by Corollary 5-4:
Corollary 5-4: Configurational Refinement. An external observer or agent has a spontaneous potential to reduce its DKL information gap with a positive-exergy system in order to increase its access to the system's exergy for work.
Whereas Corollary 4-3 drives a system to increase its configurational entropy, Corollary 5-4 provides an important counter by driving an external agent to spontaneously narrow its information gap. This gives the agent greater access to the system's exergy, in compliance with Postulate Five. Configurational refinement means a narrowing information gap; it is the informational analogue of thermal refinement.
The mandate of Postulate Five is to increase a transition’s utilization (36). Corollaries 5-3 and 5-4 provide the drive to reduce uncertainty through thermal and configurational refinement and to increase a system’s accessible exergy. Corollaries 5-1 and 5-2 maximize utilization of that accessible energy first by expanding to a stationary environment’s carrying capacity, and then by maximizing efficiency.

5. Applications

This section discusses the applications of Postulates Four (minimum accessibility of state) and Five (maximum utilization of process) and their corollaries. Postulate Four's Corollary 4-1 is the Second Law of thermodynamics. As Albert Einstein famously wrote [20], "It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown." The Second Law has been thoroughly vetted, and it needs no further discussion.
Corollary 4-2 was introduced in [6] as the Minimum Exergy Principle and as TCI’s original postulate four. It is closely related to the Second Law of thermodynamics, and it states that exergy is irreversibly dissipated.
Corollary 4-3 recognizes MaxEnt as a special case of Postulate Four. The TCI recognizes configurational entropy and MaxEnt as distinct from thermodynamic entropy and the Second Law of thermodynamics. The Second Law expresses irreversible dissipation and the production of thermodynamic entropy; MaxEnt expresses the spontaneous dispersion of information and production of configurational entropy. Section 5.1 applies MaxEnt to the double-slit experiment to explain the well-documented but previously unexplained results, including random symmetry breaking.
Postulate Five's Corollary 5-1 (the growth principle) simply says that, given the opportunity, a dissipative system will expand to the carrying capacity of its environment. Growth increases the system's rate of work, and it is a special case of Postulate Five. The spontaneous potential for growth is a widely recognized phenomenon, but it has commonly been misinterpreted as a consequence of a proposed principle of maximum entropy production [21-23]. Rates of entropy production and work are both correlated with size and transition rate, but growth is driven by the increase in the rate of work, not the rate of entropy production. Whirlpools, for example, have a higher rate of total work but a lower rate of entropy production [12]. Their spontaneous formation provides an important counterexample to the maximum entropy production principle.
Corollary 5-2 was introduced in [12] as TCI's original Postulate Five, the Maximum Efficiency Principle. Given a fixed environment, Corollary 5-2 preferentially selects higher-efficiency dissipative processes. The article illustrated the principle with numerous applications, based on well-documented examples of spontaneous self-organization, including whirlpools. The maximum efficiency principle defines the arrow of increasing functional complexity.
Given a stationary environment, Corollaries 5-1 and 5-2 provide two paths for a dissipative system to increase its rate of work. When resources are available, systems expand to their environment’s carrying capacity. When resources become constrained, the drive for continued growth leads to competition for limited resources, and at some point, continued growth becomes unsustainable. The system can continue to increase its rate of work, however, by increasing efficiency through cooperation and increasing the organization of dissipative networks.
Corollary 5-3 addresses thermal refinement. It states that a system has a spontaneous potential to reduce the temperature of heat discharge in order to increase the accessibility of its energy for work. For a dissipative system, thermal refinement slows or even reverses the decline in accessible exergy mandated by Postulate Four. If the 2.7 K cosmic microwave background (CMB) temperature is taken as the ambient temperature of the universe, then we can recognize cosmic expansion, which reduces the CMB temperature, as a mechanism of thermal refinement, in compliance with Corollary 5-3.
Thermal refinement also leads to instantiations of new microstate configurations. When the early universe cooled to about 2 × 10^12 K, it became unstable with respect to both baryons and antibaryons, but the transition almost exclusively produced baryons [24]. This asymmetrical transition to baryons and matter over antimatter is referred to as baryogenesis, and it is considered one of the outstanding problems of modern physics. Crecraft showed that Postulate Five promotes the synchronization of parallel transitions to reduce dissipation and increase work output [12]. The synchronized instantiation of either baryons or antibaryons, rather than a mixture of the two, would reduce dissipation through mutual annihilation, and it would increase the work of particle creation, in compliance with Postulate Five. The instantiation of matter over antimatter may simply have been a spontaneous, synchronized process of random symmetry breaking to increase the transition's efficiency.
Corollary 5-4 can explain the spontaneous decline in configurational entropy of the SARS-CoV-2 virus documented by Vopson and Lepadatu [1]. The virus has the role of external agent, and the virus's target cells are the system. The change in the virus's RNA reflects a closing of the DKL information gap between the information encoded in the virus's RNA and the information encoded in the nucleotides of its target cells. Narrowing the information gap increases the virus's access to its target's energy. In the case of the virus, the narrowing of the information gap is achieved through random mutation and selection. Corollary 5-4 provides a selection criterion that favors mutations enabling the virus to access its target's energy and to increase its work of reproduction. From (29), the decline in the DKL information gap means a decline in configurational and information entropy, as described by Vopson and Lepadatu's original law of infodynamics.
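A toy mutation-and-selection loop, offered here only as an illustrative sketch of Corollary 5-4 (the sequences and parameters are hypothetical, not data from [1]), shows how random mutations filtered by a selection criterion steadily close an information gap with a fixed target:

```python
import random

random.seed(1)
ALPHABET = "ACGU"
N = 40
target = "".join(random.choice(ALPHABET) for _ in range(N))  # the target cell's code

def gap(seq):
    """Mismatch count against the target. With four symbols per unknown site,
    the information gap scales as (mismatches) * ln(4)."""
    return sum(a != b for a, b in zip(seq, target))

agent = "".join(random.choice(ALPHABET) for _ in range(N))   # the 'virus'
for _ in range(3000):
    i = random.randrange(N)                                  # random point mutation
    mutant = agent[:i] + random.choice(ALPHABET) + agent[i + 1:]
    if gap(mutant) <= gap(agent):                            # selection keeps gap-narrowing mutations
        agent = mutant

print(gap(agent))  # approaches zero: the information gap closes
```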
We note that Corollary 5-4 does not apply to the decline in entropy for Vopson and Lepadatu’s first example with the magnetic storage device. Corollary 5-4 applies to non-dissipative M-type transitions, but thermalization of the magnetic storage medium is a Q-type transition. Increasing thermalization reduces a perfect observer’s resolution and the number of resolvable microstates. From (35), this reduces the configurational entropy. However, thermalization is a dissipative process, and it leads to spontaneous decreases in the system’s exergy and accessibility. The magnetic storage device’s entropy decline in their first example is instead a consequence of Corollary 4-1 (Second Law of thermodynamics).
Corollaries 5-4 and 4-3 together underpin Bayesian statistical modeling and analysis, which has become a powerful tool for analyzing complex systems [25]. E. T. Jaynes showed that maximizing entropy produces an unbiased best-fit model, based on the available information [4]. From Corollary 4-3, a system naturally maximizes the configurational entropy of its macrostate model; this is the first key for the application of Bayesian statistical analysis. From Corollary 5-4, an external agent has a spontaneous drive to improve its macrostate model's accuracy, as quantified by minimizing the DKL information gap; this is the second key. By successively updating the system's macrostate model with new information and allowing it to maximize its entropy, an observer (researcher) can progressively improve its macrostate models of complex systems.
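A minimal numerical sketch of this loop, under assumed parameters (an 8-state system and an 80%-reliable measurement, both hypothetical), starts from the uniform maximum-entropy model (Corollary 4-3) and shrinks the DKL information gap by Bayesian updating (Corollary 5-4):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8            # available microstates
actual = 3       # the system's definite but unknown microstate
P_CORRECT = 0.8  # assumed reliability of each measurement

# Maximum-entropy (uniform) prior: the unbiased model given no information.
model = np.full(N, 1.0 / N)

def observe():
    """Noisy measurement: reports the actual microstate with probability 0.8,
    otherwise one of the other states uniformly."""
    if rng.random() < P_CORRECT:
        return actual
    return int(rng.choice([s for s in range(N) if s != actual]))

for step in range(15):
    y = observe()
    likelihood = np.full(N, (1 - P_CORRECT) / (N - 1))  # P(report y | state != y)
    likelihood[y] = P_CORRECT                           # P(report y | state == y)
    model = likelihood * model                          # Bayesian update
    model /= model.sum()
    d_kl = -np.log(model[actual])  # information gap to the definite microstate
    print(f"step {step:2d}: D_KL = {d_kl:.3f}")  # declines toward zero on average
```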
Recent applications of Bayesian statistical modeling include problems in astrophysics [26], rapid medical diagnostics [27,28], thermodynamic computing [29,30], artificial intelligence and machine learning [31,32], ecological modeling [33], macroeconomics [34], imaging theory and applications [35-37], network analysis [18,19,38], and plasma science [39].
A final application of Corollary 5-4 is discussed in Section 5.2. It illustrates a simple model for the abiogenic origin of self-replicating nucleotides.

5.1. MaxEnt and the Double Slit Experiment

In the double slit experiment, a quantum of energy is emitted from a source as a particle, and it is detected as a particle by its point-like impact on a detector screen [40]. If a partition with double slits is placed between the source and the detector screen, the accumulated impacts display an interference pattern, characteristic of waves, even when particles are transmitted one at a time. If a "which-slit detector" (WSD) is inserted behind the slits and activated, it interacts with the quanta, and the wave interference pattern disappears. Richard Feynman famously described the double-slit experiment as the "only mystery," "which has in it the heart of quantum mechanics" [41]. The TCI and the following discussion, updated from [42], offer an important new insight into this behavior.
Figure 9 illustrates the double-slit experiment. A quantum of energy is emitted as a definite particle, and it transitions to a mixed macrostate comprising one of many possible point-like configurations, each recorded as an impact on a detector screen (e.g., point B). Between the points of emission and detection, the particle is in an M-type transition from a definite microstate configuration to a mixed macrostate of higher configurational entropy. With the WSD deactivated, multiple transitions generate a statistically mixed macrostate with a wave-like interference pattern, represented by the probability distribution profile C in Figure 9. The profile represents the cumulative effect of individual transitions and the record of multiple instantiations of definite microstate configurations.
With the WSD activated, a definite particle passes through one slit or the other without any interference. Why a particle exhibits wave interference with no WSD, but no interference with a WSD, is the mystery to which Feynman referred. The particles' exergies are assumed to be completely dissipated at the detector, so all transitions have zero utilization (36), and they are all equally probable under Postulate Five. The only difference between the overall transitions with and without the WSD is the probability distributions of their final mixed macrostates, as recorded at the detector screen.
Figure 10 shows intensity profiles for the interference and diffraction of light. The profiles are calculated from the equations given in [43], using parameters shown in Table 2. The intensity profiles also represent probability distribution profiles for the detection of individual photons. All profiles are normalized to one and are mapped over a span of 400 possible configuration points, based on the model’s detector width and resolution (Table 2).
The red profile in Figure 10 shows the profile for wave interference by the double slits. It describes the probability distribution for individual particle transitions without a WSD. The green and blue profiles show the profiles for wave diffraction of a particle from the left and right slits, respectively. The purple profile shows the normalized sum of the blue and green profiles. It describes the probability distribution with the WSD activated, for individual particle transitions from source to detector through one slit or the other.
If the WSD is deactivated, the physical results conform to the red profile’s macrostate model as a wave passing through the slits and producing an interference probability distribution. If the WSD is activated, the results conform to the purple profile’s macrostate model’s two-step transition.
Table 3 shows the calculated configurational entropy changes for each transition based on the probability distribution profiles and equation (25). Row one shows the entropy for wave interference (red profile), in the absence of the WSD. The entropy change with no WSD is 4.69. The remaining rows show the configurational entropy changes for the two-step transition with the WSD activated.
Row two shows the configurational entropy change for the first transition, from the source to a definite particle at one of the two slits. The entropy change is ln(2) = 0.69, reflecting the particle's two equal-probability microstate configurations at the slits. Row three shows the entropy change for the second transition, from the definite particle at one of the slits to the detector screen. Its probability distribution is shown in Figure 10 by either the green profile or the blue profile; the configurational entropy for either profile, and thus for the second transition, is 5.02. The configurational entropy for the overall two-step transition from the source to the detector screen with the WSD is 5.71 (row 4), equal to the sum of the two steps. This is greater than the configurational entropy for the single-step transition without the WSD (row 1).
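The entries in Table 3 can be checked numerically. The sketch below is our reconstruction, using the standard Fraunhofer interference and diffraction formulas from [43] and the Table 2 parameters; exact values may differ slightly with the geometry details:

```python
import numpy as np

# Geometry from Table 2, in units of the wavelength (lambda = 1).
a, d, L = 7.0, 30.0, 300.0            # slit width, slit separation, slit-detector distance
y = np.arange(-100.0, 100.0, 0.5)     # 400 detector points at 0.5-lambda resolution

def entropy(profile):
    """Configurational entropy, equation (25): -sum P_i ln(P_i)."""
    p = profile / profile.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# np.sinc(x) = sin(pi x)/(pi x), so np.sinc(a*sin_t) is the single-slit
# envelope with beta = pi * a * sin(theta) / lambda.
sin_t = y / np.hypot(y, L)
red = np.cos(np.pi * d * sin_t) ** 2 * np.sinc(a * sin_t) ** 2  # double-slit interference

# Single-slit diffraction from the left and right slits at -/+ 15 lambda.
sin_b = (y + 15.0) / np.hypot(y + 15.0, L)
sin_g = (y - 15.0) / np.hypot(y - 15.0, L)
blue, green = np.sinc(a * sin_b) ** 2, np.sinc(a * sin_g) ** 2

print(entropy(red))                # ~4.69: no WSD (Table 3, row 1)
print(np.log(2))                   # 0.69: WSD on, slit choice (row 2)
print(entropy(green))              # ~5.02: WSD on, slit to detector (row 3)
print(np.log(2) + entropy(green))  # ~5.71: overall two-step transition (row 4)
```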
With no WSD, the double slits' symmetry imposes symmetry on the macrostate model. The particles pass through the pair of slits symmetrically, with wave-like interference. This is represented by the red profile, with a configurational entropy of 4.69. With the WSD activated, the double slits' symmetry is broken, and the particle has the opportunity to pass asymmetrically through one slit or the other. The overall two-step asymmetrical transition is represented by the purple profile, with a configurational entropy of 5.71. With the WSD activated, the system spontaneously breaks its symmetry by selecting one slit or the other to produce the higher-entropy, no-interference macrostate, in compliance with MaxEnt (Corollary 4-3).

5.2. Configurational Refinement and Replication

In a groundbreaking experiment in 1952, Stanley Miller and Harold Urey demonstrated that energy input to a mixture of simple gases can produce an array of amino acids [44]. More recently, Karo Michaelian and his colleagues have revealed detailed kinetic steps for the synthesis of various complex molecules from simple inorganic precursors through interaction with ultraviolet light [45]. The abiogenic creation of complex molecules from the action of energy on chemical precursors appears to be a common phenomenon, and it is an important step toward the origin of life. The kinetics of the abiogenic creation of self-replicating nucleotides, the key to the origin of life, is more problematic, however. It is an active area of investigation [46], but there has been little success in finding a general principle to explain it.
This section provides insight into self-replication based on Corollary 5-4, which provides a spontaneous potential for an external agent to acquire information on a system. To illustrate this, we consider a toy M-type process, shown in Figure 11. The figure illustrates the assembly of a statistical array of A and B monomers by the work of taking monomers from the ambient surroundings and adding them to a monomer array. The produced array has a definite, but unknown, configuration of A's and B's. We initially assume reversibility, so there is no dissipation, no thermalization, and no random rearrangement of monomers once they are added.
If we assume equal work for adding an A or B to the array, the Boltzmann-Gibbs function describes the addition of A or B at any point as equally probable. If an observer has no additional information, then the macrostate model is the mixed-state array $P_{ms} = [1/2^N, 1/2^N, \ldots, 1/2^N]$ of length $2^N$, with a configurational entropy and DKL information gap equal to $N\ln(2)$ (each of the $2^N$ possible sequences has probability $2^{-N}$, and the actual sequence is definite, so $\sigma_c = D_{KL} = \ln 2^N = N\ln 2$). The array has a positive exergy, equal to $N\Delta w$, but its accessibility is less than this. From (35), the array's accessibility is given by:
$$A = X - C = N\Delta w - k_B T_a D_{KL} \qquad (40)$$
From equation (40), reducing the array's DKL information gap increases the accessibility of its energy for work. Corollary 5-4 therefore provides an external observer with a spontaneous potential to acquire information and reduce the system's DKL information gap.
One way to reduce the DKL information gap is to create a template that can catalyze the creation of a known sequence. A template is essentially a copy or mirror image of an array to be replicated. Given a template and a procedure that can use it, the template can be used and reused to create an array with a known sequence and a zero DKL information gap. From (40), a DKL information gap of zero maximizes the array's accessible exergy to equal its exergy. Another way to increase the transactional output of accessible exergy is to increase the array length. Given an appropriate source of energy, Corollary 5-4 provides the spontaneous potential to create a template and procedure to replicate arrays of increasing length, exergy, and information, thereby increasing access to the source's energy.
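A short numerical sketch of equation (40), with illustrative values for N, Δw, and k_B T_a (assumed here, not given in the article), shows how a template raises accessibility to the full exergy:

```python
import math

kB_Ta = 1.0   # ambient thermal energy scale k_B * T_a (illustrative units)
dw = 2.0      # work (Delta w) stored per monomer added, same units
N = 20        # array length

X = N * dw    # array exergy

# No template: 2^N equally probable sequences, D_KL = N ln 2.
A_unknown = X - kB_Ta * N * math.log(2)

# Template in hand: the sequence is known exactly, D_KL = 0.
A_template = X

print(f"exergy X             = {X:.1f}")
print(f"A without a template = {A_unknown:.2f}")   # less than X
print(f"A with a template    = {A_template:.2f}")  # accessibility equals exergy
```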
The toy model in Figure 11 ignores dissipation and thermalization. In reality, dissipation cannot be eliminated, and thermalization inevitably results in random mutations in the transcription of the template. However, Postulate Five’s Corollaries 5-1 and 5-2 drive the transcription process to increase the transactional output of accessibility by reducing dissipation and random errors from thermalization.
The kinetics of the abiogenic origin of life are exceedingly complex, but Postulate Five provides a spontaneous potential and a general principle for reproducing increasingly large amounts of accessible exergy and information. It provides the essential selection criterion for guiding the evolution of self-replicating nucleotides to increase their information content and to reduce random transcription errors and mutations.

6. Summary and Conclusions

The thermocontextual interpretation (TCI) was initially introduced as a generalization of Hamiltonian mechanics. The TCI describes systems with respect to an ambient reference state in equilibrium at a positive ambient temperature of the surroundings [6]. It recognizes exergy, entropy, irreversibility, and randomness as fundamental physical properties of states and transitions.
The TCI is updated here to address general transitions, which can include changes in a system’s ambient surroundings. To address these transitions, the TCI describes a system’s macrostate model, which is defined for a fixed reference observer and measurement framework. Configurational entropy and the DKL information gap are macrostate properties describing, respectively, the observer’s imprecision and inaccuracy in its description of a system’s microstate configuration. The TCI also introduces the macrostate property, accessibility, which is exergy accessible for work by a system’s external observer.
TCI’s updated Postulate Four defines the stability of states by minimizing a state’s accessibility. It states that a system has a spontaneous potential to decrease accessible exergy for work by a fixed reference observer. Postulate Four’s Corollary 4-1 is the Second Law of thermodynamics, which describes the irreversible increase in thermodynamic entropy. Corollary 4-2 is related to the Second Law, as described by the TCI’s original Postulate Four [6]. Both describe irreversible dissipation of exergy. Corollary 4-3 is MaxEnt, which is the statistical mechanical interpretation of the Second Law. It describes the spontaneous production of configurational entropy and the increase in an observer’s uncertainty due to dispersion by classical chaos or wavefunction collapse. Section 5.1 applies MaxEnt to the quantum double-slit experiment to explain why particles spontaneously choose one slit or another and avoid generating an interference pattern when there is a “which-slit” detector in place. The TCI’s Postulate Four fully embraces both MaxEnt and the Second Law as fundamental physical principles, and it provides a unified foundation for mechanics and thermodynamics.
Whereas Postulate Four states that dissipation and dispersion reduce a system's accessible energy, Postulate Five states that the most stable transition maximizes its output of work and accessible energy. Corollary 5-1 (the growth principle) states that a dissipative system has a spontaneous potential to expand to its environment's carrying capacity. The maximum efficiency principle (Corollary 5-2) provides a drive for stationary dissipative systems to spontaneously self-organize, and it defines the arrow of functional complexity. Corollaries 5-1 and 5-2 describe two paths by which a dissipative system with a stationary environment can maximize utilization of its available resources.
Other special cases of Postulate Five address the stability of non-stationary transitions that are associated with a changing ambient temperature (Corollary 5-3) or associated with the observer’s gain in information (Corollary 5-4).
Corollary 5-3 (Thermal refinement) describes the spontaneous increase in accessible exergy and the random instantiation of microstate potentialities due to a declining ambient temperature. Cosmic expansion and the symmetry-breaking production of matter over antimatter during the early universe are two outstanding problems of physics. Thermal refinement provides a simple explanation for both cosmic expansion and the predominance of matter over antimatter.
Corollary 5-4 (configurational refinement) counters MaxEnt's push for systems to increase their configurational entropy. It describes an external agent's spontaneous potential to acquire information on an existing or potential energy source. Acquiring information reduces the information gap; it increases the accuracy of the agent's description of the system's actual microstate configuration; and it increases the agent's access to the system's energy for work. Corollary 5-4 is a reformulation of Vopson and Lepadatu's Second Law of Infodynamics [1]. It addresses the accuracy of information, and it is a much stronger statement than the original law, which focused on statistical entropy and the precision of information. Section 5.2 describes a simple model to illustrate how Corollary 5-4 can lead to the evolution of self-replicating arrays with increasing exergy, information, and transcription fidelity.
The thermocontextual interpretation represents an alternative to the existing Hamiltonian conceptual framework of mechanics, and it can resolve many outstanding questions of physics, chemistry, and biology. Given its generality, it is anticipated to find applications across a wide range of other fields, including economics and the social sciences.

Funding

This research received no external funding.


Conflicts of Interest

The author declares no conflicts of interest.

Appendix A. State, Macrostate-Model, and Transactional Properties

Physical State and State Properties: Based on perfect measurement and observation

Ambient temperature and pressure, $T_a$ and $P_a$: The minimum temperature and pressure of the system's physical surroundings with which it could interact.

Ambient surroundings: Idealized equilibrium surroundings at $T_a$ and $P_a$.

Ambient reference state (ARS): State in thermodynamic equilibrium with the ambient surroundings; defines the zero values for thermocontextual state properties.

Thermocontextual microstate: Complete description of the system's mechanical configuration of irreducible parts resolvable by a perfect observer. A system's resolution is limited by thermal randomness at $T_a$.

Perfect observer: Has perfect resolution at the ambient temperature and complete information on the thermocontextual microstate. Observer and information change with the changing system and $T_a$.

Energy state: Defined by perfect macroscopic (classical) measurement of temperature, energy, and exergy with respect to the ARS.

Physical state: Defined by the energy state plus the thermocontextual microstate.

Thermocontextual macrostate model: Complete description of a system's physical state by perfect measurement and observation.

Total energy, $E_{tot}$: Total energy relative to the reference state at absolute zero (0 K).

Volumetric heat capacity, $C_v = (\partial q/\partial T)_v$: Change in a system's thermal energy with change in temperature at fixed volume.

Ambient-state energy, $E_{as} = \int_0^{T_a} C_v(T)\,dT$: The ambient reference state's energy with respect to zero kelvin.

System energy, $E = E_{tot} - E_{as} = X + Q$: System energy with respect to the ambient reference state at $T_a$.

Thermal energy (heat), $q = \int_{T_a}^{T} C_v\,dT$: System's thermal energy with respect to the ARS; equals the system's energy loss as it irreversibly cools to the ambient temperature.

Exergy, $X \equiv X_m + X_q$: System's reversible work potential on the ambient surroundings.

Thermal exergy, $X_q = q(T - T_a)/T$: Reversible work potential by thermal energy on the ARS.

Mechanical exergy, $X_m \equiv X - X_q = w_a$: System's work potential on the ARS after cooling to $T_a$.

Entropic energy, $Q \equiv E - X = q_a$: Randomized thermal energy at $T_a$ with zero work potential.

Thermocontextual entropy, $S \equiv \int_{T_a}^{T} dq/T = Q/T_a$: $dS = dq/T$ is the thermodynamic entropy.

Thermal entropy, $\sigma_q = S/k_B$: Dimensionless thermodynamic entropy.

Macrostate Model and Transactional Properties: Based on fixed reference observer and temperature

Fixed reference observer: Has complete and fixed information on the thermocontextual microstate with respect to its ambient temperature at time zero.

Fixed reference temperature, $T_{ref}$: Set to the system's ambient temperature at time zero; the reference temperature for reversible measurement of the energy state.

Configurational entropy, $\sigma_c = -\sum_{i=1}^{N} P_i \ln(P_i)$: Expresses a reference observer's expectation values for a system's microstate configurations. The sum is over the N available microstates, objectively defined by a perfect observer. The Bayesian $P_i$'s define the reference observer's macrostate model for describing the system. A low entropy means high precision for the reference observer's macrostate model.

$D_{KL}$ divergence (information gap), $D_{KL}(P_1 \| P_2) = \sum_{i=1}^{N} P_{1,i}\ln(P_{1,i}/P_{2,i}) = \ln(1/P_{2,a})$: Expresses the statistical separation between a reference observer's macrostate model, with Bayesian probability distribution $P_2$, and an unknown physical state, described by frequentist probability distribution $P_1$. The physical state's actual microstate configuration is 'a', with probability $P_{1,a} = 1$, and the macrostate model's Bayesian expectation of microstate 'a' is $P_{2,a}$. A high $P_{2,a}$ and low $D_{KL}$ mean high accuracy.

Macrostate model, $[P_1, P_2, \ldots, P_N]$: Complete description of the physical state as it existed at time zero; at later times, statistically described by Bayesian probabilities.

Configurational energy, $C = k_B T_{ref} D_{KL} + \int_{T_a}^{T_{ref}} \frac{T - T_a}{T} C_v\,dT$: Exergy that is not accessible for work by the reference observer, due to incomplete information ($D_{KL} > 0$) on the microstate and to $T_{ref} > T_a$.

Accessibility, $A = X - C$: Energy accessible for work, measured at $T_{ref}$ and based on the reference observer's information.

Reference heat, $q_{ref}$ or $\hat{q}_{ref}$: Heat discharged per transition to $T_{ref}$.

Reference work, $w_{ref}$ or $\hat{w}_{ref}$: Reversible output of work (and exergy) per transition to $T_{ref}$.

Utilization, $\hat{\upsilon} \equiv \hat{w}_{ref} + \hat{A}_{int}$: Reversible per-transition output of external work (and accessible energy) to the fixed reference, plus the internal work of increasing the system's accessibility.

Other transactional properties, $\hat{Q}$, $\hat{\sigma}_q$, $\hat{D}_{KL}$, $\check{X}$, $\check{A}$: Per-transition increases in entropic energy, configurational entropy, and $D_{KL}$ information gap; and decreases in exergy and accessibility.

References

1. Vopson, M.; Lepadatu, S. Second law of information dynamics. AIP Advances 2022, 12, 075310.
2. Shannon, C. A Mathematical Theory of Communication. The Bell System Technical Journal 1948, 27, 379–423.
3. Wikipedia. Entropy (information theory). Available online: https://en.wikipedia.org/wiki/Entropy_(information_theory) (accessed on 10 August 2024).
4. Jaynes, E.T. Information Theory and Statistical Mechanics. The Physical Review 1957, 106, 620–630.
5. Wikipedia. Boltzmann Distribution. Available online: https://en.wikipedia.org/wiki/Boltzmann_distribution (accessed on 10 August 2024).
6. Crecraft, H. Time and Causality: A Thermocontextual Perspective. Entropy 2021, 23, 1705.
7. Dias, T.C.M.; Diniz, M.A.; Pereira, C.A.d.B.; Polpo, A. Overview of the 37th MaxEnt. Entropy 2018, 20, 694.
8. D'Ariano, G.M.; Paris, M.G.A.; Sacchi, M.F. Quantum Tomography. arXiv 2008, arXiv:quant-ph/0302028v1.
9. Wikipedia. Interpretations of quantum mechanics. Available online: https://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics (accessed on 10 July 2024).
10. Sagawa, T. Thermodynamic and logical reversibilities revisited. Journal of Statistical Mechanics: Theory and Experiment 2014, P03025.
11. Wikipedia. Bell Test. Available online: https://en.wikipedia.org/wiki/Bell_test (accessed on 10 August 2024).
12. Crecraft, H. Dissipation + Utilization = Self-Organization. Entropy 2023, 25, 229.
13. Wikipedia. Bayesian Statistics. Available online: https://en.wikipedia.org/wiki/Bayesian_statistics (accessed on 10 August 2024).
14. Wikipedia. Frequentist Probability. Available online: https://en.wikipedia.org/wiki/Frequentist_probability (accessed on 10 August 2024).
15. Wikipedia. Kullback–Leibler divergence. Available online: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence (accessed on 10 August 2024).
16. Ribó, J.; Hochberg, D. Stoichiometric network analysis of entropy production in chemical reactions. Phys. Chem. Chem. Phys. 2018, 20.
17. Ribó, J.; Hochberg, D. Physical Chemistry Models for Chemical Research in the XXth and XXIst Centuries. ACS Physical Chemistry Au 2024, 4, 122–134.
18. Niven, R.K.; Abel, M.; Schlegel, M.; Waldrip, S.H. Maximum Entropy Analysis of Flow Networks: Theoretical Foundation and Applications. Entropy 2019, 21.
19. Niven, R.K.; Schlegel, M.; Abel, M.; Waldrip, S.H.; Guimera, R. Entropy Analysis of Flow Networks with Structural Uncertainty (Graph Ensembles). In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
20. Einstein, A. Autobiographical Notes; Open Court Publishing Co.: U.S., 1999; ISBN-10: 0812691792.
21. Martyushev, L. Maximum entropy production principle: History and current status. Phys. Uspekhi 2021, 64, 558.
22. Paltridge, G.W. Stumbling into the MEP Racket: An Historical Perspective. In Non-Equilibrium Thermodynamics and the Production of Entropy—Life, Earth, and Beyond; Kleidon, A., Lorenz, R.D., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; ISBN 3-540-22495-5.
23. Paltridge, G.W. A Story and a Recommendation about the Principle of Maximum Entropy Production. Entropy 2009, 11, 945–948.
24. Wikipedia. Baryogenesis. Available online: https://en.wikipedia.org/wiki/Baryogenesis (accessed on 10 August 2024).
25. van de Schoot, R.; Depaoli, S.; King, R.; et al. Bayesian statistics and modelling. Nature Reviews Methods Primers 2021.
26. Modeling galaxy formation: Verdoolaege, G. Regression of Fluctuating System Properties: Baryonic Tully–Fisher Scaling in Disk Galaxies. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
27. Rapid medical diagnostics: Ranftl, S.; Melito, G.M.; Badeli, V.; Reinbacher-Köstinger, A.; Ellermann, K.; von der Linden, W. Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection. Entropy 2020, 22.
28. Rapid medical diagnostics: Makaremi, M.; Lacaule, C.; Mohammad-Djafari, A. Deep Learning and Artificial Intelligence for the Determination of the Cervical Vertebra Maturation Degree from Lateral Radiography. Entropy 2019, 21.
29. Thermodynamic computing: Fry, R.L. Physical Intelligence and Thermodynamic Computing. Entropy 2017, 19, 107.
30. Thermodynamic computing: Hylton, T. Thermodynamic Computing: An Intellectual and Technological Frontier. Proceedings 2020, 47, 23.
31. AI and machine learning: Enßlin, T. Information Field Theory and Artificial Intelligence. Entropy 2022, 24, 374.
32. Machine learning: Mohammad-Djafari, A. Regularization, Bayesian Inference, and Machine Learning Methods for Inverse Problems. Entropy 2021, 23, 1673.
33. Ecology: Albert, C.G.; Callies, U.; von Toussaint, U. A Bayesian Approach to the Estimation of Parameters and Their Interdependencies in Environmental Modeling. Entropy 2022, 24, 231.
34. Macroeconomics: Hubert, P.; Stern, J.M. Probabilistic Equilibrium: A Review on the Application of MAXENT to Macroeconomic Models. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
35. Optimizing imaging: Spector-Zabusky, A.; Spector, D. Schrödinger's Zebra: Applying Mutual Information Maximization to Graphical Halftoning. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
36. Enhancing MRI: Earle, K.A.; Broderick, T.; Kazakov, O. Effect of Hindered Diffusion on the Parameter Sensitivity of Magnetic Resonance Spectra. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
37. Image reconstruction: Denisova, N. Bayesian Maximum-A-Posteriori Approach with Global and Local Regularization to Image Reconstruction Problem in Medical Emission Tomography. Entropy 2019, 21, 1108.
38. Caticha, N. Entropic Dynamics in Neural Networks, the Renormalization Group and the Hamilton-Jacobi-Bellman Equation. Entropy 2020, 22.
39. Optimizing fusion reactors: Nille, D.; von Toussaint, U.; Sieglin, B.; Faitsch, M. Probabilistic Inference of Surface Heat Flux Densities from Infrared Thermography. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2017); Polpo, A., Stern, J., Louzada, F., Izbicki, R., Takada, H., Eds.; Springer Proceedings in Mathematics & Statistics, 2018; Vol. 239.
40. Wikipedia. Double-slit experiment. Available online: https://en.wikipedia.org/wiki/Double-slit_experiment (accessed on 10 August 2024).
41. Feynman, R.; Leighton, R.; Sands, M. The Feynman Lectures on Physics, Vol. I: Mainly Mechanics, Radiation, and Heat; Millennium Edition, Chapter 37 (Quantum Behavior); California Institute of Technology, 2013. Available online: https://www.feynmanlectures.caltech.edu/I_37.html.
42. Crecraft, H. MaxEnt: Selection at the Heart of Quantum Mechanics. Preprints 2022.
43. University Physics III - Optics and Modern Physics (OpenStax), Chapter 3: Interference. Available online: https://phys.libretexts.org/Bookshelves/University_Physics/Book%3A_University_Physics_(OpenStax)/University_Physics_III_-_Optics_and_Modern_Physics_(OpenStax)/03%3A_Interference (accessed on 4 June 2022).
44. Miller, S.L. A Production of Amino Acids Under Possible Primitive Earth Conditions. Science 1953, 117, 528–529.
45. Michaelian, K. The Dissipative Photochemical Origin of Life: UVC Abiogenesis of Adenine. Entropy 2021, 23, 217.
46. Cornish-Bowden, A.; Cárdenas, M.L. Contrasting theories of life: Historical context, current theories. In search of an ideal theory. Biosystems 2020, 188.
Figure 1. Perfect Measurement of Exergy and Entropic Energy. As the system reversibly transitions to its equilibrium ambient state, exergy and entropic energy are output and recorded by a classical measurement device as exchanges of work wa and ambient heat qa at the ambient temperature.
Figure 2. TCI energy states. Each system is prepared by applying work wa to an ambient gas with zero stored exergy and E=X=Q=0. (a) The thermal energy state is prepared by the work of reversibly pumping heat from the ambient surroundings into the gas. (b) The mechanical energy state is prepared by applying work to the mechanical battery only. (c) The configurational energy state is prepared by the work of isothermally compressing the gas. (d) Measurement devices record the exergy and entropic energy changes during reversible transitions back to the ARS. (e) The ARS defines the zero-energy levels for exergy and entropic energy. All transitions are with respect to fixed ambient temperature and pressure.
Figure 4. Thermocontextual State Description and Macrostate model. The thermocontextual state is completely defined by its energy-state measurement with respect to the ambient reference and by its microstate description by a perfect observer. The macrostate model, in contrast, is defined with respect to a reference at a fixed reference temperature, Tref, and by a generally imperfect observer.
Figure 5. Accessibility and Configurational Energy. Configurational energy C is unavailable to the reference observer for work, due to Tref>Ta or due to incomplete microstate information. Accessibility A is the balance of exergy X that is reversibly available to the fixed reference observer for work.
Figure 6. Quasistatic Transition Types. The figure illustrates the three types of transitions from ambient gas to compressed gas by the application of work, wa. The transitions are quasistatic, with no frictional losses. The equilibrium transitions, L and R, reversibly and deterministically compress the gas from either the left or right side and produce a definite and known microstate, L or R. Transition M is reversible but statistical, and it produces a mixed macrostate MLR, with a single definite but unknown microstate L or R. Transition Q irreversibly but deterministically compresses the gas and produces a single thermalized microstate, QLR. The macrostate model and transactional properties for each are shown in the tables on the right side.
Figure 7. Dissipative network diagram for the reactions A+X→B+Y and Y→X. States are represented as horizontal lines with specific exergy per unit of component. Transitions between states are represented by dotted lines and numbered nodes. Transition 1 extracts exergy from input A as it transitions to output B. Exergy is transferred to the coupled transition node 2, which does internal work of converting X to the higher-exergy state Y. Transition 3 takes state Y back to X and uses the extracted exergy for external work on the surroundings.
Figure 8. Stationary dissipative processes and homeostates. Each path represents a homeostate and process of transitioning source components to the ambient surroundings. Observation reduces the observer’s information gap by revealing information on the network nodes, transitions, and the internal work of increasing the system’s accessibility. An external device measures the outputs of work (and accessible energy) and heat to the fixed reference. Perfect observation and measurement are in the quasistatic limit of zero frictional losses of exergy and information.
Figure 9. Double slit experiment. The microstate configurations are defined by the resolution of the detector screen. If the which-slit detector (WSD) is disabled, an individual transition passes through the double slits symmetrically and randomly instantiates an impact and definite configuration on the detector screen (state B). Multiple transitions generate a statistically mixed macrostate C, represented by a probability distribution of instantiated microstates. With the WSD activated, the particle passes asymmetrically through one slit or the other, and the interference pattern for C changes to a single broad peak.
Figure 10. Probability distribution profiles for particle detection from double slits, with and without wave interference and from single slits.
Figure 11. The work of adding an A or B to array is equal to Δw.
Table 1. Energy State Classifications.
Thermal state energy (Q > 0): $E_q = (X, Q)$; $E_q = X + Q > X$
Mechanical state energy (Q = 0): $E_m = (X, 0)$; $E_m = X$
Configurational state energy (Q < 0): $E_c = (X, Q)$; $E_c = X + Q < X$
Table 2. Double slit interference and diffraction parameters.
Detector width: 200 λ
Slit width: 7 λ
Slit positions: ±15 λ
Slit-detector separation: 300 λ
Observer's resolution: 0.5 λ
Table 3. Configurational Entropies of Transitions in the Double Slit Experiment.
Row 1: entropy 4.69; no WSD, source to detector (red profile).
Row 2: entropy 0.69; WSD on, source to one of the slits (50%-50%).
Row 3: entropy 5.02; WSD on, slit to detector (green or blue profile).
Row 4: entropy 5.71; WSD on, overall source-to-detector transition (purple profile).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.