Preprint · Article · This version is not peer-reviewed; a peer-reviewed article of this preprint also exists.

Information Exchange Fluctuation Theorem Under Coarse-Graining

Submitted: 20 July 2025 · Posted: 21 July 2025
Abstract
The fluctuation theorem for information exchange, originally established by Sagawa and Ueda [Phys. Rev. Lett. 109, 180602 (2012)], provides a fundamental framework for understanding the role of correlations in coupled classical stochastic systems. Building upon this foundation, Jinwoo [Entropy 21, 477 (2019)] demonstrated that the pointwise mutual information between correlated subsystems captures entropy production as a state function during coupling processes. In this study, we investigate the robustness of this information-theoretic fluctuation theorem under coarse-graining in coupled classical fluctuating systems. We rigorously prove that the fluctuation theorem remains invariant under arbitrary coarse-graining transformations and derive hierarchical relationships between information measures across different scales, thereby establishing its fundamental character as independent of the level of system description. Our results demonstrate that the relationship between information exchange and entropy production is preserved across different scales of observation, providing deeper insights into the thermodynamic foundations of information processing in classical stochastic systems.

1. Introduction

Stochastic thermal fluctuations constitute a cornerstone in the operation of molecular machinery and systems operating far from thermodynamic equilibrium. These fluctuations facilitate energy transfer between molecular components and their surroundings, permitting molecules to surmount energetic barriers and achieve stable low-energy configurations. The inherent randomness of molecular motion transforms thermodynamic quantities such as heat and work into stochastic variables. Over the past three decades, fluctuation theorems have emerged as a powerful class of exact relations revealing universal principles that constrain these random thermodynamic quantities in processes driving systems away from equilibrium.
The theoretical foundation was established through seminal contributions including the Evans-Searles transient fluctuation theorem (1994), followed by the groundbreaking Jarzynski equality [1] (1997) and the Crooks fluctuation theorem [2] (1998), which spawned numerous theoretical extensions and generalizations. Seifert’s formulation extended thermodynamic laws to individual stochastic trajectories [3], while Hatano and Sasa developed frameworks for transitions between non-equilibrium steady states [4]. Jinwoo and Tanaka [5,6] demonstrated that fluctuation theorems hold even within an ensemble of trajectories conditioned at a final (micro/meso) state, establishing that local free energy captures fluctuating work within the path ensemble reaching a specific (micro/meso) state during an externally driven non-equilibrium process. Single-molecule experimental studies have provided rigorous validation of these theoretical predictions, yielding valuable insights into biomolecular dynamics [7,8,9,10,11,12].
Contemporary research in fluctuation theorems for information exchange within coupled classical stochastic systems has established a comprehensive theoretical framework linking information theory with non-equilibrium statistical mechanics [13,14,15,16,17]. This rapidly advancing field, built upon the pioneering contributions of Sagawa and Ueda, has witnessed substantial theoretical developments and experimental validations in recent years [18,19]. The discipline investigates fundamental principles governing the energetic requirements of information manipulation, the influence of statistical correlations on entropy generation, and the persistence of thermodynamic laws across different observational scales.
This theoretical framework has become particularly relevant for understanding biological systems, as living organisms have evolved sophisticated information processing architectures essential for their survival and reproduction [20,21,22]. These biological systems demonstrate remarkable capabilities in detecting environmental chemical signals [23,24], propagating information across complex signaling pathways [25,26,27], and regulating genetic expression through molecular communication networks [28,29]. Cellular systems have developed the ability to perform temporal integration by encoding environmental conditions into internal molecular configurations, thereby minimizing sensory uncertainties [30,31]. Understanding the thermodynamic implications of information processing is therefore essential for comprehending the intricate mechanisms underlying biological computation and how fluctuation theorems govern the fundamental limits of these processes.
Sagawa and Ueda incorporated informational considerations into fluctuation theorem formalism [15]. Their work established a fluctuation theorem for information exchange processes, creating a unified description of measurement and feedback control phenomena in non-equilibrium settings [17]. Their theoretical framework analyzed scenarios where system X undergoes evolution dependent on the instantaneous state y of system Y, under the assumption that Y remains stationary during X’s temporal evolution. They rigorously demonstrated that correlation establishment between subsystems inevitably generates entropy production, providing definitive resolution to Maxwell’s demon paradox and establishing mathematical foundations for understanding energy-information conversion mechanisms.
Subsequent theoretical advances have extended these foundational results. Particularly significant is Jinwoo's work eliminating the static constraint imposed by Sagawa and Ueda, which demonstrated that the identical form of the fluctuation theorem remains valid when both subsystems X and Y undergo simultaneous dynamic evolution [32]. Jinwoo also demonstrated that the pointwise mutual information of a coupled state captures the entropy production within the ensemble of paths that reach that state [33].
The present investigation examines the stability of information-theoretic fluctuation theorems under coarse-graining in coupled classical stochastic systems. Establishing how these fundamental relationships maintain their validity across multiple observational scales is critical for demonstrating the universal nature of information-thermodynamic principles and their applicability to practical systems where complete microscopic details may be experimentally inaccessible or computationally intractable.
Building upon the foundational framework established by Sagawa and Ueda [17] and its subsequent generalization by Jinwoo [32,33], our investigation addresses a fundamental question: do the relationships between information exchange and entropy production remain invariant when the system description is coarse-grained to different levels of detail?
The motivation for this investigation stems from the practical reality that physical systems are typically observed and described at various scales, from microscopic molecular dynamics to mesoscopic collective behaviors. Understanding whether fluctuation theorems maintain their validity across these different levels of description has profound implications for both theoretical understanding and practical applications. If these theorems are indeed robust under coarse-graining, it would establish their fundamental character as scale-independent principles governing information processing in thermodynamic systems.
Our approach rigorously proves that the fluctuation theorem for information exchange remains invariant under coarse-graining operations, thereby establishing its fundamental character as independent of the level of system description. The results demonstrate that the relationship between information exchange and entropy production is preserved across different scales of observation, providing deeper insights into the thermodynamic foundations of information processing in classical stochastic systems.

2. Information Exchange Fluctuation Theorem

We examine a system weakly coupled to a thermal reservoir characterized by inverse temperature $\beta := 1/(k_B T)$, where $k_B$ denotes the Boltzmann constant and $T$ represents the reservoir temperature. The external protocol $\mu_t$ perturbs the system from its equilibrium state throughout the time interval $0 \le t \le \tau$. We denote by $\mathcal{T}$ the complete ensemble of microscopic trajectories, while $\mathcal{T}_{x_\tau}$ represents the subset of trajectories that terminate at the specific microstate $x_\tau$ at the final time $\tau$.
We now examine the Sagawa-Ueda fluctuation theorem for information exchange [17], focusing particularly on its generalized formulation [32]. For this analysis, we study two subsystems X and Y immersed in a thermal reservoir characterized by inverse temperature $\beta$. Throughout the protocol $\mu_t$, these subsystems undergo mutual interaction and coupled evolution. Under these conditions, the information exchange fluctuation theorem takes the form:
$$\left\langle e^{-\Pi + \Delta M} \right\rangle_{\mathcal{T}} = 1, \tag{1}$$
where the angular brackets denote ensemble averaging across all trajectories $\mathcal{T}$ of the coupled system, $\Pi$ represents the total entropy production encompassing contributions from system X, system Y, and the thermal bath, while $\Delta M$ quantifies the variation in mutual information linking X and Y. It is worth noting that the original Sagawa-Ueda formulation restricts the thermal contact to system X alone, with system Y remaining stationary throughout the evolution [17,32]. A trajectory-conditioned variant of Equation (1) reads:
$$I_\tau(x_\tau, y_\tau) = -\ln \left\langle e^{-(\Pi + I_0)} \right\rangle_{x_\tau, y_\tau}, \tag{2}$$
where the averaging is performed over the restricted ensemble of trajectories terminating at the coupled state $(x_\tau, y_\tau)$ at the final time $\tau$, and $I_t$ (for $0 \le t \le \tau$) denotes the pointwise mutual information between the microstates of systems X and Y at time $t$ [33]. When initial correlations are absent, i.e., $I_0 = 0$, Equation (2) demonstrates that the pointwise mutual information $I_\tau$, viewed as a state function of the coupled microstates $(x_\tau, y_\tau)$, captures the entropy production $\Pi$ within the trajectory ensemble constrained to reach these final states. By analogy, the initial correlation $I_0$ can be understood as representing the entropy production associated with establishing the initial correlated state.
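To make Equation (1) concrete, the following minimal sketch verifies it by exact enumeration of every trajectory of a small toy model: two coupled two-state systems driven by a sequence of randomly chosen joint energies, evolving under Metropolis transition matrices, which supply the microscopic reversibility condition (Equation (12)) used in the proof below. All model ingredients (the energies, the protocol length, and the initial distribution) are invented for illustration and are not part of the original analysis.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
beta = 1.0
K = 3                         # number of protocol steps (assumed)
n = 4                         # joint microstates (x, y); states carry no momenta,
x_of = [0, 0, 1, 1]           # so momentum reversal is trivial
y_of = [0, 1, 0, 1]

# Hypothetical time-dependent joint energies E[t][i], invented for illustration.
E = [rng.normal(size=n) for _ in range(K + 1)]

def metropolis(Ev):
    """Column-stochastic matrix M[j, i] = P(i -> j) with detailed balance w.r.t. Ev."""
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if j != i:
                M[j, i] = min(1.0, np.exp(-beta * (Ev[j] - Ev[i]))) / (n - 1)
        M[i, i] = 1.0 - M[:, i].sum()
    return M

T = [metropolis(E[t + 1]) for t in range(K)]   # relaxation after each protocol update

p0 = rng.random(n); p0 /= p0.sum()             # arbitrary positive initial P_0(x, y)
p = [p0]
for t in range(K):
    p.append(T[t] @ p[t])                      # exact master-equation propagation

def marg(pv, labels, s):
    """Marginal probability of label value s under the joint distribution pv."""
    return sum(pv[k] for k in range(n) if labels[k] == s)

def pmi(pv, i):
    """Pointwise mutual information I_t(x, y) of joint state i under pv."""
    return np.log(pv[i] / (marg(pv, x_of, x_of[i]) * marg(pv, y_of, y_of[i])))

total = 0.0
for path in itertools.product(range(n), repeat=K + 1):
    prob, heat = p0[path[0]], 0.0
    for t in range(K):
        prob *= T[t][path[t + 1], path[t]]
        heat += E[t + 1][path[t]] - E[t + 1][path[t + 1]]   # heat released to the bath
    # stochastic entropy changes of the marginals, Equation (8)
    dxi = (np.log(marg(p[0], x_of, x_of[path[0]])) - np.log(marg(p[K], x_of, x_of[path[-1]]))
         + np.log(marg(p[0], y_of, y_of[path[0]])) - np.log(marg(p[K], y_of, y_of[path[-1]])))
    Pi = dxi + beta * heat                                  # entropy production, Equation (7)
    dM = pmi(p[K], path[-1]) - pmi(p[0], path[0])           # change in mutual information
    total += prob * np.exp(-Pi + dM)

print(total)   # 1.0 up to floating-point error, as Equation (1) demands
```

Because the check sums over every trajectory rather than sampling, the identity holds to machine precision; replacing the enumeration with Monte Carlo sampling of the same chain reproduces it statistically.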
In this study, we consider coarse-graining transformations a and b, which are functions of microstates x and y, respectively. The functions a and b can be arbitrary generalized coordinates, for example (a minimal code sketch follows the list):
(i) Spatial binning: $a(x) = \mathrm{bin}(x)$, where molecular positions x are discretized into spatial bins of a certain width, commonly used in analyzing diffusion processes and spatial correlations in biomolecular systems;
(ii) Energy-based grouping: $a(x) = E(x)$, where microstates x are classified according to their energy levels E, particularly relevant for studying conformational transitions in proteins and folding dynamics;
(iii) Reaction coordinate projection: $a(x) = f(x)$, where f represents a collective variable such as end-to-end distance in polymers, radius of gyration in protein folding, or dihedral angles in molecular conformational changes;
(iv) Cluster-based coarse-graining: $a(x) = c$, where c denotes the cluster index obtained from unsupervised clustering algorithms applied to molecular configurations, widely employed in Markov state model construction for biomolecular dynamics;
(v) Order parameter discretization: $a(x) = \mathrm{sign}(\psi(x))$, where $\psi(x)$ is an order parameter such as local crystallinity in phase transitions or helical content in protein secondary structure formation.
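The sketch below illustrates three of these maps for simple microstates; the bin width, the chain geometry, and the order parameter are arbitrary assumptions chosen only to make the functions concrete.

```python
import numpy as np

def a_spatial(x, width=0.5):
    """(i) Spatial binning: discretize a position x into bins of the given width."""
    return int(np.floor(x / width))

def a_reaction(chain):
    """(iii) Reaction-coordinate projection: end-to-end distance of a polymer chain."""
    return float(np.linalg.norm(chain[-1] - chain[0]))

def a_order(x, psi=np.cos):
    """(v) Order-parameter discretization: a(x) = sign(psi(x))."""
    return int(np.sign(psi(x)))

print(a_spatial(1.37))                                  # -> 2
print(a_reaction(np.array([[0.0, 0.0], [1.0, 1.0]])))   # -> 1.414...
print(a_order(1.37))                                    # -> 1
```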
These transformations effectively reduce the dimensionality of the state space while preserving essential thermodynamic information. We establish that such transformations preserve the information exchange fluctuation theorem across different scales of observation by proving that
$$I_\tau(a, b) = -\ln \left\langle e^{-(\Pi + I_0)} \right\rangle_{a,b}, \tag{3}$$
where the angular brackets denote the ensemble average over all conditioned trajectories that terminate at coupled microstates $(x_\tau, y_\tau)$ satisfying $a(x_\tau) = a$ and $b(y_\tau) = b$. In the course of establishing Equation (3), we simultaneously derive
$$\left\langle e^{-I_\tau(x_\tau, y_\tau)} \right\rangle_{a(x_\tau) = a,\; b(y_\tau) = b} = e^{-I_\tau(a, b)}, \tag{4}$$
where the angular brackets represent the ensemble average taken over the conditional probability distribution $P_\tau(x_\tau, y_\tau \mid a, b)$. Equation (4) constitutes a fluctuation relation that reveals the hierarchical organization of pointwise mutual information across different observational scales.
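Because Equation (4) involves only the final-time joint distribution, it can be checked by direct summation. The sketch below does so for an invented 6 × 6 joint distribution and arbitrary labels a(x) and b(y); every number in it is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

P = rng.random((6, 6)); P /= P.sum()      # hypothetical final joint P_tau(x, y)
Px, Py = P.sum(axis=1), P.sum(axis=0)     # marginals P_tau(x), P_tau(y)

a = np.array([0, 0, 0, 1, 1, 1])          # assumed coarse-graining labels a(x)
b = np.array([0, 0, 1, 1, 2, 2])          # assumed coarse-graining labels b(y)

I_micro = np.log(P / np.outer(Px, Py))    # pointwise mutual information I_tau(x, y)

# Coarse-grained joint distribution P_tau(a, b) and its marginals.
Pab = np.array([[P[np.ix_(a == i, b == j)].sum() for j in range(3)] for i in range(2)])
Pa, Pb = Pab.sum(axis=1), Pab.sum(axis=0)

for i in range(2):
    for j in range(3):
        sel = np.ix_(a == i, b == j)
        cond = P[sel] / Pab[i, j]                    # P_tau(x, y | a, b)
        lhs = np.sum(cond * np.exp(-I_micro[sel]))   # <exp(-I_tau(x, y))>_{a, b}
        rhs = Pa[i] * Pb[j] / Pab[i, j]              # exp(-I_tau(a, b))
        assert np.isclose(lhs, rhs)

print("Equation (4) holds for every coarse-grained pair (a, b).")
```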
Moreover, we prove the work fluctuation theorem for information exchange as follows:
$$\left\langle e^{-\beta W} \right\rangle_{a,b} = e^{-\beta \Delta\Phi_X(a,\tau) - \beta \Delta\Phi_Y(b,\tau) - I_\tau(a,b)}, \tag{5}$$
where $\Delta\Phi_X(a,\tau) = \Phi_X(a,\tau) - F_{\mathrm{eq}}(\mu_0)$ is the difference between the non-equilibrium free energy $\Phi_X(a,\tau)$ and the equilibrium free energy $F_{\mathrm{eq}}(\mu_0)$ for system X, and $\Delta\Phi_Y$ is the corresponding quantity for system Y. Equation (5) holds provided that the initial condition corresponds to the equilibrium distribution.
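Although not spelled out above, a one-step corollary follows by applying Jensen's inequality, $\left\langle e^{-\beta W} \right\rangle_{a,b} \ge e^{-\beta \left\langle W \right\rangle_{a,b}}$, to Equation (5): a coarse-grained second-law inequality conditioned on the observed mesostates,

$$\left\langle W \right\rangle_{a,b} \;\ge\; \Delta\Phi_X(a, \tau) + \Delta\Phi_Y(b, \tau) + k_B T\, I_\tau(a, b).$$

Establishing a final correlation $I_\tau(a, b)$ between the coarse-grained states thus raises the minimal average work by $k_B T\, I_\tau(a, b)$ beyond the change in the two non-equilibrium free energies.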

3. Results

3.1. Theoretical Framework

Consider two finite classical stochastic systems X and Y that are weakly coupled to a thermal reservoir characterized by inverse temperature $\beta$. Throughout the time interval $0 \le t \le \tau$, an external driving parameter $\mu_t$ may force either or both subsystems into non-equilibrium states [34,35,36]. The temporal evolution of systems X and Y under the influence of the time-dependent protocol $\mu_t$ is governed by classical stochastic dynamics, resulting in stochastic trajectories $\{x_t\}$ and $\{y_t\}$, where $x_t$ (respectively $y_t$) represents a particular microstate of system X (respectively Y) at time $t$ within the interval $0 \le t \le \tau$ for any given realization.
Due to the inherent stochasticity of the trajectories, we perform multiple realizations of the driving protocol $\mu_t$ starting from an initial joint probability distribution $P_0(x, y)$ defined over the complete set of microstates $(x, y)$ for the combined system. Through this ensemble of realizations, the coupled subsystems evolve to produce a time-dependent joint probability distribution $P_t(x, y)$ throughout the duration $0 \le t \le \tau$. The associated marginal probability distributions are obtained as $P_t(x) := \int P_t(x, y)\, dy$ and $P_t(y) := \int P_t(x, y)\, dx$.
We assume
$$P_0(x, y) \neq 0 \quad \text{for all } (x, y), \tag{6}$$
ensuring that $P_t(x, y) \neq 0$, $P_t(x) \neq 0$, and $P_t(y) \neq 0$ hold for all microstates x and y throughout the interval $0 \le t \le \tau$.
The entropy production $\Pi$ generated during the protocol $\mu_t$ over the time period $0 \le t \le \tau$ is defined as
$$\Pi := \Delta\xi_{XY} + \beta H, \tag{7}$$
in which $\Delta\xi_{XY}$ represents the stochastic entropy changes along trajectories $\{x_t\}$ and $\{y_t\}$, while $H$ denotes the heat transferred to the thermal bath (corresponding to entropy generation in the reservoir) [2,3]. Explicitly, these quantities are expressed as
$$\Delta\xi_{XY} := \Delta\xi_X + \Delta\xi_Y, \quad \Delta\xi_X := -\ln P_\tau(x_\tau) + \ln P_0(x_0), \quad \Delta\xi_Y := -\ln P_\tau(y_\tau) + \ln P_0(y_0). \tag{8}$$
It is important to recognize that the stochastic entropy $\xi[P_t(\circ)] := -\ln P_t(\circ)$ associated with microstate $\circ$ at time $t$ quantifies the uncertainty in the occurrence of $\circ$ at that instant: higher uncertainty in microstate realization corresponds to larger stochastic entropy values. It should be emphasized that the framework in [17] considers only system X to be thermally coupled to the heat reservoir, while system Y remains isolated and stationary. Consequently, their entropy production expression becomes $\Pi_{\mathrm{su}} := \Delta\xi_X + \beta H$.
We consider that throughout the temporal evolution governed by process $\mu_t$, subsystem X engages in information transfer with subsystem Y. This implies that the dynamical path $\{x_t\}$ followed by system X is influenced by the corresponding path $\{y_t\}$ taken by system Y. The pointwise mutual information $I_t$ at temporal instance $t$ characterizing the correlation between microstates $x_t$ and $y_t$ quantifies how much the uncertainty in $x_t$ decreases when $y_t$ is known [17]:
$$I_t(x_t, y_t) := \xi[P_t(x_t)] - \xi[P_t(x_t \mid y_t)] = \ln \frac{P_t(x_t, y_t)}{P_t(x_t)\, P_t(y_t)}, \tag{9}$$
where $P_t(x_t \mid y_t)$ represents the conditional probability density of microstate $x_t$ conditioned on $y_t$. As the degree of statistical dependence between microstates $x_t$ and $y_t$ increases, the magnitude of $I_t(x_t, y_t)$ correspondingly grows larger.
It is important to observe that when $x_t$ and $y_t$ exhibit statistical independence at time $t$, the quantity $I_t(x_t, y_t)$ vanishes. The expectation value of $I_t(x_t, y_t)$ computed with the joint probability distribution $P_t(x_t, y_t)$ across the entire microstate space yields the standard mutual information between the coupled subsystems, which maintains non-negativity [37].
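This sign structure is easy to exhibit numerically: individual values of $I_t(x_t, y_t)$ may be negative, while their average, the mutual information, cannot be. A minimal sketch with an invented joint distribution:

```python
import numpy as np

rng = np.random.default_rng(3)

P = rng.random((4, 4)); P /= P.sum()    # hypothetical joint distribution P_t(x, y)
Px, Py = P.sum(axis=1), P.sum(axis=0)   # marginals P_t(x), P_t(y)

I = np.log(P / np.outer(Px, Py))        # pointwise mutual information, Equation (9)
MI = np.sum(P * I)                      # its expectation: the mutual information

print(I.min())   # typically negative for some microstate pairs
print(MI)        # always non-negative [37]
```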

3.2. Proof of Equations (3) and (4)

Now we are ready to prove the information exchange fluctuation theorem conditioned on a coupled state $(a, b)$. We define the reverse process $\mu^{\dagger}_t := \mu_{\tau - t}$ for $0 \le t \le \tau$, where the external parameter is time-reversed [13,14]. The initial probability distribution $P^{\dagger}_0(x, y)$ for the reverse process should be the final probability distribution for the forward process, $P_\tau(x, y)$, so that we have
$$P^{\dagger}_0(x) := \int P^{\dagger}_0(x, y)\, dy = \int P_\tau(x, y)\, dy = P_\tau(x), \quad P^{\dagger}_0(y) := \int P^{\dagger}_0(x, y)\, dx = \int P_\tau(x, y)\, dx = P_\tau(y). \tag{10}$$
Then, by Equation (6), we have $P^{\dagger}_t(x, y) \neq 0$, $P^{\dagger}_t(x) \neq 0$, and $P^{\dagger}_t(y) \neq 0$ for all x and y during $0 \le t \le \tau$. For each pair of trajectories $\{x_t\}$ and $\{y_t\}$ for $0 \le t \le \tau$, we define the time-reversed conjugates as follows:
$$\{x^{\dagger}_t\} := \{x^{*}_{\tau - t}\}, \quad \{y^{\dagger}_t\} := \{y^{*}_{\tau - t}\}, \tag{11}$$
where $*$ denotes momentum reversal. Let $\mathcal{T}$ be the set of all trajectories $\{x_t\}$ and $\{y_t\}$, and $\mathcal{T}_{x_\tau, y_\tau}$ be that of trajectories conditioned at coupled microstates $(x_\tau, y_\tau)$ at time $\tau$. By the definition of the time-reversed conjugate, the set $\mathcal{T}^{\dagger}$ of all time-reversed trajectories is identical to $\mathcal{T}$, and the set $\mathcal{T}^{\dagger}_{x^{\dagger}_0, y^{\dagger}_0}$ of time-reversed trajectories conditioned at $x^{\dagger}_0$ and $y^{\dagger}_0$ is identical to $\mathcal{T}_{x_\tau, y_\tau}$. Thus we may use the same notation for both forward and backward pairs. We note that the path probabilities $P_{\mathcal{T}}$ and $P_{\mathcal{T}_{x_\tau, y_\tau}}$ are normalized over all paths in $\mathcal{T}$ and $\mathcal{T}_{x_\tau, y_\tau}$, respectively. With this notation, the microscopic reversibility condition that enables us to connect the probability of forward and reverse paths to the dissipated heat reads as follows [2,38,39,40]:
$$\frac{P_{\mathcal{T}}(\{x_t\}, \{y_t\} \mid x_0, y_0)}{P^{\dagger}_{\mathcal{T}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \mid x^{\dagger}_0, y^{\dagger}_0)} = e^{\beta H}, \tag{12}$$
where $P_{\mathcal{T}}(\{x_t\}, \{y_t\} \mid x_0, y_0)$ is the conditional joint probability distribution of paths $\{x_t\}$ and $\{y_t\}$ conditioned at initial microstates $x_0$ and $y_0$, and $P^{\dagger}_{\mathcal{T}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \mid x^{\dagger}_0, y^{\dagger}_0)$ is that for the reverse process. Now we consider the ensemble of paths $\mathcal{T}_{a,b}$ and the normalized path probability $P_{\mathcal{T}_{a,b}}$ such that
$$\mathcal{T}_{a,b} = \bigcup_{a(x_\tau) = a,\; b(y_\tau) = b} \mathcal{T}_{x_\tau, y_\tau} \quad \text{and} \quad \int_{\mathcal{T}_{a,b}} P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})\, d\{x_t\}\, d\{y_t\} = 1. \tag{13}$$
We restrict our attention to those paths that are in $\mathcal{T}_{a,b}$, and divide both the numerator and denominator of the left-hand side of Equation (12) by $P_\tau(a, b)$. Since $P_\tau(a, b)$ is identical to $P^{\dagger}_0(a, b)$, Equation (12) becomes
$$\frac{P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\} \mid x_0, y_0)}{P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \mid x^{\dagger}_0, y^{\dagger}_0)} = e^{\beta H}, \tag{14}$$
since the probability of paths is now normalized over $\mathcal{T}_{a,b}$. Then we have the following:
$$\frac{P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})}{P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})} = \frac{P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \mid x^{\dagger}_0, y^{\dagger}_0)}{P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\} \mid x_0, y_0)} \cdot \frac{P^{\dagger}_0(x^{\dagger}_0, y^{\dagger}_0)}{P_0(x_0, y_0)} \tag{15}$$
$$= \frac{P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \mid x^{\dagger}_0, y^{\dagger}_0)}{P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\} \mid x_0, y_0)} \cdot \frac{P^{\dagger}_0(x^{\dagger}_0, y^{\dagger}_0)}{P^{\dagger}_0(x^{\dagger}_0)\, P^{\dagger}_0(y^{\dagger}_0)} \cdot \frac{P_0(x_0)\, P_0(y_0)}{P_0(x_0, y_0)} \times \frac{P^{\dagger}_0(x^{\dagger}_0)}{P_0(x_0)} \cdot \frac{P^{\dagger}_0(y^{\dagger}_0)}{P_0(y_0)} \tag{16}$$
$$= \exp\{-\beta H + I_\tau(x_\tau, y_\tau) - I_0(x_0, y_0) - \Delta\xi_X - \Delta\xi_Y\} \tag{17}$$
$$= \exp\{-\Pi + I_\tau(x_\tau, y_\tau) - I_0(x_0, y_0)\}. \tag{18}$$
To obtain Equation (16) from Equation (15), we multiply Equation (15) by $\frac{P^{\dagger}_0(x^{\dagger}_0)\, P^{\dagger}_0(y^{\dagger}_0)}{P^{\dagger}_0(x^{\dagger}_0)\, P^{\dagger}_0(y^{\dagger}_0)}$ and $\frac{P_0(x_0)\, P_0(y_0)}{P_0(x_0)\, P_0(y_0)}$, both of which equal 1. We obtain Equation (17) by applying Equations (8)–(10) and (14) to Equation (16). Finally, we use Equation (7) to obtain Equation (18) from Equation (17).
Now we multiply both sides of Equation (18) by $e^{-I_\tau(x_\tau, y_\tau)}$ and $P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})$, and integrate over all paths in $\mathcal{T}_{a,b}$:
$$\left\langle e^{-(\Pi + I_0)} \right\rangle_{a,b} := \int_{\{x_t\}, \{y_t\} \in \mathcal{T}_{a,b}} e^{-(\Pi + I_0)}\, P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})\, d\{x_t\}\, d\{y_t\} = \int_{\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \in \mathcal{T}_{a,b}} e^{-I_\tau(x_\tau, y_\tau)}\, P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})\, d\{x^{\dagger}_t\}\, d\{y^{\dagger}_t\}. \tag{19}$$
Noting that
$$P_\tau(a, b)\, P_{\mathcal{T}_{a,b}} = P_\tau(x_\tau, y_\tau)\, P_{\mathcal{T}_{x_\tau, y_\tau}}, \tag{20}$$
we obtain Equations (3) and (4) as follows:
$$\begin{aligned} \left\langle e^{-(\Pi + I_0)} \right\rangle_{a,b} &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} e^{-I_\tau(x_\tau, y_\tau)}\, \frac{P_\tau(x_\tau, y_\tau)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau \times \int_{\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \in \mathcal{T}_{x_\tau, y_\tau}} P^{\dagger}_{\mathcal{T}_{x_\tau, y_\tau}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})\, d\{x^{\dagger}_t\}\, d\{y^{\dagger}_t\} \\ &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} e^{-I_\tau(x_\tau, y_\tau)}\, \frac{P_\tau(x_\tau, y_\tau)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau \\ &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} \frac{P_\tau(x_\tau)\, P_\tau(y_\tau)}{P_\tau(x_\tau, y_\tau)} \cdot \frac{P_\tau(x_\tau, y_\tau)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau = \frac{P_\tau(a)\, P_\tau(b)}{P_\tau(a, b)} \end{aligned} \tag{21}$$
$$= e^{-I_\tau(a, b)}. \tag{22}$$
Here we use the facts that $e^{-I_\tau(x_\tau, y_\tau)}$ is constant for all paths in $\mathcal{T}_{x_\tau, y_\tau}$, that the probability distribution $P^{\dagger}_{\mathcal{T}_{x_\tau, y_\tau}}$ is normalized over all paths in $\mathcal{T}_{x_\tau, y_\tau}$, and that $d\{x^{\dagger}_t\} = d\{x_t\}$ and $d\{y^{\dagger}_t\} = d\{y_t\}$ by the definition of the time-reversed conjugate.
The information exchange fluctuation theorems demonstrate unambiguously that, analogous to how local free energy captures work [5], the pointwise mutual information between coupled states $(a, b)$ captures entropy production within the ensemble of trajectories terminating at each respective state. The subsequent corollary elucidates entropy production in greater detail from an energetic perspective.
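Before turning to energetics, we note a corollary that is implicit here: applying Jensen's inequality, $\ln \left\langle e^{-(\Pi + I_0)} \right\rangle_{a,b} \ge \left\langle -(\Pi + I_0) \right\rangle_{a,b}$, to Equation (3) gives

$$\left\langle \Pi \right\rangle_{a,b} \;\ge\; I_\tau(a, b) - \left\langle I_0 \right\rangle_{a,b},$$

so that, absent initial correlations, the average entropy production over trajectories arriving at $(a, b)$ is bounded below by the pointwise mutual information of the coarse-grained final state.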

3.3. Work Fluctuation Theorem for Information Exchange

Here, we briefly review the Jinwoo-Tanaka work fluctuation theorem. We assume that the initial probability distribution corresponds to the equilibrium distribution at macrostate $\mu_0$. The local non-equilibrium free energy $\phi(x_\tau, \tau)$ captures the work $W$ performed on the system within the trajectory ensemble $\mathcal{T}_{x_\tau}$ according to [5,6]:
$$\phi(x_\tau, \tau) = F_{\mathrm{eq}}(\mu_0) - \frac{1}{\beta} \ln \left\langle e^{-\beta W} \right\rangle_{\mathcal{T}_{x_\tau}}, \tag{23}$$
where the angular brackets $\langle \cdot \rangle_{\mathcal{T}_{x_\tau}}$ denote the average over all trajectories terminating at $x_\tau$ at time $\tau$, and $F_{\mathrm{eq}}(\mu_0)$ represents the equilibrium free energy at the control parameter $\mu_0$. The local non-equilibrium free energy is expressed as
$$\phi(x, t) := E(x; \mu_t) + k_B T \ln P_t(x), \tag{24}$$
where $E(x; \mu_t)$ is the internal energy. Under coarse-graining transformations, the fluctuation theorem preserves its functional form [5,6]:
$$\Phi(a, \tau) = F_{\mathrm{eq}}(\mu_0) - \frac{1}{\beta} \ln \left\langle e^{-\beta W} \right\rangle_{\mathcal{T}_a}, \tag{25}$$
where the angular brackets $\langle \cdot \rangle_{\mathcal{T}_a}$ denote the ensemble average over all trajectories that terminate at microstates $x_\tau$ satisfying the condition $a(x_\tau) = a$. The corresponding local non-equilibrium free energy $\Phi$ takes the form
$$\Phi(a, t) := G(a; \mu_t) + k_B T \ln P_t(a), \tag{26}$$
where $G(a; \mu_t) := -\beta^{-1} \ln \int_{a(x) = a} e^{-\beta E(x; \mu_t)}\, dx$ represents the conformational free energy associated with the coarse-grained state a. The multi-scale relationship is captured by the following identity:
$$\left\langle e^{-\beta \phi(x, t)} \right\rangle_{a(x) = a} = e^{-\beta \Phi(a, t)}, \tag{27}$$
where the angular brackets denote the conditional ensemble average taken with respect to the normalized probability distribution $P_t(x)/P_t(a)$.
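Equation (27) is again an exact identity of the instantaneous distribution, and the following sketch confirms it for an invented discrete system; the energies, the distribution, and the labels are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
beta = 1.0
kBT = 1.0 / beta

E = rng.normal(size=8)                  # hypothetical energies E(x; mu_t)
P = rng.random(8); P /= P.sum()         # hypothetical instantaneous P_t(x)
a = np.array([0, 0, 0, 1, 1, 1, 2, 2])  # assumed coarse-graining labels a(x)

phi = E + kBT * np.log(P)               # local free energy, Equation (24)

for lvl in range(3):
    m = (a == lvl)
    G = -kBT * np.log(np.sum(np.exp(-beta * E[m])))   # conformational free energy
    Phi = G + kBT * np.log(P[m].sum())                # Equation (26)
    lhs = np.sum((P[m] / P[m].sum()) * np.exp(-beta * phi[m]))
    assert np.isclose(lhs, np.exp(-beta * Phi))       # Equation (27)

print("Equation (27) holds for every coarse-grained state a.")
```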
We now prove the information exchange work fluctuation theorem. The local free energies $\phi_X$ and $\phi_Y$ for systems X and Y, respectively, at time $t$ with macrostate $\mu_t$ are defined as follows:
$$\phi_X(x_t, t) := E_X(x_t; \mu_t) - k_B T\, \xi_X[P_t(x_t)], \quad \phi_Y(y_t, t) := E_Y(y_t; \mu_t) - k_B T\, \xi_Y[P_t(y_t)], \tag{28}$$
where $T$ is the temperature of the heat bath, $k_B$ is the Boltzmann constant, $E_X$ and $E_Y$ are the internal energies of systems X and Y, respectively, and $\xi_X$ and $\xi_Y$ are the stochastic entropies of X and Y, respectively [2,3]. Work done on either one or both systems through process $\mu_t$ is expressed by the first law of thermodynamics as follows:
$$W := \Delta E + H, \tag{29}$$
where $\Delta E$ is the change in internal energy of the total system composed of X and Y. If we assume that systems X and Y are weakly coupled, in that the interaction energy between X and Y is negligible compared to the internal energies of X and Y, we may write
$$\Delta E := \Delta E_X + \Delta E_Y, \tag{30}$$
where $\Delta E_X := E_X(x_\tau, \tau) - E_X(x_0, 0)$ and $\Delta E_Y := E_Y(y_\tau, \tau) - E_Y(y_0, 0)$ [41]. We rewrite Equation (17) by adding and subtracting the changes of internal energy $\Delta E_X$ of X and $\Delta E_Y$ of Y as follows:
$$\frac{P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})}{P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})} = \exp\{-\beta (H + \Delta E_X + \Delta E_Y) + \beta \Delta E_X - \Delta\xi_X + \beta \Delta E_Y - \Delta\xi_Y\} \times \exp\{I_\tau(x_\tau, y_\tau) - I_0(x_0, y_0)\} \tag{31}$$
$$= \exp\{-\beta W + \beta \phi_X(x_\tau, \tau) - \beta \phi_X(x_0, 0) + \beta \phi_Y(y_\tau, \tau) - \beta \phi_Y(y_0, 0)\} \times \exp\{I_\tau(x_\tau, y_\tau) - I_0(x_0, y_0)\}, \tag{32}$$
where we have applied Equations (28)–(30) consecutively to Equation (31) to obtain Equation (32). Now we multiply both sides of Equation (32) by $e^{-\beta \phi_X(x_\tau, \tau) - \beta \phi_Y(y_\tau, \tau) - I_\tau(x_\tau, y_\tau)}$ and $P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})$, and integrate over all paths in $\mathcal{T}_{a,b}$ to obtain the following:
$$\left\langle e^{-\beta W - \beta \phi_X(x_0, 0) - \beta \phi_Y(y_0, 0) - I_0} \right\rangle_{a,b} := \int_{\{x_t\}, \{y_t\} \in \mathcal{T}_{a,b}} e^{-\beta W - \beta \phi_X(x_0, 0) - \beta \phi_Y(y_0, 0) - I_0}\, P_{\mathcal{T}_{a,b}}(\{x_t\}, \{y_t\})\, d\{x_t\}\, d\{y_t\} = \int_{\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \in \mathcal{T}_{a,b}} e^{-\beta \phi_X(x_\tau, \tau) - \beta \phi_Y(y_\tau, \tau) - I_\tau(x_\tau, y_\tau)}\, P^{\dagger}_{\mathcal{T}_{a,b}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})\, d\{x^{\dagger}_t\}\, d\{y^{\dagger}_t\}. \tag{33}$$
Noting Equation (20) again, we obtain
$$\begin{aligned} \left\langle e^{-\beta W - \beta \phi_X(x_0, 0) - \beta \phi_Y(y_0, 0) - I_0} \right\rangle_{a,b} &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} e^{-\beta \phi_X(x_\tau, \tau) - \beta \phi_Y(y_\tau, \tau) - I_\tau(x_\tau, y_\tau)}\, \frac{P_\tau(x_\tau, y_\tau)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau \\ &\quad \times \int_{\{x^{\dagger}_t\}, \{y^{\dagger}_t\} \in \mathcal{T}_{x_\tau, y_\tau}} P^{\dagger}_{\mathcal{T}_{x_\tau, y_\tau}}(\{x^{\dagger}_t\}, \{y^{\dagger}_t\})\, d\{x^{\dagger}_t\}\, d\{y^{\dagger}_t\} \\ &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} e^{-\beta \phi_X(x_\tau, \tau) - \beta \phi_Y(y_\tau, \tau)}\, \frac{P_\tau(x_\tau)\, P_\tau(y_\tau)}{P_\tau(x_\tau, y_\tau)} \cdot \frac{P_\tau(x_\tau, y_\tau)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau \\ &= \int_{a(x_\tau) = a,\; b(y_\tau) = b} e^{-\beta \phi_X(x_\tau, \tau) - \beta \phi_Y(y_\tau, \tau)}\, \frac{P_\tau(x_\tau)\, P_\tau(y_\tau)}{P_\tau(a)\, P_\tau(b)} \cdot \frac{P_\tau(a)\, P_\tau(b)}{P_\tau(a, b)}\, dx_\tau\, dy_\tau \\ &= e^{-I_\tau(a, b)} \int_{a(x_\tau) = a} e^{-\beta \phi_X(x_\tau, \tau)}\, \frac{P_\tau(x_\tau)}{P_\tau(a)}\, dx_\tau \int_{b(y_\tau) = b} e^{-\beta \phi_Y(y_\tau, \tau)}\, \frac{P_\tau(y_\tau)}{P_\tau(b)}\, dy_\tau \\ &= e^{-\beta \Phi_X(a, \tau) - \beta \Phi_Y(b, \tau) - I_\tau(a, b)}, \end{aligned} \tag{34}$$
which generalizes known relations in the literature [15,41,42,43,44,45]. The last equality follows from Equation (27). Under the initial equilibrium condition, Equation (34) reduces to Equation (5). We note that Equation (34) holds under the weak-coupling assumption between systems X and Y during process $\mu_t$, and that $\Phi_X$ and $\Phi_Y$ are non-equilibrium free energies, which differ from the equilibrium free energy appearing in similar relations in the literature [15,42,43,44,45].

4. Conclusions

In this study, we have rigorously established that the fluctuation theorem for information exchange remains invariant under coarse-graining transformations in coupled classical stochastic systems. Our theoretical framework demonstrates that the fundamental relationship between information exchange and entropy production is preserved across different scales of observation.
The key findings of our work include: (i) the derivation of coarse-grained fluctuation theorems that maintain their functional form under arbitrary coarse-graining transformations, (ii) the proof that pointwise mutual information between coarse-grained states captures entropy production within trajectory ensembles, and (iii) the establishment of hierarchical relationships between information measures at different observational scales.
These results have profound implications for understanding information processing in thermodynamic systems. The scale-independence of these fluctuation theorems establishes their fundamental character as universal principles governing energy-information conversion mechanisms, regardless of the level of system description. This universality is particularly significant for biological systems, where information processing occurs across multiple organizational scales, from molecular interactions to cellular signaling networks.
The robust theoretical framework developed here provides a foundation for analyzing the thermodynamics of dynamic molecular information processes and dynamic allosteric transitions in complex biological systems. Furthermore, our findings offer new perspectives on the thermodynamic constraints that govern information processing in classical stochastic systems, with potential applications ranging from molecular machinery design to understanding the fundamental limits of biological computation.

Funding

L.J. was supported by the National Research Foundation of Korea Grant funded by the Korean Government (RS-2016-NR017140), and in part by Kwangwoon University Research Grant in 2023.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jarzynski, C. Nonequilibrium equality for free energy differences. Phys. Rev. Lett. 1997, 78, 2690–2693. [Google Scholar] [CrossRef]
  2. Crooks, G.E. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Phys. Rev. E 1999, 60, 2721–2726. [Google Scholar] [CrossRef] [PubMed]
  3. Seifert, U. Entropy production along a stochastic trajectory and an integral fluctuation theorem. Phys. Rev. Lett. 2005, 95, 040602. [Google Scholar] [CrossRef] [PubMed]
  4. Hatano, T.; Sasa, S.i. Steady-state thermodynamics of Langevin systems. Phys. Rev. Lett. 2001, 86, 3463–3466. [Google Scholar] [CrossRef] [PubMed]
  5. Jinwoo, L.; Tanaka, H. Local non-equilibrium thermodynamics. Sci. Rep. 2015, 5, 7832. [Google Scholar] [CrossRef] [PubMed]
  6. Jinwoo, L. Roles of local nonequilibrium free energy in the description of biomolecules. Phys. Rev. E 2023, 107, 014402. [Google Scholar] [CrossRef] [PubMed]
  7. Hummer, G.; Szabo, A. Free energy reconstruction from nonequilibrium single-molecule pulling experiments. Proc. Nat. Acad. Sci. USA 2001, 98, 3658–3661. [Google Scholar] [CrossRef] [PubMed]
  8. Liphardt, J.; Onoa, B.; Smith, S.B.; Tinoco, I.; Bustamante, C. Reversible unfolding of single RNA molecules by mechanical force. Science 2001, 292, 733–737. [Google Scholar] [CrossRef] [PubMed]
  9. Liphardt, J.; Dumont, S.; Smith, S.; Tinoco Jr, I.; Bustamante, C. Equilibrium information from nonequilibrium measurements in an experimental test of Jarzynski’s equality. Science 2002, 296, 1832–1835. [Google Scholar] [CrossRef] [PubMed]
  10. Trepagnier, E.H.; Jarzynski, C.; Ritort, F.; Crooks, G.E.; Bustamante, C.J.; Liphardt, J. Experimental test of Hatano and Sasa’s nonequilibrium steady-state equality. Proc. Nat. Acad. Sci. USA 2004, 101, 15038–15041. [Google Scholar] [CrossRef] [PubMed]
  11. Collin, D.; Ritort, F.; Jarzynski, C.; Smith, S.B.; Tinoco, I.; Bustamante, C. Verification of the Crooks fluctuation theorem and recovery of RNA folding free energies. Nature 2005, 437, 231–234. [Google Scholar] [CrossRef] [PubMed]
  12. Alemany, A.; Mossa, A.; Junier, I.; Ritort, F. Experimental free-energy measurements of kinetic molecular states using fluctuation theorems. Nature Phys. 2012. [Google Scholar] [CrossRef]
  13. Ponmurugan, M. Generalized detailed fluctuation theorem under nonequilibrium feedback control. Phys. Rev. E 2010, 82, 031129. [Google Scholar] [CrossRef] [PubMed]
  14. Horowitz, J.M.; Vaikuntanathan, S. Nonequilibrium detailed fluctuation theorem for repeated discrete feedback. Phys. Rev. E 2010, 82, 061120. [Google Scholar] [CrossRef] [PubMed]
  15. Sagawa, T.; Ueda, M. Generalized Jarzynski equality under nonequilibrium feedback control. Phys. Rev. Lett. 2010, 104, 090602. [Google Scholar] [CrossRef] [PubMed]
  16. Horowitz, J.M.; Parrondo, J.M. Thermodynamic reversibility in feedback processes. EPL (Europhysics Letters) 2011, 95, 10005. [Google Scholar] [CrossRef]
  17. Sagawa, T.; Ueda, M. Fluctuation theorem with information exchange: Role of correlations in stochastic thermodynamics. Phys. Rev. Lett. 2012, 109, 180602. [Google Scholar] [CrossRef] [PubMed]
  18. Zeng, Q.; Wang, J. New fluctuation theorems on Maxwell's demon. Science Advances 2021, 7, eabf1807. [Google Scholar] [CrossRef] [PubMed]
  19. Yan, L.L.; Bu, J.T.; Zeng, Q.; Zhang, K.; Cui, K.F.; Zhou, F.; Su, S.L.; Chen, L.; Wang, J.; Chen, G.; et al. Experimental Verification of Demon-Involved Fluctuation Theorems. Phys. Rev. Lett. 2024, 133, 090402. [Google Scholar] [CrossRef] [PubMed]
  20. Hartwell, L.H.; Hopfield, J.J.; Leibler, S.; Murray, A.W. From molecular to modular cell biology. Nature 1999, 402, C47. [Google Scholar] [CrossRef] [PubMed]
  21. Crofts, A.R. Life, information, entropy, and time: vehicles for semantic inheritance. Complexity 2007, 13, 14–50. [Google Scholar] [CrossRef] [PubMed]
  22. Cheong, R.; Rhee, A.; Wang, C.J.; Nemenman, I.; Levchenko, A. Information transduction capacity of noisy biochemical signaling networks. Science 2011, 334, 354–358. [Google Scholar] [CrossRef] [PubMed]
  23. McGrath, T.; Jones, N.S.; ten Wolde, P.R.; Ouldridge, T.E. Biochemical Machines for the Interconversion of Mutual Information and Work. Phys. Rev. Lett. 2017, 118, 028101. [Google Scholar] [CrossRef] [PubMed]
  24. Ouldridge, T.E.; Govern, C.C.; ten Wolde, P.R. Thermodynamics of Computational Copying in Biochemical Systems. Phys. Rev. X 2017, 7, 021004. [Google Scholar] [CrossRef]
  25. Becker, N.B.; Mugler, A.; ten Wolde, P.R. Optimal Prediction by Cellular Signaling Networks. Phys. Rev. Lett. 2015, 115, 258103. [Google Scholar] [CrossRef] [PubMed]
  26. Cheng, F.; Liu, C.; Shen, B.; Zhao, Z. Investigating cellular network heterogeneity and modularity in cancer: a network entropy and unbalanced motif approach. BMC Systems Biology 2016, 10, 65. [Google Scholar] [CrossRef] [PubMed]
  27. Whitsett, J.A.; Guo, M.; Xu, Y.; Bao, E.L.; Wagner, M. SLICE: determining cell differentiation and lineage based on single cell entropy. Nucleic Acids Research 2016, 45, e54–e54. [Google Scholar] [CrossRef] [PubMed]
  28. Statistical Dynamics of Spatial-Order Formation by Communicating Cells. iScience 2018, 2, 27–40. [CrossRef] [PubMed]
  29. Maire, T.; Youk, H. Molecular-Level Tuning of Cellular Autonomy Controls the Collective Behaviors of Cell Populations. Cell Systems 2015, 1, 349–360. [Google Scholar] [CrossRef] [PubMed]
  30. Mehta, P.; Schwab, D.J. Energetic costs of cellular computation. Proc. Nat. Acad. Sci. USA 2012, 109, 17978–17982. [Google Scholar] [CrossRef] [PubMed]
  31. Govern, C.C.; ten Wolde, P.R. Energy dissipation and noise correlations in biochemical sensing. Phys. Rev. Lett. 2014, 113, 258102. [Google Scholar] [CrossRef] [PubMed]
  32. Jinwoo, L. Fluctuation Theorem of Information Exchange between Subsystems that Co-Evolve in Time. Symmetry 2019, 11, 433. [Google Scholar] [CrossRef]
  33. Jinwoo, L. Fluctuation Theorem of Information Exchange within an Ensemble of Paths Conditioned on Correlated-Microstates. Entropy 2019, 21. [Google Scholar] [CrossRef] [PubMed]
  34. Jarzynski, C. Equalities and inequalities: Irreversibility and the second law of thermodynamics at the nanoscale. Annu. Rev. Condens. Matter Phys. 2011, 2, 329–351. [Google Scholar] [CrossRef]
  35. Seifert, U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Rep. Prog. Phys. 2012, 75, 126001. [Google Scholar] [CrossRef] [PubMed]
  36. Spinney, R.; Ford, I. Fluctuation Relations: A Pedagogical Overview. In Nonequilibrium Statistical Physics of Small Systems; Wiley-VCH Verlag GmbH & Co. KGaA, 2013; pp. 3–56. [Google Scholar]
  37. Cover, T.M.; Thomas, J.A. Elements of information theory; John Wiley & Sons, 2012. [Google Scholar]
  38. Kurchan, J. Fluctuation theorem for stochastic dynamics. Journal of Physics A: Mathematical and General 1998, 31, 3719. [Google Scholar] [CrossRef]
  39. Maes, C. The fluctuation theorem as a Gibbs property. J. Stat. Phys. 1999, 95, 367–392. [Google Scholar] [CrossRef]
  40. Jarzynski, C. Hamiltonian derivation of a detailed fluctuation theorem. J. Stat. Phys. 2000, 98, 77–102. [Google Scholar] [CrossRef]
  41. Parrondo, J.M.; Horowitz, J.M.; Sagawa, T. Thermodynamics of information. Nature Phys. 2015, 11, 131–139. [Google Scholar] [CrossRef]
  42. Kawai, R.; Parrondo, J.M.R.; den Broeck, C.V. Dissipation: The phase-space perspective. Phys. Rev. Lett. 2007, 98, 080602. [Google Scholar] [CrossRef] [PubMed]
  43. Generalization of the second law for a transition between nonequilibrium states. Physics Letters A 2010, 375, 88–92. [CrossRef]
  44. Generalization of the second law for a nonequilibrium initial state. Physics Letters A 2010, 374, 1001–1004. [CrossRef]
  45. Esposito, M.; Van den Broeck, C. Second law and Landauer principle far from equilibrium. Europhys. Lett. 2011, 95, 40004. [Google Scholar] [CrossRef]
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.