Preprint (Review). This version is not peer-reviewed.

The Rate of Entropy Production in Biophysical-Chemical Systems

Submitted: 29 September 2023. Posted: 29 September 2023.


Abstract
An overview of the link between nonequilibrium thermodynamics and complexity theory is offered. It is shown how the rate of entropy production can be quantified through the spectrum of Lyapunov exponents, and how the entropy production per unit time meets the necessary and sufficient conditions to be a Lyapunov function, constituting per se an extremal principle. The entropy production fractal dimension conjecture is established. It is also shown how the rate of entropy production, as a non-extremal criterion, offers an alternative to the sensitivity analysis of differential equations. Finally, in an extension to biophysical-chemical systems, the use of the dissipation function as a thermodynamic potential out of equilibrium is shown in the characterization of biological phase transitions, and it is evidenced how the rate of entropy production represents a physical quantity with which to evaluate the complexity and robustness of cancer.
Subject: Physical Sciences - Biophysics

1. Introduction

The advent of so-called chaos theory [1] and, more recently, the developments in the sciences of complexity [2] have drastically changed the vision of science, particularly of the thermodynamics of irreversible processes.
Although in the linear region of irreversible processes there is a well-consolidated theory [3,4,5], in the nonlinear region a formalism is, on the one hand, still pending construction and, on the other, such a formalism should also incorporate complex phenomena.
A first approximation in this direction, linking the thermodynamics of irreversible processes with nonlinear dynamics, was elaborated in the seminal work of Prigogine et al. [6] under the name of "dissipative structures". In the 1990s, Beck and Schlögl published the book entitled "Thermodynamics of Chaotic Systems" [7], where an approach to the subject is made. Although a finished formalism is still far off, these works took the first steps in this direction.
At present, there is an extensive list of works in the literature that address the relationship between nonequilibrium thermodynamics and complex phenomena [8,9,10]. A thermodynamic formalism of complex phenomena should be able to provide answers to three fundamental aspects, namely: 1. formulate extremal principles for complex phenomena on a macroscopic scale; 2. establish methods to determine the stability of nonequilibrium states; 3. formalize criteria to characterize the complexity of natural systems at the macroscopic level.
This work aims to offer a unifying overview of the relationship between nonequilibrium thermodynamics and nonlinear dynamics which, even if far from establishing a finished formalism, serves as a starting point for what could constitute the theoretical bases of the "thermodynamics of complex phenomena". The work is structured as follows: in Section 2, the fundamental aspects of the formalism of the thermodynamics of irreversible processes in the linear region are summarized; an overview of the advances linking nonequilibrium thermodynamics and complex phenomena is given in Section 3; Section 4 provides an extension to biophysical-chemical systems; finally, the conclusions and remarks are presented in Section 5.

2. The Formalism of the Thermodynamics of Irreversible Processes in the Linear Region

Already in the seminal works of Onsager [11,12], de Groot and Mazur [4], and Prigogine [3], the bases of the thermodynamics of irreversible processes were established. This formalism rests on four fundamental pillars:
1. Accept as a fundamental postulate that the entropy production per unit time, $\frac{\delta S_i}{dt}$, is positive definite, that is,
$$\frac{\delta S_i}{dt} \equiv \dot{S}_i \geq 0; \qquad (1)$$
2. Validity of the Onsager reciprocity relations.
3. Fulfillment of the “local equilibrium” hypothesis.
4. The existence of linear relationships between flows and forces.
In this way, the fundamental expression of the Second Law can be generalized, as
$$\frac{dS_S}{dt} = \frac{\delta S_e}{dt} + \frac{\delta S_i}{dt}, \qquad (2)$$
where $\frac{dS_S}{dt} \equiv \dot{S}_S$ is the entropy rate of the system, $\frac{\delta S_e}{dt} \equiv \dot{S}_e$ is the rate of entropy exchange with the surroundings (the entropy flow), and $\frac{\delta S_i}{dt} \equiv \dot{S}_i$ is the rate of entropy production. Eq. (2) can be rewritten as
$$\dot{S}_S = \dot{S}_e + \dot{S}_i. \qquad (3)$$
Thus, the evolution criterion can be generalized as $\dot{S}_i > 0$, which constitutes one of the postulates on which the formalism of irreversible processes rests and the essence of the Second Law. Additionally, it gives a physical meaning to time, which has been coined in the literature as the Arrow of Time [13].
Sometimes it is convenient, as we will see later, to use instead of the rate of entropy production the so-called dissipation function introduced by Lord Rayleigh, $\Psi \equiv T\dot{S}_i$, since it converts the rate of entropy production into an out-of-equilibrium thermodynamic potential.
Formally, the rate of entropy production, $\dot{S}_i$, can be evaluated as
$$\dot{S}_i = \sum_k J_k X_k, \qquad (4)$$
where $J_k$ represents the generalized flows (e.g., heat flow, flow of substance, etc.) and $X_k$ the generalized forces, that is, the causes that give rise to the appearance of flows (gradients of temperature, of concentration, etc.).
Between the flows and the generalized forces a linear relationship can be established, known as the phenomenological or constitutive relationship [4], which was established empirically long before the formal structure of the thermodynamics of irreversible processes. Hence, we have
$$J_k = L_{kk} X_k, \qquad (5)$$
where $L_{kk}$ is known as a direct phenomenological coefficient, for example the thermal conductivity $\lambda$, the diffusion coefficient $D$, etc. The formal structure of the thermodynamics of linear irreversible processes rests on the validity of Eq. (5), that is, of the linear relationships between generalized forces and flows. When no such phenomenological relationship holds, we speak of the nonlinear region. It is important to highlight that linearity of a dynamical system should not be confused with the existence of a linear dependence between flows and generalized forces, Eq. (5).
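As a standard illustration (our own example, in the spirit of de Groot and Mazur [4], not an equation taken from the text above), Fourier's law of heat conduction can be cast in the form of Eq. (5) by choosing the gradient of the inverse temperature as the generalized force:
$$J_q = L_{qq} X_q = L_{qq}\,\nabla\!\left(\frac{1}{T}\right) = -\frac{L_{qq}}{T^2}\,\nabla T \equiv -\lambda\,\nabla T, \qquad L_{qq} = \lambda T^2 > 0,$$
so that the empirical conductivity $\lambda$ appears as a direct phenomenological coefficient.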
Of great importance are the coupling or interference processes [14], which are subject to the Curie symmetry principle [14]. For example, for any two processes coupled under the Curie principle we have
$$J_1 = L_{11}X_1 + L_{12}X_2, \qquad J_2 = L_{21}X_1 + L_{22}X_2, \qquad (6)$$
where $L_{11}, L_{22}$ are the direct phenomenological coefficients and $L_{12}, L_{21}$ are known as the cross phenomenological coefficients. As mentioned previously in point 2, concerning the so-called Onsager reciprocity principle, it holds that
$$L_{12} = L_{21}. \qquad (7)$$
In other words, the Onsager reciprocity principle [4,11,12] establishes that, whenever an appropriate choice is made for the flows $J_k$ and the forces $X_k$, the matrix of phenomenological coefficients is symmetric. Thus, substituting Eqs. (6) and (7) into Eq. (4), the rate of entropy production for the coupled processes is given by
$$\dot{S}_i = L_{11}X_1^2 + (L_{12}+L_{21})X_1X_2 + L_{22}X_2^2 = L_{11}X_1^2 + 2L_{12}X_1X_2 + L_{22}X_2^2 \geq 0. \qquad (8)$$
Eq. (8) is a positive semi-definite quadratic form by virtue of the Second Law. Linear algebra imposes restrictions on the phenomenological coefficients of Eq. (8); it must hold that
$$L_{11} > 0, \qquad L_{22} > 0, \qquad (L_{12}+L_{21})^2 < 4L_{11}L_{22}. \qquad (9)$$
That is, the direct coefficients are always positive magnitudes, while the cross coefficients can take any value, as long as the last inequality of Eq. (9) is satisfied.
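A minimal numerical sketch (with hypothetical coefficient values, not taken from the text) of how the constraints of Eq. (9) and the non-negativity of the quadratic form of Eq. (8) can be checked:

```python
import numpy as np

# Hypothetical matrix of phenomenological coefficients (Onsager: L12 = L21).
L = np.array([[2.0, 0.8],    # L11, L12
              [0.8, 1.5]])   # L21, L22

# Conditions of Eq. (9): positive direct coefficients and (L12+L21)^2 < 4*L11*L22.
ok = (L[0, 0] > 0) and (L[1, 1] > 0) and (L[0, 1] + L[1, 0]) ** 2 < 4 * L[0, 0] * L[1, 1]
print("Eq. (9) satisfied:", ok)

# Evaluate the quadratic form of Eq. (8), S_i_dot = X^T L X, for random forces;
# it should never be negative when Eq. (9) holds.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
S_i_dot = np.einsum("ni,ij,nj->n", X, L, X)
print("minimum sampled entropy production:", S_i_dot.min())
```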
The stationary states, also known as fixed points in the theory of dynamical systems [15], are states through which different physical, chemical, and biological processes transit [4,5] and are of special interest in the framework of the complexity sciences [2].
Formally, a dynamical system can be defined as the ordered pair $(E, T_t)$, where $E$ represents an appropriate manifold and $T_t$ is a one-parameter group of diffeomorphisms with parameter $t$, often represented by time. If one has an atlas of local charts for the manifold $E$, on those charts the dynamical system can be represented in the form $\dot{X}(t) = F(X(t))$, where $F$ is the vector field associated with the one-parameter group of diffeomorphisms.
The solution $X(t) = X_0$ is said to be an equilibrium position or a stationary state of the system if $F(X_0) = 0$. We further say that $X_0$ is an attractor of the system if, for any other solution $X(t)$ whose initial conditions are close enough to $X_0$, we have $X(t) \to X_0$ as $t \to \infty$.
From a nonequilibrium thermodynamics point of view [4], a stationary state is formally defined as a dynamic state for which, during a finite time, the state variables and the control parameters remain constant while dissipative flows persist, that is, $\dot{S}_i > 0$, in such a way that
$$\dot{S}_i = -\dot{S}_e. \qquad (10)$$
That is, entropy is exchanged with the surroundings, $\dot{S}_e$, at the same rate at which it is produced, $\dot{S}_i$, in such a way that $\dot{S}_S = 0$. Furthermore, steady states are characterized by the number $k$ of forces that remain constant; the stationary state is then said to be of order $k$ [4]. For instance, if in Eq. (8) a steady state of order one ($k = 1$) is assumed, with $X_2$ constant, Prigogine's theorem of minimum entropy production, or Prigogine's principle [14], is verified, which ensures the stability of the stationary state out of equilibrium and constitutes an extension of the stability criterion in the vicinity of equilibrium, the Gibbs-Duhem principle [16]. In this way, Prigogine's principle represents, in fact, an extremal principle, provided the linear relationships between flows and forces, Eq. (5), are fulfilled.
Glansdorff and Prigogine attempted to generalize Prigogine's principle through the so-called "general criterion of evolution" [17], showing how the rate of entropy production, Eq. (8), constitutes, from physics, a natural Lyapunov function [18]. According to the procedure proposed by Glansdorff and Prigogine, the entropy production per unit time $\dot{S}_i$ is identified with a Lyapunov function $V(x)$, $\dot{S}_i \equiv V(x)$, such that
$$\dot{S}_i \equiv V(x) \geq 0, \qquad \frac{d\dot{S}_i}{dt} \leq 0. \qquad (11)$$
The Eulerian derivative of the entropy production rate, Eq. (4), is given by
$$\frac{d\dot{S}_i}{dt} = \sum_k J_k\frac{dX_k}{dt} + \sum_k X_k\frac{dJ_k}{dt} = \frac{d_X(\dot{S}_i)}{dt} + \frac{d_J(\dot{S}_i)}{dt}. \qquad (12)$$
Considering Eq. (8) and substituting in Eq. (12), one has:
$$\frac{d_X(\dot{S}_i)}{dt} = J_1\frac{dX_1}{dt} + J_2\frac{dX_2}{dt}, \qquad \frac{d_J(\dot{S}_i)}{dt} = X_1\frac{dJ_1}{dt} + X_2\frac{dJ_2}{dt}. \qquad (13)$$
Taking into account Eqs. (6), (7), and (13), and substituting into Eq. (12), one obtains
$$\frac{d(\dot{S}_i)}{dt} = 2J_1\frac{dX_1}{dt} + 2J_2\frac{dX_2}{dt} = 2\frac{d_X(\dot{S}_i)}{dt}, \qquad \frac{1}{2}\frac{d(\dot{S}_i)}{dt} < 0. \qquad (14)$$
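For completeness, the equality $\frac{d_J(\dot{S}_i)}{dt} = \frac{d_X(\dot{S}_i)}{dt}$ used in Eq. (14) follows directly from the linear relations (6) and the reciprocity relation (7); a short worked step (added here for clarity, not part of the original derivation):
$$\frac{d_J(\dot{S}_i)}{dt} = X_1\!\left(L_{11}\frac{dX_1}{dt} + L_{12}\frac{dX_2}{dt}\right) + X_2\!\left(L_{21}\frac{dX_1}{dt} + L_{22}\frac{dX_2}{dt}\right) = (L_{11}X_1 + L_{21}X_2)\frac{dX_1}{dt} + (L_{12}X_1 + L_{22}X_2)\frac{dX_2}{dt} = J_1\frac{dX_1}{dt} + J_2\frac{dX_2}{dt} = \frac{d_X(\dot{S}_i)}{dt},$$
where the last step uses $L_{12} = L_{21}$.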
In this way it is demonstrated, Eq. (14), that the entropy production per unit time is a physical magnitude that constitutes per se a Lyapunov function, as long as there is a linear dependence between the flows and the generalized forces. As can be seen, the general criterion of evolution, Eq. (14), is restricted to the linear region of irreversible processes.

3. Thermodynamic Formalism of Complex Processes

As we commented at the beginning, unlike the formalism of the thermodynamics of irreversible processes in the linear region, where most of its precepts are consolidated, the formalism of the nonlinear region is still in the making; because of this, it is premature to speak of a finished formalism. That is why we intend to provide a broad overview of the subject and, above all, to articulate the thermodynamic formalism of irreversible processes with that of nonlinear dynamics, in such a way that it allows us to offer a thermodynamic approach to complex phenomena [19].
On the one hand, it is important to be clear about what we mean by complex; the complexity manifested by dynamical systems highlights the following general and essential aspects for understanding this phenomenon:
1. Complex should not be seen as synonymous with complicated: a system described by few degrees of freedom can exhibit high complexity during its evolution, while, on the contrary, a system that requires many degrees of freedom for its description, and which is therefore complicated, may or may not exhibit complex behavior.
2. Complexity manifests itself through the appearance of emergent properties. These are macroscopic observables that cannot always be deduced from the interaction rules that govern the evolution of the different components of the systems.
3. The dimension of the patterns, both temporal and spatial, is generally not an integer and is greater than its topological dimension; therefore, they are said to have a fractal dimension [2].
4. Complex processes described through deterministic dynamical systems often show a sensitive dependence on the initial conditions, a behavior that can be confused with stochastic processes and is known as deterministic chaos [20]. The most important consequence of this property is the impossibility of making long-term predictions about the evolution of the system. In other words, so-called Laplacian determinism collapses.
5. For a deterministic dynamic system to exhibit complex behavior, it must meet two fundamental requirements: that it be nonlinear and that feedback processes exist [21].
6. The fundamental mechanism that describes the emergent properties and complexity of a system is based on the occurrence of bifurcations [22,23], a dynamic analog of phase transitions. Bifurcations exhibit a universal character in terms of their phenomenology [24], which makes them independent of the characteristics of the system and represents a source of innovation and diversification, because it gives systems access to new types of solutions. Fluctuations, which have a microscopic origin, grow and amplify until they reach the macroscopic level, which leads to a breaking of the space/time symmetry, giving rise to self-organization outside of thermodynamic equilibrium, to the establishment of order and coherence on a macroscopic scale, and consequently to the appearance of complexity.
Hence, the term complex should not be seen as a synonym for complicated: dynamical systems self-organize temporally and/or spatially out of thermodynamic equilibrium, giving rise to what Prigogine coined Dissipative Structures [25] and, with them, to the manifestation of complex phenomena.
On the other hand, Seth Lloyd compiled an extensive, still incomplete list of ways to measure complexity [26], among which are the Shannon, Gibbs-Boltzmann, Rényi, Tsallis, and Kolmogorov-Sinai entropies and the fractal dimension, among others.
Concerning the thermodynamic formalism of irreversible processes, even today, there is a great controversy concerning Prigogine's Principle, the Principle of entropy production. According to Bruers [27], at least “six principles” can be mentioned: 1. Principle of minimum dissipation close to equilibrium; 2. Principle of minimum production of entropy near equilibrium; 3. Principle of maximum production of entropy near equilibrium; 4. Non-variational principle far from the equilibrium of maximum production of entropy; 5. Variational principle far from the equilibrium of maximum production of entropy; 6. Optimization of the principle of minimum production of entropy.
Chemical reactions constitute an ideal model to delve into the subject since, firstly, they can occur "close to or far" from thermodynamic equilibrium and, secondly, there is no linear relationship between the generalized flow, the rate of reaction $\dot{\xi}$, and the generalized force, the affinity divided by the temperature, $\frac{A}{T}$. Furthermore, their dynamics exhibit a wide range of temporal and/or spatial complexity [28], and the developed formalism can be extended to biological systems.
Briefly, we will show how it is possible to generalize, at least for chemical and biological processes, the "general criterion of evolution" of Glansdorff and Prigogine [17], demonstrating how the rate of entropy production is a Lyapunov function without requiring that the linear relationships between flows and forces hold.
At the end of the 19th century, in 1892, Lyapunov developed in his doctoral thesis [18] a mathematical method that allows one to determine the evolution and global stability of a dynamical system, based on what is now known as a Lyapunov function. Succinctly, we have that:
Let $p$ be a fixed point (steady state) of a flow $\frac{dx}{dt} \equiv \dot{x} = f(x)$ such that, for some neighborhood $N$ of $p$, the following conditions hold:
1. $V(x) > 0$ for every $x \neq p$ in $N$ and $V(p) = 0$;
2. the Eulerian derivative $\frac{dV(x)}{dt} \leq 0$ for every $x$ in $N$.
The function $V(x)$ is then called a Lyapunov function. Thus, it can be stated that for all $t \geq t_0$, $p$ is stable, and if $\frac{dV(x)}{dt} < 0$, the equilibrium position is asymptotically stable.
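As a standard textbook illustration (added here for clarity, not part of the original argument), consider the one-dimensional flow $\dot{x} = -x^3$ with fixed point $p = 0$ and the candidate function $V(x) = \frac{1}{2}x^2$:
$$V(x) = \tfrac{1}{2}x^2 > 0 \;\; (x \neq 0), \qquad V(0) = 0, \qquad \frac{dV}{dt} = x\,\dot{x} = -x^4 \leq 0,$$
so $V$ is a Lyapunov function and the origin is asymptotically stable, even though the system is nonlinear.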
On the one hand, we have shown that the entropy production per unit time, at least for chemical reactions, meets the necessary and sufficient conditions of a Lyapunov function [29] and in fact constitutes an extremal criterion per se, regardless of whether the network of chemical reactions is "near" or "far" from equilibrium. Recently, this has also been demonstrated for reaction-diffusion-type systems [30].
On the other hand, it was shown [31,32,33,34], by means of an Ansatz, that the rate of entropy production can be written as a functional of the control parameters of the dynamical system, $\Omega$, as
$$\dot{S}_i = f(\Omega) > 0. \qquad (15)$$
Thus, it is found that the Eulerian derivative of Eq. (15) satisfies
$$\frac{d\dot{S}_i}{dt} = \frac{\partial\dot{S}_i}{\partial\Omega}\frac{d\Omega}{dt} \leq 0. \qquad (16)$$
In this way, the acceleration of the entropy production rate, $\frac{d\dot{S}_i}{dt}$, constitutes per se a potential function out of equilibrium.
The works of Hoover and coworkers [35,36] and of Gaspard [37] showed that the rate of entropy production $\dot{S}_i$ is related to the spectrum of Lyapunov exponents $\lambda_j$ through the relationship
$$\frac{dS_i}{dt} \equiv \dot{S}_i \propto -\sum_j \lambda_j > 0. \qquad (17)$$
Eq. (17) establishes per se a natural link between the formalism of the thermodynamics of irreversible processes and nonlinear dynamics, regardless of whether the system evolves "close to" or "far from" thermodynamic equilibrium.
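A minimal computational sketch (our own illustration, using the classical Lorenz system rather than any model from the cited works) of how a Lyapunov spectrum can be estimated with the standard QR (Benettin-type) procedure and then used in the spirit of Eq. (17), where the negative of the sum of the exponents is taken as a quantity proportional to the entropy production rate:

```python
import numpy as np

# Sketch: estimate the Lyapunov spectrum of the Lorenz system and form
# -sum(lambda_j), the quantity that Eq. (17) relates to the entropy production
# rate of a dissipative flow. Integrator and step sizes are deliberately simple.

def lorenz(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - x[2]])

def jacobian(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([[-sigma, sigma, 0.0],
                     [rho - x[2], -1.0, -x[0]],
                     [x[1], x[0], -beta]])

def lyapunov_spectrum(x0, dt=1e-3, n_steps=200_000):
    x, Q = np.array(x0, dtype=float), np.eye(3)
    sums = np.zeros(3)
    for _ in range(n_steps):
        x = x + dt * lorenz(x)              # Euler step for the trajectory
        Q = Q + dt * jacobian(x) @ Q        # evolve a frame of tangent vectors
        Q, R = np.linalg.qr(Q)              # re-orthonormalize (Benettin/QR)
        sums += np.log(np.abs(np.diag(R)))  # accumulate local expansion rates
    return sums / (n_steps * dt)

lam = lyapunov_spectrum([1.0, 1.0, 1.0])
print("Lyapunov spectrum:", lam)                    # roughly (0.9, 0.0, -14.6)
print("-sum(lambda_j), cf. Eq. (17):", -lam.sum())  # positive for a dissipative system
```

For the Lorenz flow the sum of the exponents equals the constant phase-space contraction rate, $-(\sigma + 1 + \beta) \approx -13.7$, so the printed value should approach 13.7.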
It is known that, in order to determine the fundamental steps of a reaction mechanism, the sensitivity analysis of differential equations has been used successfully [38]. Edelson's pioneering work [39,40,41] allowed not only the identification of the fundamental steps of a mechanism but also its reduction. Turányi later applied the method to the famous Belousov-Zhabotinsky (BZ) reaction [42,43,44], achieving a drastic reduction of the GTF model mechanism from 81 to 42 steps.
As an alternative to sensitivity analysis, we proposed the use of the rate of entropy production, as a non-extremal criterion, in what we called the Method of Dominant Steps [34,45,46]. For this, we postulate that the steps exhibiting the largest values of entropy production are the fundamental ones of a reaction mechanism, for fixed values of the control parameters.
Consider a mechanism composed of n reaction steps and m species, represented by Eq. (18):
$$x_{1/i} \rightleftharpoons x_{2/i}, \quad \ldots, \quad x_{m-1/n} \rightleftharpoons x_{m/n}. \qquad (18)$$
Thus, the rate of entropy production of step n, with forward and reverse rates $\dot{\xi}_{+/n}$ and $\dot{\xi}_{-/n}$ and $R$ the gas constant, is given by
$$\dot{S}_{i/n} = R\left(\dot{\xi}_{+/n} - \dot{\xi}_{-/n}\right)\ln\frac{\dot{\xi}_{+/n}}{\dot{\xi}_{-/n}} \geq 0. \qquad (19)$$
Step n will be dominant compared with step n-1 if $\dot{S}_{i/n} > \dot{S}_{i/n-1}$. In this way, the rate of entropy production, as a non-extremal criterion, generalizes the maximum entropy production criterion later proposed by Martyushev and Seleznev [47] and constitutes a method complementary to the sensitivity analysis of differential equations.
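A minimal sketch (with purely hypothetical rates, not taken from any mechanism in the cited works) of how the steps of a mechanism can be ranked by their entropy production rate, Eq. (19), which is the essence of the Method of Dominant Steps:

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical forward and reverse rates for a three-step mechanism.
xi_fwd = np.array([1.2e-2, 5.0e-4, 3.1e-3])
xi_rev = np.array([1.0e-5, 4.9e-4, 2.0e-3])

# Eq. (19): S_i/n = R (xi+ - xi-) ln(xi+/xi-) >= 0 for every step.
S_i = R * (xi_fwd - xi_rev) * np.log(xi_fwd / xi_rev)

# Rank the steps; the top one is the "dominant" step for these control parameters.
for rank, n in enumerate(np.argsort(S_i)[::-1], start=1):
    print(f"rank {rank}: step {n + 1}, S_i = {S_i[n]:.3e}")
```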
The fractal dimension $D_f$ represents one of the most important properties of an attractor of a dynamical system and a way to estimate the complexity of spatiotemporal patterns from the geometric point of view [48], as we mentioned at the beginning of this section. Grassberger [49] proposed a generalization of the fractal dimension, the generalized fractal dimension $D_q$, as
$$D_q = \lim_{\varepsilon \to 0}\frac{S_q(R)}{\ln(1/\varepsilon)}, \qquad (20)$$
where $S_q(R)$ is the Rényi entropy [50]. From Eq. (20), three basic dimensions are obtained as particular cases, $D_0$, $D_1$, and $D_2$: the Hausdorff-Besicovitch fractal dimension $D_0$, the information dimension [51] $D_1 = \lim_{q \to 1} D_q$, and the correlation dimension $D_2$. In the case of fractals the three dimensions are approximately equal, while for multifractals it holds that $D_0 > D_1 > D_2$ [52].
An alternative and simple way to compute the fractal dimension of a dynamical system is through the spectrum of Lyapunov exponents $\lambda_j$; it is known as the Lyapunov dimension $D_L$ and is defined through the Kaplan-Yorke conjecture [53] as
$$D_L = j + \frac{\sum_{i=1}^{j}\lambda_i}{|\lambda_{j+1}|}, \qquad (21)$$
where $j$ is the largest integer for which $\lambda_1 + \lambda_2 + \cdots + \lambda_j \geq 0$. By analogy with Eq. (21), we established through an Ansatz the following conjecture, the fractal dimension of entropy production [54], defined as
$$D_{\dot{S}_i} = j + \frac{\dot{S}_i}{\left|\sum_{i=j+1}^{n}\lambda_i\right|}, \qquad (22)$$
where the entropy production per unit time $\dot{S}_i$ is evaluated through Eq. (17) and $n$ is the total number of Lyapunov exponents.
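The two dimensions can be compared directly on a given spectrum; a minimal sketch (using the approximate Lorenz exponents as a stand-in, not results from the cited works) that evaluates the Kaplan-Yorke dimension of Eq. (21) and the conjectured entropy-production fractal dimension of Eq. (22) as reconstructed above:

```python
import numpy as np

lam = np.array([0.906, 0.0, -14.572])   # Lyapunov spectrum, sorted in decreasing order

def kaplan_yorke(lam):
    csum = np.cumsum(lam)
    j = np.max(np.where(csum >= 0)[0]) + 1   # largest j with lambda_1 + ... + lambda_j >= 0
    return j + csum[j - 1] / abs(lam[j])     # Eq. (21)

def entropy_production_dimension(lam):
    csum = np.cumsum(lam)
    j = np.max(np.where(csum >= 0)[0]) + 1
    S_i_dot = -lam.sum()                     # Eq. (17), up to a proportionality factor
    return j + S_i_dot / abs(lam[j:].sum())  # Eq. (22)

print("D_L     =", kaplan_yorke(lam))                    # about 2.06 for these values
print("D_{S_i} =", entropy_production_dimension(lam))
```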

4. Extension to Biophysical-Chemical Systems

Finally, we will make a brief incursion from the thermodynamic formalism of complex processes to biological systems, particularly in the topic related to the emergence and evolution of cancer. Non-equilibrium thermodynamics has been successfully used in studies of longevity, aging, the origin of life, and in particular cancer [55,56,57,58,59,60,61,62,63,64,65,66,67,68,69].
We must start with a formal definition: "…cancer is a complex network of cells that have lost their specialization and control of growth and that appears through a 'biological phase transition', leading to spatiotemporal self-organization outside thermodynamic equilibrium, which exhibits high robustness, adaptability, complexity, and hierarchy, and which enables the creation of new information and learning capacity…" [70].
The diagnosis of the proliferative and invasive capacity of a tumor is a complicated issue, since these terms include many factors, of which we can highlight two fundamental ones: aggressiveness, which is related to the speed of tumor growth, and malignancy, the ability of the tumor to invade and infiltrate healthy tissue, associated with its morphological characteristics (roughness) [71].
The growth rate of the tumor, $\dot{\xi}$, is given by
$$\dot{\xi} = \dot{\xi}_m - \dot{\xi}_{ap}, \qquad (23)$$
where $\dot{\xi}_m$ and $\dot{\xi}_{ap}$ are the rates of mitosis (cell division) and apoptosis (programmed cell death), respectively. By analogy with Eq. (19), we can evaluate the entropy production per unit time $\dot{S}_i$ during the growth of a tumor [72] as
$$\dot{S}_i = R\left(\dot{\xi}_m - \dot{\xi}_{ap}\right)\ln\frac{\dot{\xi}_m}{\dot{\xi}_{ap}} \geq 0. \qquad (24)$$
On the other hand, we developed a method, based on knowing the rates of mitosis $\dot{\xi}_m$ and apoptosis $\dot{\xi}_{ap}$ [73], to quantify the morphological characteristics (roughness) of the tumor, that is, its malignancy, through the fractal dimension $D_f$, as
$$D_f = \frac{5\dot{\xi}_{ap} - \dot{\xi}_m}{\dot{\xi}_m + \dot{\xi}_{ap}}. \qquad (25)$$
Considering Eqs. (23) and (25), we can rewrite Eq. (24) in terms of the tumor growth rate $\dot{\xi}$ and the fractal dimension of the tumor $D_f$ as
$$\dot{S}_i = R\,\dot{\xi}\,\ln\left(\frac{5 - D_f}{1 + D_f}\right). \qquad (26)$$
In this way, an appropriate expression is obtained, Eq. (26), to evaluate the entropy production per unit time $\dot{S}_i$ during the emergence and evolution of cancer, one which relates two fundamental properties of tumors: aggressiveness and malignancy [72]. Thus, we can affirm that the entropy production per unit time represents a physical quantity with which to evaluate the complexity and robustness (the ability of a system to continue functioning in the face of internal or external perturbations or fluctuations) of cancer.
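A minimal sketch (with hypothetical mitosis and apoptosis rates, not clinical data) that evaluates Eqs. (23)-(26) and verifies that Eq. (24) and Eq. (26) give the same entropy production rate:

```python
import numpy as np

R = 8.314                    # gas constant, J mol^-1 K^-1
xi_m, xi_ap = 0.40, 0.25     # hypothetical mitosis and apoptosis rates

xi = xi_m - xi_ap                                    # Eq. (23): tumor growth rate
Df = (5 * xi_ap - xi_m) / (xi_m + xi_ap)             # Eq. (25): fractal dimension
S_i_24 = R * (xi_m - xi_ap) * np.log(xi_m / xi_ap)   # Eq. (24)
S_i_26 = R * xi * np.log((5 - Df) / (1 + Df))        # Eq. (26)

print(f"growth rate = {xi:.3f}, fractal dimension = {Df:.3f}")
print(f"S_i from Eq. (24): {S_i_24:.4f}   S_i from Eq. (26): {S_i_26:.4f}")
```

The agreement of the two values simply reflects that Eq. (26) is Eq. (24) rewritten through the identity $\frac{\dot{\xi}_m}{\dot{\xi}_{ap}} = \frac{5 - D_f}{1 + D_f}$ implied by Eq. (25).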
Landau's seminal work [74] proposes a theory of continuous phase transitions in which symmetry breaking occurs in the vicinity of the critical point. In correspondence with the formalism proposed by Landau, a potential function $\Phi$, known as the Landau potential, is defined. The Landau potential $\Phi$ is expressed in terms of the state variables that characterize the system, for example temperature and pressure, as well as of the so-called order parameter $\eta$, which is defined empirically.
To formalize out-of-equilibrium phase transitions during the emergence and evolution of cancer, which we have coined biological phase transitions [75], we selected the dissipation function, $\Psi \equiv T\dot{S}_i$, a non-equilibrium thermodynamic potential, as an analog of the Landau potential $\Phi$.
Thus, in the case of the emergence and evolution of cancer, a biological phase transition, the order parameter $\eta$ is chosen as the difference between the fractal dimension of healthy cells $D_f^H$ and the fractal dimension of tumor cells $D_f^T$, such that
$$\eta = D_f^H - D_f^T. \qquad (27)$$
Thus, at the critical point $P_C$ it holds that $\eta = 0$, while in any other "ordered" phase $\eta \neq 0$. Accordingly, the order parameter $\eta$ is called the degree of complexity [75].
Considering Eqs. (27) and (26), and making a power-series expansion of the dissipation function $\Psi$, assuming for simplicity that $D_f^H = 1$, one obtains
$$\Psi = 1.7856\,\dot{\xi} + 0.24151\,\dot{\xi}\,\eta^2 + 3.7736\times 10^{-2}\,\dot{\xi}\,\eta^4, \qquad \Psi(\dot{\xi}, d_f^C) = \Psi_0(\dot{\xi}, d_f^C) + \alpha(\dot{\xi}, d_f^C)\,\eta^2 + \beta(\dot{\xi}, d_f^C)\,\eta^4. \qquad (28)$$
Eq. (28) represents an out-of-equilibrium extension of Landau's theory and allows biological phase transitions to be formalized through non-equilibrium thermodynamics. In this way, we understand how the development of a primary tumor, from a microscopic level (avascular growth) to a macroscopic level (the vascular phase) and the subsequent appearance of metastases, is not simply the accumulation of malignant cells, but occurs through bifurcations, that is, through a biological phase transition [75,76,77,78,79,80,81,82].

5. Conclusions and Remarks

In summary, we have seen how non-equilibrium thermodynamics and nonlinear dynamics articulate coherently, which allows us to establish a formal path toward what could become the thermodynamics of complex processes. As essential aspects, it was shown that:
1. On the one hand, the rate of entropy production is a physical magnitude that represents a Lyapunov function per se, regardless of whether the dynamic system is close to or far from equilibrium, and therefore constitutes an extremal criterion.
2. On the other hand, the rate of entropy production constitutes a complementary method to the sensitivity analysis of differential equations and appears as a non-extremal criterion.
3. An extension of the formalism to biophysical-chemical systems, on the one hand, shows the use of the dissipation function, as a non-equilibrium thermodynamic potential, in the characterization of biological phase transitions.
4. On the other hand, it was evidenced how the rate of entropy production represents a physical magnitude to evaluate the complexity and robustness of cancer.

Author Contributions

Both authors contributed equally to the work.

Funding

JMNV's work was partially funded by PREI-DGAPA-UNAM-2022.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

Prof. Dr. Germinal Cocho and Prof. Dr. A. Alzola, in memoriam. One of the authors (JMNV) thanks the CEPHCIS of UNAM, Mexico, for the warm hospitality and the financial support from PREI-DGAPA-UNAM-2022.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schuster, H.G.; Just, W. Deterministic Chaos: An Introduction; Wiley-VCH: Weinheim, 2006. [Google Scholar]
  2. Nicolis, G.; Nicolis, C. Foundations of Complex Systems. Nonlinear Dynamics. Statistical Physics. Information and Prediction; World Scientific Publishing Co. Pte. Ltd: Singapore, 2007. [Google Scholar]
  3. Prigogine, I. Etude Thermodynamique des Phenomenes irreversibles, Theses d´agregation de l´Enseignement superieur de l´Universite Libre de Bruxelles, Dunod, Editeurs Paris y Editions Desoer Liege, 1947.
  4. De Groot, S.R.; Mazur, P. Non-Equilibrium Thermodynamics; North-Holland Publishing Company: Amsterdam, 1962. [Google Scholar]
  5. Katchalsky, A.; Curran, P. Non-Equilibrium Thermodynamics in Biophysics; Harvard University Press: Cambridge, 1965. [Google Scholar]
  6. Nicolis, G.; Prigogine, I. Self-Organization in Nonequilibrium systems; Wiley: New York, 1977. [Google Scholar]
  7. Beck, C.; Schlögl, F. Thermodynamics of Chaotic Systems: An Introduction; Cambridge University Press: New York, 1993. [Google Scholar]
  8. Gaspard, P.; Henneaux, M.; Lambert, F.; Editors. From dynamical systems theory to nonequilibrium thermodynamics. Symposium Henri Poincaré, Proceedings International Solvay Institutes for Physics and Chemistry, Brussels, 2007, 97-119.
  9. Nicolis, G.; De Decker, Y. Stochastic approach to irreversible thermodynamics. Chaos 2017, 27, 104615. [Google Scholar] [CrossRef] [PubMed]
  10. Nicolis, G.; Nicolis C. What can we learn from thermodynamics on stochastic and chaotic dynamics? in Stochastic and chaotic dynamics in the lakes; D. Broomhead, E. Luchinskaya, P. Mc Clintock and I. Mullin (eds), American Institute of Physics, 2000.
  11. Onsager, L. Reciprocal Relations in Irreversible Processes I. Physical Review 1931, 37, 405–426. [Google Scholar] [CrossRef]
  12. Onsager, L. Reciprocal Relations in Irreversible Processes II. Physical Review 1931, 38, 2265–2279. [Google Scholar] [CrossRef]
  13. Coveney, P.; Highfields R. The Arrow of Time: A Voyage Through Science to Solve Time’s Greatest Mystery, Fawcett, 1st Ed., 1991.
  14. Prigogine, I. Introduction to Thermodynamics of Irreversible Processes; Wiley: New York, 1961. [Google Scholar]
  15. Andronov, A.; Vitt, A.; Khaikin, S. Theory of Oscillators; Pergamon Press: Oxford, 1966. [Google Scholar]
  16. Kondepudi, D.; Prigogine, I. Modern Thermodynamics, From Heat Engines to Dissipative Structures; John Wiley & Sons, 1998. [Google Scholar]
  17. Glansdorff, P.; Prigogine, I. Thermodynamics of Structure, Stability and Fluctuations; Wiley: New York, 1971. [Google Scholar]
  18. Mawhin, J. The early reception in France of the work of Poincaré and Lyapunov in the qualitative theory of differential equations. Philosophia Scientiæ 1996, 1, 119–133. [Google Scholar]
  19. Mansilla, R. & Nieto-Villar, J.M. (coordinadores). La Termodinámica de los sistemas complejos; UNAM, 2017.
  20. Strogatz, S.H. Nonlinear dynamics and chaos; Westview Press: Boulder, 2000. [Google Scholar]
  21. Nieto-Villar, J.M.; Betancourt-Mar, J.; Izquierdo-Kulich, E.; Tejera E. Complejidad y Auto-organización en Patrones Naturales; editorial UH, 2013.
  22. Nicolis, G. Fluctuations Around Nonequilibrium States in Open Nonlinear Systems. J. Stat. Phys. 1972, 6, 195. [Google Scholar] [CrossRef]
  23. Nicolis, G.; Daems, D. Probabilistic and thermodynamic aspects of dynamical systems. Chaos: An Interdisciplinary Journal of Nonlinear Science 1998, 8, 311–320. [Google Scholar] [CrossRef]
  24. Kuznetsov, Y. A. Elements of applied bifurcation theory; vol. 112, Springer Science & Business Media, 2013.
  25. Prigogine, I. Time, Structure, and Fluctuations. Science 1975, 201, 777–785. [Google Scholar] [CrossRef]
  26. Lloyd, S. Measures of complexity: a nonexhaustive list. IEEE Control Systems 2001, 21, 7–8. [Google Scholar]
  27. Bruers, S. Classification and Discussion of Macroscopic Entropy Production Principles. <http://arxiv.org/abs/cond-mat/0604482>, 2006.
  28. Nieto-Villar, J. M.; Velarde, M. G. Chaos and Hyperchaos in a Model of the Belousov-Zhabotinsky Reaction in a Batch Reactor. Journal of Non-Equilibrium Thermodynamics, 2001, 25, 269–278. [Google Scholar] [CrossRef]
  29. Nieto-Villar, J.M.; Quintana, R.; Rieumont, J. Entropy Production Rate as a Lyapunov Function in Chemical Systems: Proof. Physica Scripta 2003, 68, 163–165. [Google Scholar] [CrossRef]
  30. Ledesma-Durán, A.; Santamaría-Holek, I. Energy and Entropy in Open and Irreversible Chemical Reaction–Diffusion Systems with Asymptotic Stability. Journal of Non-Equilibrium Thermodynamics 2022, 47, 311–328. [Google Scholar] [CrossRef]
  31. Nieto-Villar, J.M.; García, J.M.; Rieumont, J. Entropy Production Rate as an Evolutive Criteria in Chemical Systems. I. Oscillating Reactions. Physica Scripta 1995, 5, 30. [Google Scholar] [CrossRef]
  32. García, J.M.; Nieto-Villar, J.M.; Rieumont, J. Entropy Production Rate as an Evolutive Criteria in Chemical Systems. II. Chaotic Reactions. Physica Scripta 1996, 53, 643. [Google Scholar] [CrossRef]
  33. Nieto-Villar, J.M.; Izquierdo-Kulich, E.; Quintana, R.; Rieumont, J. Una aproximación del criterio evolutivo de Prigogine a sistemas químicos. Rev. Mex. Fis. 2013, 59, 527. [Google Scholar]
  34. Nieto-Villar, J.M.; Rieumont, J.; Mansilla, R. The entropy production rate a bridge between thermodynamics and chemical kinetics. Rev. Mex. de Fís. E 2022, 19, 010212. [Google Scholar] [CrossRef]
  35. Hoover, W. G.; Posch, H.A. Second-law irreversibility and phase-space dimensionality loss from time-reversible nonequilibrium steady-state Lyapunov spectra. Physical Review E. 1994, 49, 1913. [Google Scholar] [CrossRef]
  36. Hoover, W.G. Nosé–Hoover nonequilibrium dynamics and statistical mechanics. Molecular Simulation. 2007, 33, 13–19. [Google Scholar] [CrossRef]
  37. Gaspard, P. Time asymmetry in nonequilibrium statistical mechanics. Advances in Chemical Physics. 2007, 135, 83–134. [Google Scholar]
  38. Varma, A.; Morbidelli, M.; Wu, H. Parametric sensitivity in chemical systems; ed.; Cambridge University Press; 2005.
  39. Edelson, D.; Allara, D.L. A computational analysis of the alkane pyrolysis mechanism: Sensitivity analysis of individual reaction steps. International Journal of Chemical Kinetics. 1980, 12, 605–621. [Google Scholar] [CrossRef]
  40. Edelson, D.; Thomas, V.M. Sensitivity analysis of oscillating reactions. 1. The period of the Oregonator. The Journal of Physical Chemistry. 1981, 85, 1555–1558. [Google Scholar] [CrossRef]
  41. Edelson, D. Sensitivity analysis of proposed mechanisms for the Briggs-Rauscher oscillating reaction. The Journal of Physical Chemistry. 1983, 87, 1204–1208. [Google Scholar] [CrossRef]
  42. Turányi, T. Sensitivity analysis of complex kinetic systems. Tools and applications. Journal of mathematical chemistry. 1990, 5, 203–248. [Google Scholar] [CrossRef]
  43. Gyorgy, L.; Turányi, T.; Field, R. J. Mechanistic Details of the Oscillatory Belousov-Zhabotinsky Reaction. J. Phys. Chem. 1990, 94, 7162–7170. [Google Scholar] [CrossRef]
  44. Turányi, T.; Gyorgy, L.; Field, R. J. Analysis and Simplification of the GTF Model of the Belousov-Zhabotinsky Reaction. J. Phys. Chem. 1993, 97, 1931–1941. [Google Scholar] [CrossRef]
  45. Rieumont-Briones, J.; Nieto-Villar, J.M.; García, J.M. The Rate of Entropy Production as a Mean to Determine the Most Important Reaction Steps in Belousov-Zhabotinsky Reaction. Anales Química, International Edition, 1997, 93, 147–152. [Google Scholar]
  46. Nieto-Villar, J.M.; Velarde, M.G. Chaos and Hyperchaos in a Model of the Belousov-Zhabotinsky Reaction in a Batch Reactor. Journal of Non-Equilibrium Thermodynamics 2001, 25, 269–278. [Google Scholar] [CrossRef]
  47. Martyushev, L.M.; Seleznev, V.D. Maximum entropy production principle in physics, chemistry and biology. Phys Rep. 2006, 426, 1–45. [Google Scholar] [CrossRef]
  48. Farmer, J. D. Dimension, fractal measures, and chaotic dynamics. In Evolution of Order and Chaos: in Physics, Chemistry, and Biology; Proceedings of the International Symposium on Synergetics at Schloß Elmau, Bavaria, April 26–May 1, 1982; pp. 228–246.
  49. Grassberger, P.; Procaccia, I. Characterization of Strange Attractors. Physical Review Letters 1983, 50, 346–349. [Google Scholar] [CrossRef]
  50. Rényi, A. On measures of information and entropy. In Proceedings of the fourth Berkeley Symposium on Mathematics, Statistics and Probability. 1960, 1, 547–561. [Google Scholar]
  51. Farmer, J. D. Information dimension and the probabilistic structure of chaos. Zeitschrift für Naturforschung A 1982, 37, 1304–1326. [Google Scholar] [CrossRef]
  52. Farmer, J. D.; Ott, E.; Yorke, J. A. The dimension of chaotic attractors. Physica D: Nonlinear Phenomena, 1983, 7, 153–180. [Google Scholar] [CrossRef]
  53. Frederickson, P.; Kaplan, J. L.; Yorke, E. D.; Yorke, J. A. The Liapunov dimension of strange attractors. Journal of differential equations, 1983, 49, 185–207. [Google Scholar] [CrossRef]
  54. Betancourt-Mar, J. A.; Rodríguez-Ricard, M.; Mansilla, R.; Cocho, G.; Nieto-Villar, J.M. Entropy production: evolution criteria, robustness and fractal dimension. Rev. Mex. Fis 2016, 62, 164–167. [Google Scholar]
  55. Miquel, J.; Economos, A.C.; Johnson, J.E. A systems analysis—thermodynamic view of cellular and organismic aging. In Aging and Cell Function.; Springer US, 1984; pp. 247–280. [Google Scholar]
  56. Balmer, RT. Entropy and aging in biological systems. Chemical Engineering Communications. 1982, 17, 171–181. [Google Scholar] [CrossRef]
  57. Aoki, I. Entropy principle for human development, growth and aging. Journal of theoretical biology. 1991, 150, 215–223. [Google Scholar] [CrossRef]
  58. Nieto-Villar, J.M.; Rieumont, J.; Quintana, R.; Miquel, J. Thermodynamic approach to the aging process of biological systems. Revista CENIC Ciencias Químicas. 2003, 34, 149–157. [Google Scholar]
  59. Triana, L.; Cocho, G.; Mansilla, R.; Nieto-Villar, J.M. Entropy production as a physical pacemaker of lifespan in mole-rats. International Journal of Aging Research, 2018, 1, 22. [Google Scholar]
  60. Betancourt-Mar, J. A.; Mansilla, R.; Cocho, G.; Nieto-Villar, J.M. On the relationship between aging & cancer. MOJ Gerontol Ger, 2018, 3, 163–168. [Google Scholar]
  61. Montemayor-Aldrete, J. A.; Márquez-Caballé, R. F.; del Castillo-Mussot, M.; Cruz-Peregrino, F. General Thermodynamic Efficiency Loss and Scaling Behavior of Eukaryotic Organisms. Biophysical Reviews and Letters 2020, 15, 143–169. [Google Scholar] [CrossRef]
  62. Nieto-Villar, J. M.; Mansilla, R. Longevity, Aging and Cancer: Thermodynamics and Complexity. Foundations. 2022, 2, 664–680. [Google Scholar] [CrossRef]
  63. Michaelian, K. Non-equilibrium thermodynamic foundations of the origin of life. Foundations, 2022, 2, 308–337. [Google Scholar] [CrossRef]
  64. Molnar, J.; et al. Thermodynamic aspects of cancer: possible role of negative entropy in tumor growth, its relation to kinetic and genetic resistance. Letters in Drug Design & Discovery 2005, 26, 429–438. [Google Scholar]
  65. Luo, L. Entropy production in a cell and reversal of entropy flow as an anticancer therapy. Front. Phys. China 2009, 4, 122–136. [Google Scholar] [CrossRef]
  66. Lucia, U. Entropy generation and cell growth with comments for a thermodynamic anticancer approach. Physica A 2014, 406, 107–118. [Google Scholar] [CrossRef]
  67. Lucia, U.; Ponzetto, A.; and Deisboeck, T.S. A thermodynamic approach to the ‘mitosis/apoptosis’ ratio in cancer. Physica A: Statistical Mechanics and its Applications 2015, 436, 246–255. [Google Scholar] [CrossRef]
  68. Marín, D.; Sabater, B. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle. Physical Biology 2017, 14, 024001. [Google Scholar] [CrossRef]
  69. Miranda, L. M.; Souza, A. M. Fractality in tumor growth at the avascular stage from a generalization of the logistic-Gompertz dynamics. Physica A: Statistical Mechanics and its Applications 2023, 128664. [Google Scholar] [CrossRef]
  70. Montero, S.; Martin, R.; Mansilla, R.; Cocho, G.; Nieto-Villar, J.M. Parameters Estimation in Phase-Space Landscape Reconstruction of Cell Fate: A Systems Biology Approach. Systems Biology. 2018, 125–170. [Google Scholar]
  71. Norton, L. Conceptual and Practical Implications of Breast Tissue Geometry: Toward a More Effective, Less Toxic Therapy. Oncologist 2005, 10, 370. [Google Scholar] [CrossRef]
  72. Izquierdo-Kulich, E.; Alonso-Becerra, E.; Nieto-Villar, J.M. Entropy production rate for avascular tumor growth. Journal of Modern Physics. 2011, 2, 615. [Google Scholar] [CrossRef]
  73. Izquierdo-Kulich, E.; Nieto-Villar, J. M. Morphogenesis and complexity of the tumor patterns. In Without Bounds: A Scientific Canvas of Nonlinearity and Complex Dynamics. 2013, 657-691.
  74. Landau, L.D.; Lifshitz, E.M. Curso de Física Teórica, Física Estadística, Vol. 5, Reverté, México, 1964.
  75. Betancourt-Mar, J.A.; Llanos-Pérez, J.A.; Cocho, G.; Mansilla, R.; Martin, R.; Montero, S.; Nieto-Villar, J.M. Phase transitions in tumor growth: IV relationship between metabolic rate and fractal dimension of human tumor cells. Physica A 2017, 473, 344. [Google Scholar] [CrossRef]
  76. Izquierdo-Kulich, E.; Rebelo, I.; Tejera, E.; Nieto-Villar, J. M. Phase transition in tumor growth: I avascular development. Physica A: Statistical Mechanics and its Applications 2013, 392, 6616–6623. [Google Scholar] [CrossRef]
  77. Llanos-Pérez, J. A.; Betancourt-Mar, A.; De Miguel, M. P.; Izquierdo-Kulich, E.; Royuela-García, M.; Tejera, E.; Nieto-Villar, J. M. Phase transitions in tumor growth: II prostate cancer cell lines. Physica A: Statistical Mechanics and its Applications 2015, 426, 88–92. [Google Scholar] [CrossRef]
  78. Llanos-Pérez, J. A.; Betancourt-Mar, J. A.; Cocho, G.; Mansilla, R.; Nieto-Villar, J. M. Phase transitions in tumor growth: III vascular and metastasis behavior. Physica A: Statistical Mechanics and its Applications 2016, 462, 560–568. [Google Scholar] [CrossRef]
  79. Martin, R. R.; Montero, S.; Silva, E.; Bizzarri, M.; Cocho, G.; Mansilla, R.; Nieto-Villar, J. M. Phase transitions in tumor growth: V what can be expected from cancer glycolytic oscillations? Physica A: Statistical Mechanics and its Applications 2017, 486, 762–771. [Google Scholar] [CrossRef]
  80. Guerra, A.; et al. Phase transitions in tumor growth VI: Epithelial–Mesenchymal transition. Physica A: Statistical Mechanics and its Applications 2018, 499, 208–215. [Google Scholar] [CrossRef]
  81. Betancourt-Padron, P. J.; García-Medina, K.; Mansilla, R.; Nieto-Villar, J. M. Phase transition in tumor growth VIII: The spatiotemporal of avascular evolution. Revista Mexicana de Física 2020, 66, 856–862. [Google Scholar] [CrossRef]
  82. Nieto-Villar, J.M.; Mansilla, R. Ferroptosis as a Biological Phase Transition I: avascular and vascular tumor growth. European Journal of Biomedical and Pharmaceutical Sciences 2021, 8, 63–70. [Google Scholar]