Preprint
Article

This version is not peer-reviewed.

Kappa-Frameshift Background Mutations and Long-Range Correlations of the DNA Base Sequences

Submitted:

16 December 2025

Posted:

17 December 2025


Abstract
Following Livadiotis G. and McComas D. J. (2023) [1], we propose a new type of DNA frameshift mutations that occur spontaneously due to information exchange between the DNA sequence of length bases (n) and the mutation sequence of length bases (m), and respect the kappa-addition symbol ⊕κ. We call these proposed mutations Kappa-Frameshift Background (KFB) mutations. We find entropy defects originate in the interdependence of the information length systems (or their interconnectedness, that is, the state in which systems with a significant number of constituents (information length bases) depend on, or are connected with each) by the proposed KFB-mutation). We also quantify the correlation among DNA information length bases (n) and (m) due to information exchange. In the presence of entropy defects, the Landauer’s bound and minimal metabolic rate for a biological system are modified. We observe that the different n and κ scales are manifested in the double evolutionary emergence of the proposed biological system through subsystems correlations. For specific values of the kappa parameter we can expect deterministic laws associated with a single biological polymer in the short term before the polymer explores over time all the possible ways it can exist.

1. Introduction

Informational entropy analysis is often used in biology, but its meaning should be distinguished from that of thermodynamic entropy in molecular biology and evolution [2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26]. The genetic information in DNA supplies straightforward examples for informational entropy concepts. The characteristic nucleotide sequence of the four bases (Adenine A, Guanine G, Cytosine C, and Thymine T) in an organism's DNA is the same in all cells of the organism, and results from the combination of random mutations and functional selection during biological evolution. There is no preference for a specific sequence; 4^n different DNA sequences are equally possible, and the actual DNA sequence has only a 4^(−n) probability p (where n is the number of bases in the DNA sequence, ranging from about one million in bacteria to billions in many animals and plants). Therefore, the space of possible molecular configurations of molecules within an organism is astronomically large. We may assume that the system is "non-ergodic" in the sense that not every part of the configuration space can be explored [3,27,28,29,30]. There are several major reasons to expect convergence [3]. First, all evolutionary trajectories occur under certain generic selective constraints, such as the laws of mathematics, physics, and chemistry; these should lead to some universal features [31,32]. For instance, a novel quantum system (quantum dots) recently proposed by the author (2024) [33] would, due to the system's peculiar quantum state (a Diophantine Markov state), be trapped in a sequence of self-replicating states and would never reach a classical limit, no matter how big the cube (since gaps between states are always finite). Second, many of these spaces are not explored randomly [34,35,36,37,38,39,40,41,42,43]. Finally, evolutionary trajectories happen within a certain system: Darwinian evolution will act on populations of entities.
Entities within the system emerge within assembly spaces [44,45,46]. They are constructed from components over evolutionary timescales. These shape evolution and shift the way one should think about sequence search conditioned on the past, where the logic of assembly spaces may lead to certain types of universal convergence [3,46].
DNA replication can lead to the introduction of small insertions or deletions. The addition or removal of one or more base pairs m leads to insertion or deletion mutations, respectively. As shown in [47], the loss or addition of a single nucleotide causes all of the subsequent three-letter codons to be changed [48]. These mutations are called frameshift mutations because the frame of the triplet reading is altered during translation. A frameshift mutation occurs when any number of bases m are added or deleted, except multiples of three, which reestablish the initial reading frame.
Kappa-distributions emerge within the framework of statistical mechanics by maximizing the associated kappa-entropy S_κ under the constraints of the canonical ensemble (e.g., Livadiotis G. and McComas D. J. (2009, 2014, 2023) [1,49,50]). Kappa-distributions are consistent with thermodynamics; yet this fact alone cannot justify the generation and existence of these distributions. Examples of kappa-distributions include super-statistics [51,52,53,54,55,56], the effect of shock waves [57], turbulence [58,59,60], the effect of pickup ions [61,62,63], the pump acceleration mechanism [64], colloidal particles [65], and polytropic behavior [66,67]. For all these phenomena, there is a unique thermodynamic origin. The existence of correlations among the particles of a system does not permit the usage of the classical framework based on the Boltzmann [68]–Gibbs [69] (BG) entropy and the Maxwell–Boltzmann (MB) [70] canonical distribution of particle velocity u. The thermodynamic origin of kappa-distributions has recently been connected to the new concept of "entropy defect", proposed by Livadiotis G. and McComas D. J. (2021) [71,72]. Recently, Livadiotis and McComas (2023) [1] introduced a specific algebra of the addition rule with entropy defect. The addition rule forms a mathematical Abelian group on the set of any measurable physical quantity (including entropy), called the kappa-addition and denoted by the symbol ⊕κ. The physical meaning of the kappa index can be understood through particle (subsystem) correlations: the kappa index provides a measure of the loss (for κ > 0) or gain (for κ < 0) of entropy ΔS_system after the mixing of the partial systems A and B to compose the combined system [1].
Following Livadiotis and McComas (2023) [1], we propose a new type of DNA frameshift mutation that occurs spontaneously due to information exchange between the DNA sequence of base length n and the mutation sequence of base length m, and respects the kappa-addition symbol ⊕κ. We call these proposed mutations Kappa-Frameshift Background (KFB) mutations. We find that the origin of an entropy defect is based on the interdependence of the information-length systems (their interconnectedness, that is, the dependence, or connection, of systems with a significant number of constituents, the information-length bases, on/with each other). The proposed KFB mutation quantifies the correlations among the information-length bases, or constituents of these systems, due to information exchanges. In the presence of entropy defects, Landauer's bound and the minimal metabolic rate for a biological system are modified. We observe that the different n and κ scales are manifested in the double appearance of the evolution of the proposed biological system through subsystem correlations. In this evolution, for short time scales and for specific values of the kappa parameter, we can expect deterministic laws associated with a single biological polymer.

2. Kappa-Frameshift Background DNA Mutations

DNA replication can lead to the introduction of small insertions or deletions of base pairs. The addition or removal of one or more base pairs m results in insertion or deletion mutations, respectively. As shown in [47], the loss or addition of a single nucleotide causes all of the subsequent three-letter codons to be changed [48]. These mutations are called frameshift mutations because they alter the frame of the triplet reading during translation. Spontaneous DNA mutations occur suddenly and their origin is unknown. Such mutations, also called “background mutations”, have been reported in many organisms, such as Oenothera, maize, bread molds, microorganisms (bacteria and viruses), Drosophila, mice, humans, etc. (see Ref [48] for details). A frameshift mutation occurs when any number of bases m are added or deleted, except multiples of three which reestablish the initial reading frame. Mutations of multiples of three nucleotides just add one or several reading frames into the sequence without altering the composition of those reading frames thereafter. It is possible, however, that the frameshift causes early termination of the translation (e.g., when a 3-nucleotide stop codon sequence is introduced). The results of frameshift mutations can, therefore, be very severe if the mutations occur early in the coding sequence. Frameshift mutations can occur at microsatellites (tandem repeats of 1–6 base pairs per repeat unit) and also at non-iterated or short repetitive sequences. Proofreading can remove insertion or deletion mutations in short repetitive and non-iterated sequences with almost the same efficiency as that for point mutations; however, it is much less efficient in removing frameshift mutations in homopolymeric runs [47]. These mutations are removed efficiently by mismatch repair systems. The instability of microsatellite sequences is associated with many diseases, including cancer [48].
Following Livadiotis and McComas (2023) [1], we propose a new type of DNA frameshift mutation that occurs spontaneously due to information exchanges between the DNA sequence of base length n and the mutation sequence of base length m, and respects the kappa-addition symbol ⊕κ. We call the proposed mutations Kappa-Frameshift Background (KFB) mutations. KFB mutations will occur when any number of bases m are added or deleted, to or from the initial DNA sequence n, respecting the kappa-addition symbol ⊕κ (except multiples of three, which reestablish the initial reading frame). In the case of the kappa-addition of any number of bases m to the initial DNA base sequence n, we define the set of kappa-natural numbers (ℕ, ⊕κ) as the set of natural numbers that satisfies the kappa-addition law ⊕κ; then the proposed KFB mutations are given as follows:
n ⊕κ m = n + m − nm/κ,   (nm no multiples of three, and κ ∈ ℝ*),   (1)
which brings the usual sum as a particular case, ⊕κ ≡ +, for κ → ∞. Relation (1) can be expressed in the convenient product form [1]:
1 − (n ⊕κ m)/κ = (1 − n/κ)(1 − m/κ),   (nm no multiples of three, and κ ∈ ℝ*),   (2)
where n is the initial length of information (string length) of the DNA or biological polymer that is being copied and m (no multiples of three) is the length of information (string length) of the mutation added to the initial base sequence n. The κ-sum is commutative (n ⊕κ m = m ⊕κ n) and associative (n ⊕κ (m ⊕κ z) = (n ⊕κ m) ⊕κ z), but it is not distributive with respect to the usual multiplication (α(n ⊕κ m) ≠ αn ⊕κ αm). The neutral element of the κ-sum is zero, n ⊕κ 0 = n. Similarly, in the case of κ-deletion of any number of bases m (no multiples of three) from the initial DNA base sequence n, the proposed KFB mutation is given by the following:
n ⊕κ m̄ = (n − m)/(1 − m/κ),   (m ≤ n, no multiples of three, κ ∈ ℝ*)   (3)
Equation (3) is the κ-subtraction, the inverse operation of the κ-addition (1): physically, it means that some base(s) are deducted from the initial DNA or biological polymer. The κ-difference obeys n ⊕κ m̄ = m̄ ⊕κ n and n ⊕κ (m̄ ⊕κ z) = (n ⊕κ m̄) ⊕κ z, but α(n ⊕κ m̄) ≠ αn ⊕κ αm̄. For both κ-addition and κ-subtraction to the initial base sequence n of any bases m which are multiples of three, we obtain the KFB changes of the information length of the DNA or biological polymer being copied. Then the KFB mutation just adds one or several reading frames into the sequence without altering the composition of those reading frames thereafter.
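As a minimal numerical sketch, the κ-addition and κ-subtraction rules of Equations (1) and (3) can be checked directly; the helper names kappa_add and kappa_sub are illustrative, not from the paper:

```python
# Numerical sketch of the kappa-addition and kappa-subtraction rules
# (Equations (1) and (3)); kappa_add and kappa_sub are illustrative names.

def kappa_add(n, m, kappa):
    """n (+)_kappa m = n + m - n*m/kappa  (Equation (1))."""
    return n + m - n * m / kappa

def kappa_sub(n, m, kappa):
    """n (-)_kappa m = (n - m) / (1 - m/kappa)  (Equation (3))."""
    return (n - m) / (1 - m / kappa)

kappa = 100.0
n, m, z = 10.0, 4.0, 7.0

# Commutative and associative, with neutral element zero.
assert kappa_add(n, m, kappa) == kappa_add(m, n, kappa)
a = kappa_add(n, kappa_add(m, z, kappa), kappa)
b = kappa_add(kappa_add(n, m, kappa), z, kappa)
assert abs(a - b) < 1e-9
assert kappa_add(n, 0.0, kappa) == n

# kappa-subtraction inverts kappa-addition.
assert abs(kappa_sub(kappa_add(n, m, kappa), m, kappa) - n) < 1e-9

# As kappa -> infinity the ordinary sum is recovered.
assert abs(kappa_add(n, m, 1e12) - (n + m)) < 1e-6
```

The same check confirms that the κ-sum is not distributive over ordinary multiplication, since scaling n and m rescales the nm/κ term quadratically.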
The meaning of the kappa index in the proposed KFB mutation can be understood through the subsystem (sub-base) correlations between n and m due to information exchanges within a given length of information (string length) in the DNA or biological polymer being copied. In fact, a simple relation exists between the Pearson correlation coefficient R and the kappa index κ:
R(κ) = −dκ/(2κ),   κ ∈ ℝ*,   (4)

for sub-bases with dκ degrees of freedom ([71,72,73]; [74], Chapter 5).
The largest value of kappa, κ → ∞, corresponds to a DNA or biological polymer characterized by the absence of any correlations between the sub-bases of n and m:

n ⊕∞ m = n + m,   (nm no multiples of three)   (5)
This condition is the ordinary frameshift mutation. The smallest possible kappa value, κ_min = dκ/2, corresponds to the state furthest from the ordinary frameshift mutation of a DNA or biological polymer, and is characterized by the strongest correlation between n and m due to information exchange. The parameter κ provides a measure of the loss (κ > 0) or gain (κ < 0) of information after the mixing of the partial bases n and m to compose the combined final DNA or biological polymer.
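Assuming the reconstructed relation R(κ) = −dκ/(2κ) of Equation (4), the two limiting cases above can be verified numerically; pearson_R is an illustrative name, not from the paper:

```python
# Sketch of the correlation-kappa relation (Equation (4)), assuming
# R(kappa) = -d_kappa / (2 * kappa); pearson_R is an illustrative name.

def pearson_R(kappa, d_kappa=2):
    """Pearson correlation coefficient as a function of the kappa index."""
    return -d_kappa / (2.0 * kappa)

d = 2
kappa_min = d / 2.0               # smallest kappa: strongest correlation
assert pearson_R(kappa_min, d) == -1.0

# kappa -> infinity: no correlation, the ordinary frameshift mutation.
assert abs(pearson_R(1e9, d)) < 1e-8
```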

3. Information Defects in the DNA or Biological Polymer

Following Ricard Solé et al. (2024) [3], the size of the sequence space Ω(n) associated with a biological polymer of length n, built from a molecular alphabet Σ = {A₁, A₂, A₃, …, A_N} of size |Σ| = N (where N ∈ ℕ), is:
Ω(n) = |Σ|^n = N^n.   (6)
The space of possible proteins with a base length n of 1000 amino acids is Ω = 20^1000, a space so large that it could not be explored [27]. The space of possible molecular configurations within an organism is astronomically large. The probability p for such a protein to exist in nature is p(Ω) = 20^(−1000). We may assume that the biological system is "non-ergodic" in the sense that not every part of the configuration space can be explored [28,29,30]. (In statistical physics, an ergodic system is one which, over sufficiently long time scales, explores all possible microstates that are consistent with its macroscopic properties.) Mathematically, for any physical observable O(q), where q is the microstate that specifies all of the system's coordinates (q = (q₁, q₂, q₃, …)), the long-time average of O(q) converges to the ensemble average, so
⟨O⟩ = lim_{τ→∞} (1/τ) ∫₀^τ O(q(t)) dt.   (7)
In intuitive terms, an ergodic system explores (over time) all the possible ways it can exist. Given enough time, the system will visit all its available configurations or states [3]. There are several major reasons to expect convergence [3]. Firstly, all evolutionary trajectories occur under certain generic selective constraints, such as the laws of mathematics, physics, and chemistry. These constraints should lead to universal features [31,32]. Secondly, many of these states are not explored randomly. Finally, evolutionary trajectories happen within a certain system: Darwinian evolution acts on populations of entities, which emerge within the system in assembly spaces [44,45,46].
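The sizes quoted above can be checked on a log scale; omega is an illustrative helper name, not from the paper:

```python
# Sequence-space size (Equation (6)) for short strings, and the log-scale
# size of the 1000-residue protein space quoted in the text.
import math

def omega(n, N):
    """Omega(n) = N**n possible strings of length n over an alphabet of size N."""
    return N ** n

assert omega(3, 4) == 64          # 4^3 codon-like triplets

# log10 of 20^1000: the protein space spans about 1301 decimal digits,
# far beyond what any physical search could enumerate.
log10_omega = 1000 * math.log10(20)
assert 1300 < log10_omega < 1302
```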
The second law of thermodynamics states that a physical process must generate an overall increase Δ S t o t a l in the entropy of a system ( Δ S s y s t e m ) and its environment ( Δ S e n v ):
ΔS_total = ΔS_system + ΔS_env.   (8)
Therefore, living systems can only perform processes that reduce entropy internally if, at the same time, they produce an even greater increase in entropy in the environment. Thus, the universal thermodynamic logic of life is that low-entropy input (“resource”) is turned into high-entropy output (“waste”), which can be expressed as:
ΔS_system < 0,   ΔS_env > 0,   |ΔS_env| > |ΔS_system|,   (9)
where

ΔS_env = Q/(k_B T),   (10)

is the entropy production in the environment, Q is the amount of heat released, T is the temperature of the thermal reservoir, and k_B = 1.38 × 10⁻²³ J/K is Boltzmann's constant.
This suggests that all life forms face the problem of maintaining low internal entropy in a race against the second law of thermodynamics.
In non-equilibrium thermodynamics [3,75], the rate of entropy production per unit time τ can be expressed as an integral:
ΔS_total/τ = ∫_V σ dV,   (11)
for a system of volume V , with the specific entropy production rate, σ , written as follows:
σ = Σ_{k=1}^{n} X_k J_k.   (12)
In Equation (12), {X₁, X₂, X₃, …, X_n} indicates a set of "forces" (such as temperature gradients or chemical affinities) and {J₁, J₂, J₃, …, J_n} is the set of conjugate fluxes (e.g., heat flows, reaction rates). According to Shannon's formula [2,22,23,24], the evolution of the base sequences of modern organisms from an unspecified base sequence resulted in a gain of information (a decrease in entropy: negative ΔS_system), as follows:
ΔS_system(n) = S_f(n) − S_i(n).   (13)
For the case of writing a specific string from a set of unordered letters (n = 0), we have:

S_f(n) = ln(N^n) = n ln N = 0 (if and only if n = 0).   (14)

Then

ΔS_system(n) = −S_i(n) = −ln(N^n) = −n ln N,   (15)
where N = |Σ| (with N ∈ ℕ) is the number of unique elements, or letters, in the informational system, and n is the length of information (the string length) that is being copied.
Let us now examine the possible evolutionary events resulting from the combination of the proposed KFB mutations and functional selection during biological evolution in the presence of information exchanges between the length bases. This examination involves the following steps. Firstly, we introduce the kappa-multiplication (⊙κ) acting on the spaces of possible proteins Ω(n) and Ω(m), with base lengths n and m, respectively (Equation (6)), resulting in the kappa-addition ⊕κ of the bases n and m (in a similar way that the κ-addition ⊕κ(n, m) results, with the help of the usual addition +(n, m), in Equation (1)); namely:
Ω(n) ⊙κ Ω(m) ≡ Ω(n) × Ω(m) × Ω(n)^(−m/κ) = |Σ|^(n ⊕κ m).   (16α)
Equation (16α) shows the proposed KFB mutation space of the two possible proteins, with Σ being a molecular alphabet of size |Σ| = N (where N ∈ ℕ). For instance, Equation (16α) is the standard morphism from (ℕ, ⊕κ) to (ℝ*, ⊙κ). The morphism (16α) is defined with the help of the morphisms Ω(n) × Ω(m) = Ω(n + m) and Ω(n)^(−m/κ) from (ℕ, +) to (ℝ*, ×).
We note that the standard morphism (16α) brings the standard morphism Ω(n) × Ω(m) = Ω(n + m) as a particular case, ⊙κ ≡ × and ⊕κ ≡ +, for κ → ∞. Through reciprocity, the logarithm function is the standard morphism from (ℝ*, ⊙κ) to (ℕ, ⊕κ); namely:
log_|Σ|(Ω(n) ⊙κ Ω(m)) = log_|Σ| Ω(n) ⊕κ log_|Σ| Ω(m) = n ⊕κ m,   (16β)
which brings the standard morphism log_|Σ|(Ω(n) × Ω(m)) = log_|Σ| Ω(n) + log_|Σ| Ω(m) from (ℝ*, ×) to (ℕ, +) as a particular case, ⊙κ ≡ × and ⊕κ ≡ +, for κ → ∞. We impose the condition:
n ⊕κ m = n + m − nm/κ ≥ 0,   (nm no multiples of three, and κ ∈ ℝ*).   (17)
This implies the following restriction to the values of the κ-parameter:
1/κ ≤ (n + m)/(nm),   κ ∈ ℝ*.   (18)
Using Equation (4), we obtain the following restriction to the values of the correlation coefficient R :
R = −dκ/(2κ) ≥ −dκ(n + m)/(2nm),   (19)
for sub-bases with dκ degrees of freedom ([71,72,73]; [74], Chapter 5). The largest value of kappa, κ → ∞, corresponds to a DNA or biological polymer without any correlation between the sub-bases of information lengths n and m, given by Equation (5). The space of the two possible proteins (Equation (16α)) then becomes:
Ω(n) ⊙∞ Ω(m) = Ω(n) × Ω(m) × 1 = |Σ|^n |Σ|^m = |Σ|^(n + m),   (n, m ∈ (ℕ, ⊕κ), κ ∈ ℝ*).   (20)
This is the frameshift mutation space of the two possible proteins. We may write Equation (16α) as follows:
Ω(n) ⊙κ Ω(m) = Ω(n) × Ω(m) × Ω^(correl)(n, m) = |Σ|^(n ⊕κ m),   (n, m ∈ (ℕ, ⊕κ), κ ∈ ℝ*).   (21)
In Equation (21), the correlation space of the two possible proteins, Ω ( n ) and Ω ( m ) (with a length base n and m , respectively) due to information exchanges is given by:
Ω^(correl)(n, m) = Ω(n)^(−m/κ) = |Σ|^(−nm/κ),   (n, m ∈ (ℕ, ⊕κ), κ ∈ ℝ*).   (22)
The κ-product (16α) is commutative (Ω(n) ⊙κ Ω(m) = Ω(m) ⊙κ Ω(n)) and associative (Ω(n) ⊙κ (Ω(m) ⊙κ Ω(z)) = (Ω(n) ⊙κ Ω(m)) ⊙κ Ω(z)). The number one is the neutral element of the κ-product (Ω(n) ⊙κ 1 = Ω(n)), and this permits us to define the inverse multiplicative, 1 ⊘κ Ω(n), by means of Ω(n) ⊙κ (1 ⊘κ Ω(n)) ≡ 1.
Similarly, the space of the two possible proteins, Ω ( n ) and Ω ( m ) in the case of the deletion of any number of bases m (except multiples of three) from the initial DNA sequence of base n is:
Ω(n) ⊙κ Ω(m̄) = |Σ|^((n − m)/(1 − m/κ)) = |Σ|^(n ⊕κ m̄),   (n, m ∈ (ℕ, ⊕κ), κ ∈ ℝ*).   (23)
Using Equations (13)–(15), the kappa-gain of information (a decrease in kappa-entropy: negative Δ S κ s y s t e m ) of the system (DNA or biological polymer) during the proposed KFB mutations is given by:
ΔS_κ,system(n, m) = −ln(|Σ|^(n ⊕κ m)) = −(n ⊕κ m) ln|Σ| = −(n + m − nm/κ) ln|Σ| = −[S(n) + S(m) − ΔS_Def(n, m)].   (24)
ΔS_κ,system(n, m) is the entropy change of a composite system (DNA or biological polymer) with correlations between n and m, where:
S(n) = n ln|Σ|,   (25)

S(m) = m ln|Σ|,   (26)

ΔS_Def(n, m) = (nm/κ) ln|Σ|.   (27)
Equation (27) is the proposed entropy defect for the DNA or biological polymer.
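A quick numerical check, under the decompositions of Equations (24)–(27), that the kappa-composed entropy splits into S(n) + S(m) minus the entropy defect; the function names are illustrative:

```python
# Check of the entropy decomposition (Equations (24)-(27)): the
# kappa-composed entropy equals S(n) + S(m) minus the entropy defect.
import math

def S(x, N):
    """Constituent entropy, S(x) = x ln|Sigma|  (Equations (25)-(26))."""
    return x * math.log(N)

def S_defect(n, m, kappa, N):
    """Entropy defect, (n*m/kappa) ln|Sigma|  (Equation (27))."""
    return (n * m / kappa) * math.log(N)

def kappa_add(n, m, kappa):
    """n (+)_kappa m = n + m - n*m/kappa  (Equation (1))."""
    return n + m - n * m / kappa

n, m, kappa, N = 10.0, 4.0, 100.0, 4
lhs = kappa_add(n, m, kappa) * math.log(N)            # (n (+)_k m) ln|Sigma|
rhs = S(n, N) + S(m, N) - S_defect(n, m, kappa, N)    # decomposition
assert abs(lhs - rhs) < 1e-9
```

In the limit κ → ∞ the defect term vanishes and the composed entropy reduces to the additive BG form, as stated in the text.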
The characteristic non-additive rule of the kappa entropy [1] was first derived on a physical basis through the concept of the entropy defect [1]. Then, it was demonstrated that the entropy defect can lead to the generalized, kappa-associated entropy [1]. Here, we explain the origin of entropy defects based on the interdependence of information-length systems (or the interconnectedness of these systems: the condition whereby systems with a significant number of constituents, the information-length bases, depend on, and/or are connected to, each other). The proposed KFB mutation quantifies the correlations among the information lengths of the bases (DNA or polymer constituents) due to information exchanges. Interdependence induces order and reduces the entropy-information in DNA or biological polymer systems, thus producing entropy-information defects. Interdependence is measured through the magnitude 1/κ of the entropy-information defect, defined as:
1/κ ∝ 1 − ΔS_κ,system(n, m)/ΔS_system(n, m) = ΔS_Def(n, m)/ΔS_system(n, m),   (28)
where Δ S κ s y s t e m ( n , m ) (see Equation (24)) is the entropy-information of the system with correlations among its constituents (e.g., information length bases), while
ΔS_system(n, m) = −(n + m) ln|Σ|,   for ΔS_Def(n, m) → 0, hence κ → ∞,   (29)
denotes the entropy-information of this system if there were no correlations among the information-length bases. The smallest possible kappa value, κ_min = dκ/2, corresponds to the state furthest from the ordinary frameshift mutation of a DNA or biological polymer, characterized by the highest correlations between bases due to information exchange. The parameter κ provides a measure of the loss (κ > 0) or gain (κ < 0) of information after the mixing of the partial bases n and m to compose the combined final DNA or biological polymer.

4. Properties of the Information Defect in DNA or Biological Polymers

The information defect determines how the information of the DNA or biological polymer partitions into the entropy information of its constituents. The participation of each constituent’s entropy-information in the information defect must be (1) separable, (2) symmetric, and (3) (upper- or lower-) bounded for information loss ( κ > 0 ) or gain ( κ < 0 ). Below, we analyze the physical meaning and mathematical formulation of the three properties that constitute the foundations of information defect:
(1) Separable. The entropy-information defect relates to the constituent’s entropies (informations) through a separable function:
S_Def(n, m) ∝ S_Def(n, m = const.) × S_Def(n = const., m),   (30)
(for details see Ref [1])
(2) Symmetric. The information of the total system is symmetric to the n and m bases: S s y s ( n , m ) = S s y s ( m , n ) . The same holds for the information defect:
S_Def(n, m) = S_Def(m, n),   (31)
(for details see Ref [1])
(3) Bounded. The entropic information has an upper bound (S_max) or a lower bound (S_min) for information loss (κ > 0) or gain (κ < 0), respectively.
We note that the bound tends to infinity in the classical thermal equilibrium: S_max → ∞ as κ → ∞ for information loss (κ > 0), or S_min → −∞ for information gain (κ < 0). This indicates that, in the classical case, entropy changes can increase or decrease without limit, eventually attaining any possible positive or negative value for information loss (κ > 0) or gain (κ < 0), respectively [1].

5. Kappa Metabolic Rate in the Presence of Information Defects

Landauer’s bound [76] gives us the minimal energy required to perform an abstract computation and many string-writing operations, including the copying and transformation of DNA or biological polymers [77]. We may ask if the available energy is sufficient for the string copying and processing of a very small amount of stored information. The fundamental bounds of information can be connected to metabolism via Landauer’s bound by calculating the minimal metabolic rate, W, needed to replicate genetic information for a given genome length and cellular growth rate [3,76,78]. In the presence of entropy defects, Landauer’s bound is modified by the KFB mutations, as follows:
Q_κ ≥ k_B T (S_κI − S_κF),   (32)
where Q κ is the kappa heat released to a bath at temperature T , k B is Boltzmann’s constant, S κ I and S κ F are the initial and final kappa system entropy, respectively. This is nothing more than the expression of the second law of thermodynamics (Equations (13–15)), where Δ S κ s y s t e m = S κ I S κ F . To write a specific string from a set of unordered letters ( n = 0 ), we have S κ F = 0 and
S_κI(n, m) = ln(|Σ|^(n ⊕κ m)) = S_I(n) + S_I(m) − ΔS_I,Def(n, m).   (33)
In Equation (33), the terms S_I(n), S_I(m) and ΔS_I,Def(n, m) are calculated from Equations (25)–(27); |Σ| = N is the number of unique elements, or letters, in the informational system (Equation (6)); n is the length of information (the string length) that is being copied; and m is the length of information (the string length) of the KFB mutation. Here, n ⊕κ m is the kappa-composed length of information (the composed string length) after the action of the KFB mutation (Equation (1)). Using Equations (25)–(27) and (33), the kappa-modified Landauer’s bound becomes:
Q_κ ≥ Q − ΔQ_Def,   (34)
where
Q = k_B T S_I(n, m) = k_B T (n + m) ln|Σ|   (35)
is the heat released to a bath at temperature T independent of the kappa index, and
ΔQ_Def = k_B T ΔS_I,Def(n, m) = k_B T (nm/κ) ln|Σ|   (36)
is the heat released to a bath at temperature T dependent on the kappa index and corresponding to the entropy-information defects of the DNA or biological polymer. The term nm/κ in Equation (36) is the correlation between the information lengths n and m due to information exchange after the action of the KFB mutation.
We can convert the kappa-modified Landauer’s bound into a kappa-modified metabolic rate W_κ, due to the presence of entropy defects, by considering how fast the information is copied. Given a division time t_div (in seconds), the kappa-modified minimal metabolic rate of copying is:
W_κ = W − ΔW_Def = k_B T (n ⊕κ m) ln|Σ| / t_div = (k_B T ln|Σ| / t_div)(n + m − nm/κ),   (37)
where
W = k_B T (n + m) ln|Σ| / t_div   (38)
is the metabolic rate independent of the kappa index, and
ΔW_Def = k_B T nm ln|Σ| / (κ t_div)   (39)
is the metabolic rate dependent on the kappa index and corresponding to the entropy-information defects of the DNA or biological polymer. Again, in the limit of Equation (29), Equations (34) and (37) become:
Q_system(n, m) ≥ k_B T (n + m) ln|Σ|,   for ΔQ_Def(n, m) → 0, hence κ → ∞,   (40)

W_system(n, m) = k_B T (n + m) ln|Σ| / t_div,   for ΔW_Def(n, m) → 0, hence κ → ∞.   (41)
Equation (40) denotes Landauer’s bound, whereas Equation (41) shows the minimal metabolic rate for a biological system if there are absolutely no correlations among the information-length bases.
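As an order-of-magnitude sketch of Equations (37)–(39), with illustrative (not paper-supplied) values for n, m, κ, T, and t_div:

```python
# Order-of-magnitude sketch of the kappa-modified minimal metabolic rate
# (Equation (37)). All numerical values below are illustrative choices,
# not taken from the paper.
import math

kB = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0              # bath temperature, K
N = 4                  # DNA alphabet size |Sigma|
n, m = 1.0e6, 10.0     # bacterial-scale genome plus a small insertion
kappa = 1.0e7          # illustrative kappa index
t_div = 2400.0         # division time, s (~40 minutes)

def kappa_add(n, m, kappa):
    """n (+)_kappa m = n + m - n*m/kappa  (Equation (1))."""
    return n + m - n * m / kappa

W = kB * T * (n + m) * math.log(N) / t_div              # Equation (38)
dW = kB * T * (n * m / kappa) * math.log(N) / t_div     # Equation (39)
W_kappa = kB * T * kappa_add(n, m, kappa) * math.log(N) / t_div

# The defect lowers the minimal metabolic rate: W_kappa = W - dW.
assert abs(W_kappa - (W - dW)) < 1e-25
assert W_kappa <= W
```

For these illustrative numbers the bound is of order 10⁻¹⁸ W, showing how small the Landauer cost of copying is compared to actual cellular power budgets.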

6. Kappa Distribution of DNA Length Bases or Biological Polymers

Starting from the connection of the probability distribution p with the (κ → ∞)-entropy, and following Gibbs's path, we maximize the BG entropy S = −Σ p ln p under the constraint of normalization Σ p(Ω) = 1, and derive the Maxwell–Boltzmann (MB) probability distribution p(Ω) = |Σ|^(−n). In the latter, Ω(n) = |Σ|^n and N = |Σ| is the number of letters in the molecular alphabet. These steps can be reversed: the maximization of an entropy S = Σ f(p) leads to the distribution f′(p) = −n + const.; compared with the MB distribution, we find f′(p) = ln(1/p) + const., leading to f(p) = −p ln p (taking also into account that, in the case of one single possibility, i.e., p = 1, we have S = 0); that is, the BG entropy. From this we derive the connection of the entropy with the (κ → ∞)-entropic addition rule. Starting from the BG entropy applied to the two DNA systems, n and m, and their composite system, n + m, and noting the respective distributions p and entropies S(n) and S(m), we apply the entropic equation, S = −Σ p ln p, and the property of statistical independence:
p(n + m) = p(n) p(m),   or   ln p(n + m) = ln p(n) + ln p(m).   (42)
The latter is deduced from the exponential distribution function:
p(n + m) = 4^(−(n + m)) = 4^(−n) × 4^(−m) = p(n) p(m),   (43)
that maximizes this entropy and the energy summation of the DNA length bases n + m . Then, we obtain
S(n + m) = −Σ p(n) ln p(n) − Σ p(m) ln p(m) = S(n) + S(m).   (44)

That is, the two systems, n and m, are independent, a characteristic of the MB exponential distribution function. The associated entropy of each of these systems is given by the BG formulation.
Connection of the probability distribution with the (κ < ∞)-entropy
Following Gibbs's path, we maximize the kappa entropy

S_κ(p) = ⟨ln_κ(1/p)⟩ = κ[1 − Σ_Ω p(Ω)^(1 + 1/κ)] = κ[1 − Σ_Ω p_κ(Ω)]   (45)
under the constraint of normalization Σ p(Ω) = 1, and derive the kappa-probability distribution:
p_κ(Ω) = (|Σ|^(−n))^(1 + 1/κ) = |Σ|^(−n(1 + 1/κ)) = p(Ω)^(1 + 1/κ),   (46)
where p(Ω) = |Σ|^(−n), Ω(n) = |Σ|^n, and
Ω_κ(n) = (|Σ|^n)^(1 + 1/κ) = |Σ|^(n(1 + 1/κ)) = Ω(n)^(1 + 1/κ).   (47)
The expectation values are determined through the probability p^(1 + 1/κ)(n)/Σ p^(1 + 1/κ)(n), called the "escort" probability [4,48]. In Equation (45), S = 0 for p = 1. This is the kappa-related entropy, also named after Havrda, Charvát, Daróczy and Tsallis [79,80,81]. We now use the κ-deformed exponential function and its inverse, the κ-deformed logarithm function, to get:
exp_κ(x) ≡ (1 − x/κ)^(−κ),   ln_κ(x) ≡ κ(1 − x^(−1/κ)),   (48)
with the properties of the κ-logarithm and the κ-exponential in a compact form:
exp_κ(ln_κ(x)) = ln_κ(exp_κ(x)) = x,   (49α)

exp_κ(x ⊕κ y) = exp_κ(x) exp_κ(y),   (49β)

exp_κ(x + y) = exp_κ(x) ⊗κ exp_κ(y),   (49γ)

ln_κ(x ⊗κ y) = ln_κ(x) + ln_κ(y),   (49δ)

ln_κ(x y) = ln_κ(x) ⊕κ ln_κ(y).   (49ε)
This leads us to the definition of the κ-product (⊗κ) between two real numbers x, y:
x ⊗κ y ≡ exp_κ(ln_κ(x) + ln_κ(y)).   (49ς)
The product (49ς) is commutative (x ⊗κ y = y ⊗κ x) and associative (x ⊗κ (y ⊗κ z) = (x ⊗κ y) ⊗κ z), provided x ⊗κ y and y ⊗κ z differ from zero and from infinity. The number one is the neutral element of the κ-product (49ς) (x ⊗κ 1 = x), and this permits us to define the inverse multiplicative (⊘κ), 1 ⊘κ x, by means of x ⊗κ (1 ⊘κ x) ≡ 1 [82].
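Taking the reconstructed definitions of Equation (48), the inverse property (49α) and the morphism property (49δ) can be verified numerically; exp_k, ln_k, and kprod are illustrative names, not from the paper:

```python
# Check of the kappa-deformed functions (Equation (48), as reconstructed:
# exp_k(x) = (1 - x/k)^(-k), ln_k(x) = k(1 - x^(-1/k))) and of the
# kappa-product (Equation (49s)); function names are illustrative.

def exp_k(x, k):
    """kappa-deformed exponential (requires x < k)."""
    return (1.0 - x / k) ** (-k)

def ln_k(x, k):
    """kappa-deformed logarithm (requires x > 0)."""
    return k * (1.0 - x ** (-1.0 / k))

def kprod(x, y, k):
    """x (x)_k y = exp_k(ln_k(x) + ln_k(y))  (Equation (49s))."""
    return exp_k(ln_k(x, k) + ln_k(y, k), k)

k = 5.0
x, y = 2.0, 3.0

# (49a): ln_k and exp_k are mutually inverse.
assert abs(ln_k(exp_k(0.7, k), k) - 0.7) < 1e-12

# (49d): ln_k turns the kappa-product into an ordinary sum.
assert abs(ln_k(kprod(x, y, k), k) - (ln_k(x, k) + ln_k(y, k))) < 1e-9
```

For large κ both deformed functions approach their ordinary counterparts, recovering the standard exponential/logarithm morphisms.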
Using the κ-logarithm and κ-exponential functions, the two standard morphisms of Equations (16α–β) split into four. Let us denote by D_κ the definition set of the generalized exponential function exp_κ(x) [82]. From Equation (49β), it follows that the generalized exponential function is a morphism from (D_κ, ⊕κ) to (ℝ₊*, ×). But from Equation (49γ), we note that this function is also a morphism from (D_κ, +) to (ℝ₊*, ⊗κ). From Equation (49δ), it follows that the generalized logarithm function is a morphism from (ℝ₊*, ⊗κ) to (D_κ, +). But from Equation (49ε), we note that this function is also a morphism from (ℝ₊*, ×) to (D_κ, ⊕κ) [82]. Here, we use the kappa index as a subscript to denote the deformed functions, but note that the q-index could have been used as well (see also Ref [73], Appendix A, for more details). We also recall the known notation (49α) of the deformed exponential/logarithm functions:
$$\Omega(n) = \exp_\kappa\!\left(S_\kappa(n)\right) = \exp\!\left(S(n)\right),$$
$$S_\kappa(n) = \ln_\kappa\!\left(\exp\!\left(S(n)\right)\right), \qquad S(n) = \ln\!\left(\exp_\kappa\!\left(S_\kappa(n)\right)\right),$$
which constitutes the multiplicity $\Omega(n)$. This multiplicity (or statistical weight) counts the possible molecular configurations within that biological system. We observe that the multiplicity $\Omega(n) = \exp_\kappa(S_\kappa(n))$ becomes a thermodynamic quantity independent of the kappa index. In particular, given the probability distribution $p$, the statistical definition of this entropy is formulated by:
$$S_\kappa[p] = \kappa\left(1 - \sum_\Omega p^{1+\frac{1}{\kappa}}(\Omega)\right) = \frac{1 - \sum_\Omega p^{q}(\Omega)}{q - 1},$$
expressed in terms of the kappa κ-parameter (mostly used by the space-science community) or, equivalently, the q-index (mostly used by the community of nonextensive statistical mechanics [74]):
$$q = 1 + \frac{1}{\kappa} \quad \Longleftrightarrow \quad \kappa = \frac{1}{q-1}.$$
The BG entropy, recovered for $\kappa \to \infty$ or $q \to 1$, is:
$$S = -\sum p \ln p.$$
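A short numerical sketch (ours, not from the source) of the kappa entropy of Equation (45), its q-form, and the BG limit, for the equiprobable distribution over the $\Omega(n) = |\Sigma|^n$ possible sequences:

```python
import math

def S_kappa(p, kappa):
    """Kappa (Havrda-Charvat-Daroczy-Tsallis) entropy, Eq. (45):
    S_kappa = kappa * (1 - sum_i p_i^(1 + 1/kappa))."""
    return kappa * (1.0 - sum(pi ** (1.0 + 1.0 / kappa) for pi in p))

def S_BG(p):
    """Boltzmann-Gibbs entropy, recovered for kappa -> infinity."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

# Equiprobable distribution over the Omega(n) = |Sigma|^n possible sequences:
sigma, n = 4, 3                 # 4 DNA letters, n = 3 bases
omega = sigma ** n
p = [1.0 / omega] * omega

assert abs(S_BG(p) - n * math.log(sigma)) < 1e-12   # S = n ln|Sigma|
assert abs(S_kappa(p, 1e7) - S_BG(p)) < 1e-5        # kappa -> infinity limit

# Equivalent q-form with q = 1 + 1/kappa:
kappa = 5.0
q = 1.0 + 1.0 / kappa
S_q = (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
assert abs(S_q - S_kappa(p, kappa)) < 1e-12
```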
Connection of the (κ < ∞)-entropy with a generalized addition rule
Starting from Equation (45), we calculate the information function
$$1 - \frac{S(n)}{\tilde\kappa} = p(n)^{1+\frac{1}{\kappa}},$$
where
$$\tilde\kappa = \frac{\kappa}{\ln|\Sigma|}$$
is the kappa-index transformation $\kappa \to \tilde\kappa$, and $|\Sigma| = N$ is the number of letters in the molecular alphabet. Applied again to the two independent systems of bases $n$ and $m$ and their kappa-composite system of bases $n \oplus_\kappa m$, we find:
$$\left(p(n \oplus_\kappa m)\right)^{1+\frac{1}{\kappa}} = \left(p(n)\right)^{1+\frac{1}{\kappa}} \left(p(m)\right)^{1+\frac{1}{\kappa}},$$
or
$$1 - \frac{S(n) \oplus_{\tilde\kappa} S(m)}{\tilde\kappa} = \left(1 - \frac{S(n)}{\tilde\kappa}\right)\left(1 - \frac{S(m)}{\tilde\kappa}\right).$$
Equations (57) and (58) lead to the standard form of entropy-information defects for a DNA or biological polymer, as in Ref [1]:
$$S(n) \oplus_{\tilde\kappa} S(m) = S(n) + S(m) - \Delta S_{\mathrm{Def}}(n, m),$$
where
$$S(n) = n \ln|\Sigma|, \qquad S(m) = m \ln|\Sigma|,$$
and
$$\Delta S_{\mathrm{Def}}(n, m) = \frac{S(n)\, S(m)}{\tilde\kappa} = \frac{n\, m}{\tilde\kappa}\, \ln^2|\Sigma|,$$
Then, after the kappa-index transformation κ κ ˜ :
$$\tilde\kappa = \frac{\kappa}{\ln|\Sigma|} \quad \Longleftrightarrow \quad \kappa = \tilde\kappa \ln|\Sigma|,$$
in Equation (61), the entropy defects in the standard form of Ref. [1] take the form of the proposed entropy defects (Equation (27)). Similarly, by applying the kappa-index transformation κ̃ → κ in Equation (27), the entropy defects take the standard form (Equation (61)), as in Ref. [1]. The reverse process starts from this generalized addition rule and seeks the entropic function that obeys it; this has already been shown in several earlier publications (e.g., [1]).
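As a sanity check (our sketch; the numerical values are illustrative), the generalized addition rule with the defect of Equation (61), using $S(n) = n \ln|\Sigma|$ and the transformation $\tilde\kappa = \kappa/\ln|\Sigma|$:

```python
import math

SIGMA = 4                     # DNA alphabet size |Sigma|
LOG_S = math.log(SIGMA)

def S(n):
    """Entropy-information of an n-base sequence, Eq. (60): S(n) = n ln|Sigma|."""
    return n * LOG_S

def S_composite(n, m, kappa_t):
    """Generalized addition, Eq. (59): S(n) + S(m) minus the entropy defect."""
    return S(n) + S(m) - S(n) * S(m) / kappa_t

kappa = 1.0e4
kappa_t = kappa / LOG_S       # kappa-index transformation, Eq. (62)

n, m = 100, 3
defect = S(n) * S(m) / kappa_t
# Eq. (61): the defect equals (n*m/kappa_tilde) * ln^2|Sigma|
assert abs(defect - (n * m / kappa_t) * LOG_S ** 2) < 1e-12
# Interdependence reduces the composite entropy below simple additivity:
assert S_composite(n, m, kappa_t) < S(n) + S(m)
# In the BG limit (kappa -> infinity) the defect vanishes and additivity is restored:
assert abs(S_composite(n, m, 1e15) - (S(n) + S(m))) < 1e-9
```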

7. Discussion

Following Ref. [3], the required metabolic rate for information copying alone is calculated from Equation (6), with the key parameters being the number of letters in the molecular alphabet $N = |\Sigma|$, the length of the composite genome $(n \oplus_\kappa m)$ after the kappa-frameshift mutation (with $n$ and $m$ being the length of the genome and of the mutation, respectively), and the time to copy the information ($t_{\mathrm{dev}}$). For $N = |\Sigma| = 2^2 = 4$ letters, a typical division time $t_{\mathrm{div}} = 10^3\,\mathrm{s}$, and a Typical Genome Length (TGL) of $n \approx 5 \times 10^6$ bases, Equation (6) gives an ordinary Typical Bacterial Metabolic rate (TBM) of $W(\mathrm{TBM}) \approx 10^{-12}\,\mathrm{J/s}$. In the presence of the proposed KFB mutations, this rate is modified as follows:
$$W_\kappa(\mathrm{TBM}) = W(\mathrm{TBM})\left(1 - \frac{m}{\kappa}\right) + W_{\mathrm{KFB}},$$
where
$$W_{\mathrm{KFB}} = k_B T\, \frac{m}{t_{\mathrm{dev}}}\, \ln|\Sigma|.$$
The smallest possible kappa value, $\kappa_{\min} = d_\kappa/2$ ($d_\kappa$: degrees of freedom; [71,72]; [74], Chapter 5), corresponds to the state furthest from the ordinary frameshift mutation of a DNA or biological polymer, characterized by the highest correlation between bases. Using the $\kappa_{\min}$ value, Equation (63) becomes:
$$W_\kappa(\mathrm{TBM}) = W(\mathrm{TBM})\left(1 - \frac{2m}{d_\kappa}\right) + W_{\mathrm{KFB}}.$$
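The scales involved can be illustrated numerically (our sketch; $T = 300\,\mathrm{K}$ and $m = 3$ bases are assumed values, and $W(\mathrm{TBM})$ is the quoted $10^{-12}\,\mathrm{J/s}$):

```python
import math

kB = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0              # assumed temperature, K
SIGMA = 4              # DNA alphabet size
t_dev = 1.0e3          # copying time, s
W_TBM = 1.0e-12        # quoted typical bacterial metabolic rate, J/s

def W_KFB(m):
    """Landauer-type cost of copying the m extra mutation bases, Eq. (64)."""
    return kB * T * (m / t_dev) * math.log(SIGMA)

def W_kappa(m, kappa):
    """KFB-modified metabolic rate, Eq. (63)."""
    return W_TBM * (1.0 - m / kappa) + W_KFB(m)

m, d_kappa = 3, 10.0
# Eq. (65) is Eq. (63) evaluated at kappa_min = d_kappa / 2:
assert abs(W_kappa(m, d_kappa / 2.0)
           - (W_TBM * (1.0 - 2.0 * m / d_kappa) + W_KFB(m))) < 1e-27
# The extra copying cost of a few mutated bases is tiny compared with W(TBM):
assert W_KFB(m) < 1e-6 * W_TBM
# In the BG limit (kappa -> infinity) the ordinary rate plus the KFB term is recovered:
assert abs(W_kappa(m, 1e12) - (W_TBM + W_KFB(m))) < 1e-20
```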
In the MB probability distribution $p(\Omega) = |\Sigma|^{-n}$ for a DNA or biological polymer, the entropy is zero, $S(\Omega) = 0$, only when a specific information string is written from a set of unordered letters ($n = 0$). The probability value for such an unordered information string is $p(\Omega) = 1$. Furthermore, in MB statistical mechanics, for a specific information string from a set of ordered letters ($n > 0$) there is no preference for a specific sequence: $\Omega(n) = 4^n$ different DNA sequences are equally possible, and the actual DNA sequence has only the probability $p(\Omega) = 4^{-n}$ (where $n$ is the number of bases (ordered letters) in the DNA sequence, ranging from one million in bacteria to billions in many animals and plants). For example, the space of possible proteins $\Omega(n)$ with a base length of $n = 1000$ amino acids is $\Omega(1000) = 20^{1000}$, a space so large that it could never be explored in our universe [27]. The space of possible molecular configurations within an organism is astronomically larger. The probability $p$ for such a protein (ordered information string) to exist in nature is $p(\Omega) = 20^{-1000}$. Longer times and larger numbers of individuals are even more conducive to the observation of deterministic laws associated with the base length $n$. The evolutionary fitness of organisms relates to the progressive reduction of the entropy production rate. This, together with Boltzmann's ergodic principle, allows us to consider organisms as evolving chemical entities (Sabater 2006, 2009 [2,4]). Accordingly, the time-average state of a single individual over a long period (evolutionary time) coincides with the average state of a sufficiently large number of individuals over shorter time periods [5]. We note that Sabater (2009 [5]) worked with an alternative theory with biological time arrows, showing that deterministic behavior of a biological system can arise.
Let us now examine the evolutionary events that could result from the combination of functional selection during biological evolution and information exchanges between the base lengths of biological polymers, represented by the kappa parameter. For a DNA or biological polymer with the kappa-probability distribution (Equation (46)) derived from the maximized kappa entropy (Equation (45)) under the normalization constraint $\sum_n p_\kappa(n) = 1$, we observe that the different $n$ and $\kappa$ scales in Equations (45) and (46) are manifested in the double evolutionary emergence of the proposed biological polymers through correlations between their bases. For specific values of the kappa parameter, deterministic laws associated with a single biological polymer can be expected for short timescales. Thus, for the kappa-probability distribution, we have the following short-timescale possibilities for an individual:
$$p_\kappa(\Omega) = p^{1+\frac{1}{\kappa}}(\Omega) = |\Sigma|^{-n\left(1+\frac{1}{\kappa}\right)} = \begin{cases} 1, & \kappa = -1,\ n \in \mathbb{N}^*, \\ |\Sigma|^{-2n} < 1, & \kappa = 1,\ n \in \mathbb{N}^*. \end{cases}$$
Equation (66) measures the loss (for $\kappa = 1$) or gain (for $\kappa = -1$) of information. In Equation (66), for $p(\Omega) = |\Sigma|^{-n} \neq 1$ and non-zero integer $n > 0$ (ordered letters), it is $p_{\kappa=-1}(\Omega) = 1$. Therefore, for the kappa-entropy, we find the following short-timescale possibilities for a single biological polymer:
$$S_\kappa(\Omega) = \kappa\left(1 - p^{1+\frac{1}{\kappa}}(\Omega)\right) = \begin{cases} 0, & \kappa = -1,\ n \in \mathbb{N}^*, \\ 0 < 1 - p^{2}(\Omega) \leq 1, & \kappa = 1,\ n \in \mathbb{N}^*. \end{cases}$$
In Equation (67), for $p(\Omega) = |\Sigma|^{-n} \neq 1$, $p_{\kappa=-1}(\Omega) = 1$, and non-zero integer $n > 0$ (ordered letters), it is $S_{\kappa=-1}(\Omega) = 0$.
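These two branches can be checked directly (our sketch, with $|\Sigma| = 4$ assumed):

```python
SIGMA = 4.0

def p(n):
    """Probability of one specific n-base sequence: |Sigma|^(-n)."""
    return SIGMA ** (-n)

def p_escort(n, kappa):
    """Escort probability of Eq. (66): p_kappa = p^(1 + 1/kappa)."""
    return p(n) ** (1.0 + 1.0 / kappa)

def S_k(n, kappa):
    """Kappa entropy of the specific string, Eq. (67): kappa * (1 - p_kappa)."""
    return kappa * (1.0 - p_escort(n, kappa))

for n in (1, 5, 50):
    # kappa = -1: the exponent 1 + 1/kappa vanishes, so p_kappa = 1 and S_kappa = 0
    assert p_escort(n, -1.0) == 1.0
    assert S_k(n, -1.0) == 0.0
    # kappa = +1: p_kappa = p^2 = |Sigma|^(-2n) < 1 and 0 < S_kappa <= 1
    assert abs(p_escort(n, 1.0) - SIGMA ** (-2 * n)) < 1e-18
    assert 0.0 < S_k(n, 1.0) <= 1.0
```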
Let us consider the example of the state characterized by $\kappa = -1$, that is, the strongest correlation between bases due to information exchange. The space of possible final proteins $\Omega_{\kappa=-1}^{(F)}(n)$ with a base length of $n = 1000$ amino acids is:
$$\Omega_{\kappa=-1}^{(F)}(1000) = \Omega_{\kappa=\infty}^{(I)}(1000)\; p_{\kappa=\infty}^{(I)}(1000) = p_{\kappa=-1}^{(F)}(1000) = 1.$$
The final protein entropy is $S_{\kappa=-1}^{(F)}(1000) = 0$ bit. From this, we obtain an entropy change (gain of information) of the following value:
$$\Delta S_\kappa^{\mathrm{system}}(1000) = -S_{\kappa=\infty}^{(I)}(1000) = -1301.029~\mathrm{bit},$$
with $\Delta n = n_F - n_I = 0$ for equal final and initial amino-acid base lengths ($n_F = n_I = 1000$). We call these proposed changes iso-order $n$ changes. These changes are characterized by the parameters:
$$\left(n_I, \kappa_I\right),\ \left(n_F, \kappa_F\right) = \left(1000, \infty\right),\ \left(1000, -1\right).$$
The space of possible final proteins $\Omega_{\kappa=-1}^{(F)}(n)$ with a base length of $n = 1000$ amino acids is 1. An initial space of a magnitude as large as $\Omega_{\kappa=\infty}^{(I)}(1000) = 20^{1000}$ has thus collapsed to the one final possibility: $\Omega_{\kappa=-1}^{(F)}(1000) = 1$. The probability $p$ for such a protein (ordered information string) final state to exist in nature is now one: $p_{\kappa=-1}^{(F)}(1000) = 1$.
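The scale of this collapse is easy to verify numerically (our sketch; note that the quoted value 1301.029 equals $\log_{10} 20^{1000}$, the number of decimal digits of the initial space, while the same quantity expressed in bits is $\approx 4321.9$):

```python
import math

N_AA = 20      # amino-acid alphabet size
n = 1000       # protein base length

# Initial, uncorrelated (kappa -> infinity) configuration space: 20^1000
log10_omega_I = n * math.log10(N_AA)   # base-10 logarithm of 20^1000
log2_omega_I = n * math.log2(N_AA)     # the same content expressed in bits

assert abs(log10_omega_I - 1301.03) < 0.01
assert abs(log2_omega_I - 4321.93) < 0.01

# At kappa = -1 (strongest correlation) the space collapses to a single state,
# whose entropy is zero:
omega_F = 1
assert math.log2(omega_F) == 0.0
```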
Let us now consider another example for the ($\kappa = -1$) state characterized by the highest correlation between bases due to information exchange. Writing a specific initial information string from a set of unordered letters ($n = 0$, $\kappa \in \mathbb{R}^* \setminus \{-1\}$), we have:
$$S_\kappa^{(I)}(n) = n\left(1 + \frac{1}{\kappa}\right)\ln(20) = 0~\mathrm{bit} \quad \text{(if and only if } n = 0,\ \kappa \in \mathbb{R}^* \setminus \{-1\}\text{)}.$$
We then obtain the entropy changes (information gain):
$$\Delta S^{\mathrm{system}}(n) = S_{\kappa=-1}^{(F)}(1000) - S_{\kappa}^{(I)}(0) = 0~\mathrm{bit},$$
where $S_{\kappa=-1}^{(F)}(1000) = 0$ bit is the final entropy of the “mathematical” protein. Equation (72) describes the iso-entropic process for a given “mathematical” protein, which is adiabatic and reversible [83,84,85,86], with equal initial and final probabilities (that is, $p_\kappa^{(I)}(0) = p_{\kappa=-1}^{(F)}(1000) = 1$).
Furthermore, we have the order-$n$ changes $\Delta n = n_F - n_I = n_F = 1000$. It seems that order $n_F$ emerges spontaneously from the unordered initial information string. This emergence of order $n_F$ may relate to the quantum self-organization process (see Refs [33,87,88,89,90] for details).
Using Equations (37) and (41), the iso-entropic changes for the proposed “mathematical” protein yield zero kappa-metabolic rate. This implies that, at this optimum state ($\kappa = -1$), to protect itself from damage it cannot endure, the “mathematical” protein must not waste energy (produce entropy) in reactions. We call this state of the “mathematical” protein the kappa-functional optimum state; it is given by Equation (72). In nature, it may be possible for a real protein to approximate the kappa-functional optimum state with increasing correlation between its parts:
$$\left(n_I, \kappa_I\right),\ \left(n_F, \kappa_F\right) \approx \left(0, \kappa\right),\ \left(1000, -1\right).$$
Due to the strongest correlation between its parts, such a protein would need to waste much less energy (produce less entropy) in reactions and would be less susceptible to damage than classical information theory predicts.

8. Conclusion

Following Livadiotis G. and McComas D. J. (2023) [1], we propose a new type of DNA frameshift mutation: Kappa-Frameshift Background (KFB) mutations. KFB mutations occur spontaneously due to information exchange between the DNA sequence of base length $n$ and the mutation sequence of base length $m$, and respect the kappa-addition symbol $\oplus_\kappa$. KFB mutations occur when any number of $m$ bases is added to or deleted from the initial DNA sequence of $n$ bases; these changes respect the kappa-addition symbol $\oplus_\kappa$, except in the case of multiples of three, which reestablish the initial reading frame. In the proposed KFB mutation, the kappa index can be understood through the subsystem (sub-base) correlation between the information lengths $n$ and $m$ due to information exchange within the given length of DNA or biological polymer (the string length) that is being copied.
We find that entropy defects originate in the interdependence of the information-length systems (or their interconnectedness: the state in which systems with a significant number of constituents (information-length bases) depend on, or are connected with, each other) by the proposed KFB mutations. In general, interdependence induces order in the systems involved (DNA or biological polymers) and reduces their entropy-information, thus producing an entropy-information defect.
In the presence of entropy defects, Landauer's bound and the minimal metabolic rate for a biological system are modified. We observe that the different $n$ and $\kappa$ scales in Equations (45) and (46) are manifested in the double evolutionary emergence of the proposed biological polymers due to subsystem correlations. In this process, for specific values of the kappa parameter, we can expect deterministic laws associated with a single biological polymer for short timescales. We find that the possible evolutionary events resulting from the combined effects of functional selection during biological evolution and information exchange between the base lengths of the biological polymer are represented by the kappa parameter. We also identify iso-order $n$ changes characterized by the parameters (70). Using a proposed “mathematical” protein, we find that its iso-entropic changes yield zero kappa-metabolic rate. This implies that, at the optimum state ($\kappa = -1$), to protect itself from damage it cannot endure, the “mathematical” protein must not waste energy (produce entropy) in reactions. We call this state of the “mathematical” protein the kappa-functional optimum state (Equation (72)). In nature, with increasing correlation between its parts, a real protein may approximate the kappa-functional optimum state. Due to the strongest correlation between its parts, such a protein would waste much less energy (produce less entropy) in reactions and would be less susceptible to damage than classical information theory predicts.

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.
Code Availability Statement: This manuscript has no associated code/software.
Declarations
Competing Interests: The authors declare no competing interests.

References

  1. Livadiotis, G.; McComas, D.J. EPL 2023, 144, 21001.
  2. Sabater, B. Entropy Perspectives of Molecular and Evolutionary Biology. Int. J. Mol. Sci. 2022, 23, 4098. [CrossRef]
  3. Solé, R.; Kempes, C.P.; Corominas-Murtra, B.; De Domenico, M.; Kolchinsky, A.; Lachmann, M.; Libby, E.; Saavedra, S.; Smith, E.; Wolpert, D. Fundamental Constraints to the Logic of Living Systems. Interface Focus 2024, 14. [Google Scholar] [CrossRef]
  4. Sabater, B. Are organisms committed to lower their rates of entropy production? Possible relevance to evolution of the Prigogine theorem and the ergodic hypothesis. Biosystems 2006, 83, 10–17. [Google Scholar] [CrossRef] [PubMed]
  5. Sabater, B. Time arrows and determinism in Biology. Biological Theory 2009, 4(2), 174–182. [CrossRef]
  6. Martyushev, L.M. Entropy and entropy production: Old Misconceptions and new breakthroughs. Entropy 2013, 15, 1152–1170. [Google Scholar] [CrossRef]
  7. Baez, J.C.; Pollard, B.S. Relative Entropy in Biological Systems. Entropy 2016, 18, 46. [CrossRef]
  8. Roach, T.N.F. Use and abuse of entropy in Biology: A case for Caliber. Entropy 2020, 22, 1335. [Google Scholar] [CrossRef]
  9. Demetrius, L. Directionality principles in thermodynamics and evolution. Proc. Natl. Acad. Sci. USA 1997, 94, 3491–3498. [Google Scholar] [CrossRef]
  10. Skene, K.R. Life’s a gas: A thermodynamic theory of biological evolution. Entropy 2015, 17, 5522–5548. [Google Scholar] [CrossRef]
  11. Skene, K.R. Thermodynamics, ecology and evolutionary biology: A bridge over troubled water or common ground? Acta Oecol 2017, 85, 116–125. [Google Scholar] [CrossRef]
  12. Doyle, S.R.; Carusela, F.; Guala, S.; Momo, F. A null model for testing thermodynamic optimization in ecological systems. arXiv 2019, arXiv:1111.2019v1.
  13. Dewar, R.C. Maximum entropy production and plant optimization theories. Phil. Trans. R. Soc. B 2010, 365, 1429–1435. [Google Scholar] [CrossRef] [PubMed]
  14. Kondepudi, D.K.De Bari, B.Dixon, J.A. Dissipative structures, organisms and evolution. Entropy 2020, 22, 1305. [CrossRef]
  15. Dobovišek, A.; Županović, P.; Brumen, M.; Bonačić-Lošić, Ž.; Kuić, D.; Juretić, D. Enzyme kinetics and the maximum entropy production principle. Biophys. Chem. 2011, 154, 49–55. [Google Scholar] [CrossRef]
  16. Gaiseanu, F. What is life: An informational model of the living structures. Biochem. Mol. Biol. 2020, 5, 18–28. [Google Scholar] [CrossRef]
  17. Pulselli, R.M. Simoncini, E.Tiezzi, E. Self-organization in dissipative structures: A thermodynamic theory for the emergence of prebiotic cells and their epigenetic evolution. Biosystems 2009, 96, 237–241. [CrossRef]
  18. Annila, A.Kuismanen, E. Natural hierarchy emerges from energy dispersal. Biosystems 2009, 95, 227–233. [CrossRef]
  19. Hui, D.Liao-Fu, L.Lin Hao, L. Entropy production rate changes in lysogeny/lysis switch regulation of bacteriophage Lambda. Commun. Theor. Phys. 2011, 55, 371. [CrossRef]
  20. Trevors, J.T.Saier, M.H., Jr. Thermodynamic perspectives on genetic instructions, the laws of biology and diseased states. Comptes Rendus Biol. 2011, 334, 1–5. [CrossRef]
  21. Tretiakov, K.V. Szleifer, I.Grzybowski, B.A. The rate of energy dissipation determines probabilities of non-equilibrium assemblies. Angew. Chem. Int. Ed. Engl. 2013, 52, 10304–10308. [CrossRef]
  22. Shannon, C.E. A mathematical theory of communication. Bell Syst. Technol. J. 1948, 27, 623–656. [Google Scholar] [CrossRef]
  23. Adami, C. Ofria, C.Collier, T.C. Evolution of biological complexity. Proc. Natl. Acad. Sci. USA 2000, 97, 4463–4468. [CrossRef]
  24. Vingaa, S.Almeida, J.S. Rényi continuous entropy of DNA sequences. J. Theor. Biol. 2004, 231, 377–388. [CrossRef]
  25. Demetrius, L. Thermodynamics and evolution. J. Theor. Biol. 2000, 206, 1–16. [Google Scholar] [CrossRef]
  26. Rakoczy, R. Kordas, M. Story, G.Konopacki, M. The characterization of the residence time distribution in a magnetic mixer by means of the information entropy. Chem. Engin. Sci. 2014, 105, 191–197. [CrossRef]
  27. Dryden, D.T. Thomson, A.R.White, J.H. How much of protein sequence space has been explored by life on Earth? Journal of The Royal Society Interface 2008, 5, 953–956. [CrossRef]
  28. Kauffman. Investigations; Oxford University Press, 2000. [Google Scholar]
  29. Kauffman, S.A.; Roli, A. A third transition in science? Interface Focus 2023, 13, 20220063. [Google Scholar] [CrossRef] [PubMed]
  30. Kauffman, S.A. Prolegomenon to patterns in evolution. Biosystems 2014, 123, 3–8. [Google Scholar] [CrossRef]
  31. Kempes, C.P.Koehl, M.West, G.B. The scales that limit: the physical boundaries of evolution. Frontiers in Ecology and Evolution 2019, 7, 242. [CrossRef]
  32. Kempes, C.P.Krakauer, D.C. The multiple paths to multiple life. Journal of molecular evolution 2021, 89, 415–426. [CrossRef] [PubMed]
  33. Koorambas, E. Particle in a Markov Cube by the Non-Classical Information Entropy Space. Int J Theor Phys 2024, 63, 81. [Google Scholar] [CrossRef]
  34. Aguirre, J.Catalán, P.Cuesta, J.A.Manrubia, S. On the networked architecture of genotype spaces and its critical effects on molecular evolution. Open biology 2018, 8, 180069. [CrossRef]
  35. Manrubia, S. The simple emergence of complex molecular function. Philosophical Transactions of the Royal Society A 2022, 380, 20200422. [Google Scholar] [CrossRef] [PubMed]
  36. Alberch, P. From genes to phenotype: dynamical systems and evolvability. Genetica 1991, 84, 5–11. [Google Scholar] [CrossRef] [PubMed]
  37. Gavrilets, S. Gravner, J. Percolation on the fitness hypercube and the evolution of reproductive isolation. Journal of theoretical biology 1997, 184, 51–64. [CrossRef] [PubMed]
  38. Fontana, W. Modelling ‘evo-devo’with RNA. BioEssays 2002, 24, 1164–1177. [Google Scholar] [CrossRef]
  39. Schultes, E.A. Bartel, D.P. One sequence, two ribozymes: implications for the emergence of new ribozyme folds. Science 2000, 289, 448–452. [CrossRef]
  40. Li, H. Helling, R.Tang, C. Wingreen, N. Emergence of preferred structures in a simple model of protein folding. Science 1996, 273, 666–669. [CrossRef]
  41. Li, H. Tang, C. Wingreen, N. Are protein folds atypical? Proc. Nat. Acad. Sci. USA 1998, 95, 4987–4990. [CrossRef]
  42. Denton, M. The Protein Folds as Platonic Forms: New Support for the Pre-Darwinian Conception of Evolution by Natural Law. J. Theor. Biol. 2002, 219, 325–342. [Google Scholar] [CrossRef]
  43. Banavar, J.R. Maritan, A. Colloquium: Geometrical approach to protein folding: a tube picture. Reviews of Modern Physics 2003, 75, 23. [CrossRef]
  44. Marshall, S.M.; Mathis, C.; Carrick, E.; Keenan, G.; Cooper, G.J.; Graham, H.; Craven, M.; Gromski, P.S.; Moore, D.G.; Walker, S.I.; others. Identifying molecules as biosignatures with assembly theory and mass spectrometry. Nature communications 2021, 12, 3033. [Google Scholar] [CrossRef]
  45. Ahnert, S.E. Johnston, I.G. Fink, T.M. Doye, J.P. Louis, A.A. Self-assembly, modularity, and physical complexity. Physical Review E 2010, 82, 026117. [CrossRef]
  46. Sharma, A.; Czégel, D.; Lachmann, M.; Kempes, C.P.; Walker, S.I.; Cronin, L. Assembly theory explains and quantifies selection and evolution. Nature 2023, 622, 321–328. [Google Scholar] [CrossRef] [PubMed]
  47. Bhagavan, N.V. Medical Biochemistry, United States; Elsevier Inc: New York, 2002; ISBN 978-0-12-095440-7. [Google Scholar]
  48. Shen, Chang-Hui. Diagnostic Molecular Biology, United States (2023), ISBN 978-0-323-91788-9 New York; Elsevier Inc, 2023. [Google Scholar] [CrossRef]
  49. Livadiotis, G.; McComas, D. J. Entropy 2021, 23, 1683. [CrossRef]
  50. Livadiotis, G. Entropy 2014, 16, 4290. [CrossRef]
  51. Beck, C.; Cohen, E. G. D. Superstatistics. Phys. A 2003, 322, 267–275. [Google Scholar] [CrossRef]
  52. Schwadron, N. A.; et al. Superposition of stochastic processes and the resulting particle distributions. Astrophys. J. 2010, 713, 1386–1392. [Google Scholar] [CrossRef]
  53. Hanel, R.; Turner, S.; Gell-Mann, M. Generalized entropies and the transformation group of superstatistics. Proc. Natl. Acad. Sci. U.S.A 2012, 108, 6. [Google Scholar] [CrossRef]
  54. Livadiotis, G.; Assas, L.; Dennis, B.; Elaydi, S.; Kwessi, E. Kappa function as a unifying framework for discrete population modelling. Nat. Res. Mod. 2016, 29, 130–144. [Google Scholar] [CrossRef]
  55. Livadiotis, G. Rankine-Hugoniot shock conditions for space and astrophysical plasmas described by Kappa distributions. Astrophys. J. 2019, 886, 3. [Google Scholar] [CrossRef]
  56. Gravanis, E.; Akylas, E.; Livadiotis, G. Physical meaning of temperature in superstatistics. Europhys. Lett. 2020, 130, 30005. [Google Scholar] [CrossRef]
  57. Zank, G. P.; et al. Particle acceleration at perpendicular shock waves: Model and observations. J. Geophys. Res. 2006, 111, A06108. [Google Scholar] [CrossRef]
  58. Bian, N.; Emslie, G. A.; Stackhouse, D. J.; Kontar, E. P. The formation of kappa-distribution accelerated electron populations in solar flares. Astrophys. J. 2014, 796, 142. [Google Scholar] [CrossRef]
  59. Yoon, P. H. Electron kappa distribution and steady-state Langmuir turbulence. Plasma Phys. 2012, 19, 052301. [Google Scholar] [CrossRef]
  60. Yoon, P. H. Electron kappa distribution and quasi-thermal noise. J. Geophys. Res. 2014, 119, 7074. [Google Scholar] [CrossRef]
  61. Livadiotis, G.; McComas, D. J. Exploring transitions of space plasmas out of equilibrium. Astrophys. J. 2010, 714, 971–987. [Google Scholar] [CrossRef]
  62. Livadiotis, G.; McComas, D. J. Influence of pickup ions on space plasma distributions. Astrophys. J. 2011, 738, 64. [Google Scholar] [CrossRef]
  63. Livadiotis, G.; McComas, D. J. Transport equation of kappa distributions in the heliosphere. Astrophys. J. 2023, to appear. [Google Scholar]
  64. Fisk, L. A.; Gloeckler, G. The case for a common spectrum of particles accelerated in the heliosphere: Observations and theory. J. Geophys. Res. 2014, 119, 8733–8749. [Google Scholar] [CrossRef]
  65. Peterson, J.; Dixit, P. D.; Dill, K. A. A maximum entropy framework for non-exponential distributions. Proc. Natl. Acad. Sci. U.S.A 2013, 110, 20380–20385. [Google Scholar] [CrossRef]
  66. Livadiotis, G.; Desai, M. I.; Wilson, L. B., III. Generation of kappa distributions in solar wind at 1 AU. Astrophys. J. 2018, 853, 142. [Google Scholar] [CrossRef]
  67. Livadiotis, G. On the origin of polytropic behavior in space and astrophysical plasmas. Astrophys. J. 2019, 874(10), 390–6394. [Google Scholar] [CrossRef]
  68. Boltzmann, L. Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie. Wiener Berichte 1866, 53, 195–220. [Google Scholar]
  69. Gibbs, J. W. Elementary principles in statistical mechanics; Scribner’s sons, 1902. [Google Scholar]
  70. Maxwell, J. C. Illustrations of the dynamical theory of gases, on the motions and collisions of perfectly elastic spheres. Philos. Mag. 1860, 19, 19–32. [Google Scholar] [CrossRef]
  71. Livadiotis, G.; McComas, D.J. Invariant kappa distribution in space plasmas out of equilibrium. Astrophys. J. 2011, 741, 88. [Google Scholar] [CrossRef]
  72. Livadiotis, G. Kappa and q indices: Dependence on the degrees of freedom. Entropy 2015, 17, 2062–2081. [Google Scholar] [CrossRef]
  73. Livadiotis, G. Kappa Distribution: Theory & Applications in Plasmas, 1st ed.; Elsevier: Amsterdam, The Netherlands; London, UK; Atlanta, GA, USA, 2017. [Google Scholar]
  74. Livadiotis, G.; McComas, D.J. Understanding kappa distributions: A toolbox for space science and astrophysics. Space Sci. Rev. 2013, 175, 183–214. [Google Scholar] [CrossRef]
  75. Glansdorff, P.; Prigogine, I. On a general evolution criterion in macroscopic physics. Physica 1964, 30, 351–374. [Google Scholar] [CrossRef]
  76. Landauer, R. Irreversibility and heat generation in the computing process. IBM journal of research and development 1961, 5, 183–191. [Google Scholar] [CrossRef]
  77. Kolchinsky, A.; Corominas-Murtra, B. Decomposing information into copying versus transformation. Journal of the Royal Society Interface 2020, 17, 20190623. [Google Scholar] [CrossRef] [PubMed]
  78. Kempes, C.P.; Wolpert, D.; Cohen, Z; Pérez-Mercader, J. The thermodynamic efficiency of computations made in cells across the range of life. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 2017, 375, 20160343. [Google Scholar] [CrossRef] [PubMed]
  79. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  80. Havrda, J.; Charvát, F. Quantification method of classification processes: concept of structural α-entropy. Kybernetika 1967, 3, 30–35. [Google Scholar]
  81. Daróczy, Z. Generalized information functions. Inf. Control 1970, 16, 36–51. [Google Scholar] [CrossRef]
  82. Nivanen, L.; Le Méhauté, A.; Wang, Q.A. Reports on Mathematical Physics 2003, 52, 437. [CrossRef]
  83. Sears, W; Salinger, L. Thermodynamics, Kinetic Theory, and Statistical Thermodynamics, 3rd edition; Addison-Wesley, 1986. [Google Scholar]
  84. Sandler, S. I. An Introduction Applied to Statistical Thermodynamics; John Wiley & Sons: NJ, 2011. [Google Scholar]
  85. McQuarrie, D. A. Statistical Mechanics; University Science Books: Sausalito, CA, 2000. [Google Scholar]
  86. Lucia, U. Probability, ergodicity, irreversibility and dynamical systems. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 2008, 464, 1089–1104. [Google Scholar] [CrossRef]
  87. Popovic, M. Entropy change of open thermodynamic systems in self-organizing processes. Thermal Science 2014, vol 18(Nr 4), pp 1425–1432. [Google Scholar] [CrossRef]
  88. Anderson, P.W. More is Different. Science 1972, 177, 393–396. [Google Scholar]
  89. Bohm, D. A Suggested Interpretation of the Quantum Theory in Terms of “Hidden Variables” I. Phys. Rev. 1952, 85(2), 166–179. [Google Scholar] [CrossRef]
  90. Bohm, D. A Suggested Interpretation of the Quantum Theory in Terms of “Hidden Variables”, II. Phys. Rev. 1952, 85(2), 180–193. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.