A statistical inference of the Principle of Maximum Entropy Production

The maximization of entropy S within a closed system is accepted as an inevitability (as the second law of thermodynamics) by statistical inference alone. The Maximum Entropy Production Principle (MEPP) states that not only is S maximized, but Ṡ as well: a system will dissipate as fast as possible. There is still no consensus on the general validity of this MEPP, even though it shows remarkable explanatory power (both qualitatively and quantitatively) and has been empirically demonstrated in many domains. In this theoretical paper I provide a generalization of entropy gradients, to show that the MEPP actually follows from the same statistical inference as that of the second law of thermodynamics. For this generalization I only use the concepts of super-statespaces and microstate-density. These concepts also allow for the abstraction of 'self-organizing criticality' to a bifurcating local difference in this density, and for a generalization of the fundamentally unresolved concepts of 'chaos' and 'order'.


In the last few decades, several attempts have been made to explain the occurrence and stability of what we call 'living systems', or 'order out of chaos'. Most notably, the alleged conflict between 'autopoiesis' and the second law of thermodynamics is still unresolved [12]. Despite the impressive progress and multidisciplinary approaches in this field, there is still no conclusive consensus on any of these aspects [13].
This theoretical paper aims to provide this consensus, not by presenting yet another new theory to describe living systems, but by generalizing existing concepts to prove the so-called 'Maximum Entropy Production Principle', and then to infer the occurrence and local stability of autopoiesis as a statistical possibility.

The well-known Boltzmann equation for thermodynamic entropy in a closed system is

$$S = k_B \ln W,$$

where W indicates the number of microstates that correspond to a certain macrostate.
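As a concrete illustration, here is a minimal sketch in Python, assuming a hypothetical toy system of N particles spread over two halves of a box (the two-box model and all parameter values are illustrative choices, not taken from this paper). The macrostate "n particles in the left half" has W(n) = C(N, n) microstates:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100  # illustrative number of particles in the toy two-box system
for n_left in (0, 25, 50, 75, 100):
    W = math.comb(N, n_left)  # number of microstates for this macrostate
    S = K_B * math.log(W)     # Boltzmann entropy: S = k_B ln W
    print(f"n_left={n_left:3d}  W={W:.3e}  S={S:.3e} J/K")
# W (and hence S) peaks at the homogeneous macrostate n_left = N/2.
```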

The well-known Second Law of Thermodynamics tells us that the state of a system will tend towards macrostates with a higher W. The trivial example of such a macrostate is a homogeneous distribution.

Its irreversibility comes from statistical inference: the state of a system has a higher probability of being in a macrostate with many microstates than in a macrostate with few microstates. Therefore, over time, its W (and S) will increase up to some maximum.

Not only will a system maximize its entropy S over time, it will always do so as fast as possible. This maximization of Ṡ is called the 'maximum entropy production principle' (MEPP) [1][2]. This principle has been empirically validated in many domains, such as the atmosphere [11][6][9], crystallization [5], enzyme reactions [3] and the economy [4].
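This inference can be made tangible with a minimal sketch, reusing the same hypothetical two-box toy model with an Ehrenfest-style update rule (one randomly chosen particle switches halves per step; again an illustrative choice, not a model from this paper):

```python
import math
import random

random.seed(0)
N = 1000
n_left = 0  # start far from equilibrium: every particle in the right half

for step in range(5001):
    if step % 1000 == 0:
        S = math.log(math.comb(N, n_left))  # entropy in units of k_B
        print(f"step={step:4d}  n_left={n_left:4d}  S/k_B={S:6.1f}")
    # move one uniformly chosen particle to the opposite half
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
```

Starting far from equilibrium, almost every random move lands in a macrostate with more microstates, so S rises about as steeply as the dynamics allow before merely fluctuating near its maximum; this is, roughly, the statistical intuition behind both the second law and the MEPP.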

Intuitively this makes sense, but the principle is still debated, and its ubiquitous validity has not been demonstrated yet [10][7][8].

The materials and methods in this paper will provide the generalizations necessary to finally infer this ubiquitous validity with the same logic as that of the second law of thermodynamics, and then to statistically infer the occurrence of 'order'.

An example of a system with a super-statespace with a single solution is the textbook example of a closed thermodynamic system of some gas that dissipates energy. In this sense, the second law of thermodynamics can be illustrated by a trajectory of the state towards an area with a higher microstate-density. Figure 1 shows the entropic gradient of a system with a static statespace. It has a zero-dimensional super-statespace (with a single solution, and only a single-dimensional entropic gradient).

All kinds of external and internal dynamics can alter the statespace as the state finds the way of maximum entropy production. For example, biochemical structures can collide and form more stable structures, allowing for more complex dynamics. Local maxima of entropy production can emerge and disappear. These can mathematically be described as bifurcations from unstable to stable attractors (e.g. limit cycles), and vice versa, of non-linear entropy production dynamics. These local maxima are clearly visible in the heatmap in Figure 3, which is only a more extreme version of Figure 2. They are the sets of microstates x which satisfy:

$$\nabla \dot{S}(x) = 0.$$

If such a local maximum emerges (and the state of a system is caught in its basin of attraction), we recognize this process as a self-organizing criticality. Eventually such a maximum will disappear again (because of local saturation/exhaustion of entropy production, roughly speaking), and this implies a collapse of the self-organizing criticality.

For organic systems, this is called death.

The statespace then decreases in dimensionality again, and the state resumes its dissipative course, or gets caught in another basin of attraction.

In other words, these areas mark the emergence of what we call 'order', but from this generalization it is actually an acceleration towards more disorder (albeit local in space and time). Or rather: order and chaos are generalized into relative concepts (instead of discrete ones).
Figure 3. Regions with high entropic gradients towards higher-dimensional complexity, which allow for self-organizing criticalities.
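The bifurcation picture sketched above can be illustrated with a minimal toy computation, assuming the supercritical Hopf normal form dr/dt = mu*r - r^3 as a generic stand-in for non-linear entropy-production dynamics (the normal form and the parameter mu are illustrative assumptions, not derived in this paper):

```python
def limit_cycle_radius(mu: float, r0: float = 0.1,
                       dt: float = 0.01, steps: int = 20000) -> float:
    """Euler-integrate the Hopf radial equation dr/dt = mu*r - r**3
    from r0 and return the final radius."""
    r = r0
    for _ in range(steps):
        r += dt * (mu * r - r**3)
    return r

for mu in (-0.5, 0.0, 0.25, 1.0):
    print(f"mu={mu:5.2f}  r_final={limit_cycle_radius(mu):.3f}")
# For mu > 0 the radius settles at sqrt(mu): a stable limit cycle has
# bifurcated into existence. For mu <= 0 the state decays back to the
# origin: the local attractor has collapsed again.
```

As the control parameter mu is raised through zero a stable attractor emerges, and as it drops back the attractor vanishes; this is the kind of emergence and collapse the text above ascribes to local maxima of entropy production.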

Dependency on initial state and dynamics
Whether or not the state of a system actually reaches such a self-organizing criticality depends on its initial state, as shown in Figure 4.
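A minimal sketch of this dependence on the initial state, assuming the bistable toy flow dx/dt = x - x^3 (chosen only because it has two attractors separated by a sharp basin boundary; it is not the system of Figure 4):

```python
def settle(x0: float, dt: float = 0.01, steps: int = 5000) -> float:
    """Euler-integrate the bistable flow dx/dt = x - x**3 from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

for x0 in (-1.5, -0.01, 0.01, 1.5):
    print(f"x0={x0:6.2f}  ->  attractor ~ {settle(x0):+.3f}")
# Initial states on either side of the basin boundary x = 0 end up in
# different attractors (x = -1 vs x = +1): the same dynamics, but a
# different fate depending only on where the state starts.
```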

A tree is a good example of a fractal superstructure. The dynamics of the state-space of this tree depend on at least its surrounding ecosystem and the weather. But these are also nonlinear dynamic systems, with highly intricate dynamics of their own. On the micro-level, all the cells of the tree (and in between, its leaves and its seeds) can be seen as subsystems with their own dynamics and dynamic state-spaces. So, ultimately, the choice of what you consider a (sub)system is strictly arbitrary, as these dynamics are all related. The same goes for our economic system, or a technological system. All these systems can be seen as superstructures: hierarchical ensembles with some fractal interdependency of non-linear dynamics, from micro- to macro-level.

And from the same statistical inference as described above, eventually all these related dynamics will tend towards an optimal distribution of local maxima of entropy production throughout the total ensemble.

In this article it is demonstrated that the maximization of entropy production that we observe in many domains actually follows from the same statistical inference as that of the second law of thermodynamics.

Ultimately, this understanding may also help in developing much more effective and empirically based policies at many levels, as every policy then actually becomes a matter of increasing or decreasing complexity in some subregion, to keep it within thresholds that are aligned with some normative valuation framework, for example. It suggests that to affect certain behavioral dynamics, it is much more effective to alter the facilitating structural complexity than to try to regulate the behavior directly.