Preprint
Article

This version is not peer-reviewed.

Entropic Force and Self-Organization: A Colloidal Boltzmann Machine Model

Submitted: 13 July 2025

Posted: 21 July 2025


Abstract
Traditionally, the correction factor $\ln N!$, which resolves the Gibbs paradox, has been attributed to the quantum indistinguishability of identical particles. However, recent advances in colloid science reveal that this factor is also essential for understanding the collective behavior of classically distinguishable particles. This paper demonstrates that the inclusion of the correction factor $\ln N!$ gives rise to the emergence of a zero-point energy, which may induce a distinct type of entropic force. Building on this finding, we propose a Boltzmann machine consisting of classically distinguishable colloidal particles. Through this model, we demonstrate how the zero-point energy induced by the factor $\ln N!$ can drive the self-organizing evolution of colloidal systems in an environment of random fluctuations, such as Brownian motion. This mechanism can be applied to various fields of information technology, including information storage, intelligent coding, and cryptography.
Subject: Physical Sciences - Biophysics

1. Introduction

The concept of entropy has long served as a cornerstone for understanding the transition from disorder to order in both physical and biological systems [1,2,3,4]. Historically, two pivotal challenges, Maxwell's demon and the Gibbs paradox, have profoundly shaped the foundations of thermodynamics by probing the limits of classical entropy interpretations. While Maxwell's demon exposed the interplay between information and thermodynamics, the Gibbs paradox revealed a critical inconsistency in entropy calculations during the mixing of ideal gases [5,6]. Traditional resolutions attributed the paradox to the quantum indistinguishability of identical particles [7], yet this explanation falters when extended to classical systems [8,9,10], such as colloidal suspensions [11,12], where particles remain microscopically distinguishable. Consider a typical colloidal suspension [12]: the system comprises approximately $10^{12}$ micrometer-sized particles, each composed of roughly $10^{12}$ atoms. Owing to differences in shape and size [13], no two colloidal particles are exactly the same, making them classically distinguishable particles. Yet, surprisingly, the experimentally observed phase behavior matches perfectly with the statistical-mechanical predictions based on the assumption of "identical particles" [11,12]. This apparent contradiction is particularly evident in colloid crystallization experiments: when the volume fraction of a monodisperse hard-sphere colloidal suspension reaches a critical value, the system spontaneously crystallizes into an ordered structure. It has been argued that an entropy expression incorporating the correction factor $\ln N!$ can describe this self-organization process [12].
In the present study, we further demonstrate that the inclusion of $\ln N!$ gives rise to the emergence of a zero-point energy, which may induce a distinct type of entropic force. An entropic force is not a fundamental interaction, but rather a macroscopic force of statistical origin [14], arising from the maximization of entropy within a system. Since the entropic force points in the direction of maximum entropy, it exhibits self-organizing characteristics when acting on macroscopic systems. In this regard, it has already been well established as a key driver of self-organization in soft matter systems [2,3,4,15,16,17,18]. Unlike hard matter, soft matter exhibits more diverse and prevalent self-organization phenomena [19,20,21,22], extending even to social and biological systems [23,24,25,26]. In this context, entropic forces have been invoked to explain emergent properties of life [27,28] and intelligence [29,30]. Building on these insights, we propose a Boltzmann machine consisting of classically distinguishable colloidal particles. Through this model, we demonstrate that the zero-point energy induced by the factor $\ln N!$ may serve as a mechanism driving the self-organizing evolution of colloidal systems in an environment of random fluctuations. In the existing literature [31], an atomic Boltzmann machine has been realized; by contrast, we propose a colloidal version in which the colloidal particles are classically distinguishable and thereby serve as observable neurons.
The remainder of this paper is organized as follows. Section 2 introduces the statistical-mechanical entropy framework, distinguishing between the Gibbs and Boltzmann entropy formulations. Section 3 expresses the Gibbs entropy in terms of macro-thermodynamic variables. Section 4 does the same for the Boltzmann entropy. Section 5 demonstrates how $\ln N!$ induces an entropic force and examines its manifestations in ideal gases. Section 6 proposes a colloidal version of the Boltzmann machine. Finally, Section 7 concludes the paper.

2. Statistical-Mechanical Entropy

We begin by considering classically distinguishable particles obeying the Boltzmann distribution (see page 140 in [32]):
$$a_i = g_i e^{-\alpha - \beta \varepsilon_i}, \qquad (1)$$

with $i = 1, 2, \ldots, n$ and $\varepsilon_1 < \varepsilon_2 < \cdots < \varepsilon_n$, where $a_i$ represents the occupation number of the $i$-th energy level $\varepsilon_i$, $g_i$ denotes the degeneracy of the $i$-th energy level, and $\alpha$ and $\beta$ are thermodynamic parameters.
In classical statistical mechanics, two fundamental entropy formulations emerge: the Gibbs entropy $S_G$ and the Boltzmann entropy $S_B$. Using the Boltzmann distribution (1), these entropies can be expressed as [32]:
$$S_G = k_B \ln \Omega_B - k_B \ln N!, \qquad (2)$$
and
$$S_B = k_B \ln \Omega_B, \qquad (3)$$
where the microstate count $\Omega_B$ is given by:

$$\Omega_B = \frac{N!}{\prod_{i=1}^{n} a_i!} \prod_{i=1}^{n} g_i^{a_i}. \qquad (4)$$
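As a concrete illustration of the counting in equation (4): for $N = 3$ distinguishable particles distributed over two non-degenerate levels ($g_1 = g_2 = 1$) with occupations $a_1 = 2$ and $a_2 = 1$,

$$\Omega_B = \frac{3!}{2!\,1!} \cdot 1^{2} \cdot 1^{1} = 3,$$

corresponding to the three possible choices of which labeled particle occupies the upper level.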
The key difference between the two entropy expressions (2) and (3) lies in the $\ln N!$ term, which serves as the correction factor resolving the Gibbs paradox. Historically, following Planck's seminal work [7], this term was interpreted as accounting for the indistinguishability of identical particles. This interpretation suggests that the Gibbs entropy (2) describes indistinguishable particles, whereas the Boltzmann entropy (3) applies to distinguishable particles. However, recent research advances in colloid science have revealed that the correction factor $\ln N!$ remains crucial even for systems of classically distinguishable particles [6,8,9,10,11,12]. In the present study, we demonstrate that this correction factor gives rise to a distinct entropic force. To elucidate this, we need to express both entropy formulations in terms of four macroscopic thermodynamic variables: particle number, internal energy, temperature, and chemical potential. This can be accomplished using the Boltzmann distribution (1). Under this distribution, the particle number $N$ and the internal energy $U$ are given by:
$$N = \sum_{i=1}^{n} a_i = \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i}, \qquad (5)$$

$$U = \sum_{i=1}^{n} a_i \varepsilon_i = \sum_{i=1}^{n} g_i \varepsilon_i e^{-\alpha - \beta \varepsilon_i}. \qquad (6)$$

3. Gibbs Entropy Picture

We first express the Gibbs entropy formula (2) in terms of macro-thermodynamic variables. Without loss of generality, we assume that the energy levels $\varepsilon_i$ depend on the volume $V$; that is, $\varepsilon_i = \varepsilon_i(V)$ for $i = 1, 2, \ldots, n$. Based on this assumption, we combine equations (5) and (6) to obtain [10,30,33,34,35,36]:
$$dU = -\frac{\alpha}{\beta}\, dN + \frac{1}{\beta}\, d\left( N + \alpha N + \beta U \right) + \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i} \frac{d\varepsilon_i}{dV}\, dV. \qquad (7)$$
A detailed derivation of equation (7) is provided in Appendix A of [10].
On the other hand, when $a_i \gg 1$, applying Stirling's approximation $\ln N! \approx N \ln N - N$ to equation (4) yields [10]:

$$\ln \Omega_B = \ln N! + N + \alpha N + \beta U. \qquad (8)$$
The detailed derivation of equation (8) is provided in Appendix B of [10].
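For completeness, a brief sketch of that calculation: applying Stirling's approximation to the factorials in equation (4) and then using the distribution (1), for which $\ln a_i = \ln g_i - \alpha - \beta \varepsilon_i$, gives

$$\ln \Omega_B = \ln N! - \sum_{i=1}^{n} \ln a_i! + \sum_{i=1}^{n} a_i \ln g_i \approx \ln N! + \sum_{i=1}^{n} a_i \left( \alpha + \beta \varepsilon_i + 1 \right) = \ln N! + N + \alpha N + \beta U.$$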
Equation (8) allows us to rewrite the Gibbs entropy formula (2) equivalently as:
$$S_G = k_B \left( N + \alpha N + \beta U \right). \qquad (9)$$
Substituting the Gibbs entropy formula (9) into equation (7) gives:
$$dU = -\frac{\alpha}{\beta}\, dN + \frac{1}{k_B \beta}\, dS_G + \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i} \frac{d\varepsilon_i}{dV}\, dV. \qquad (10)$$
The fundamental equation of thermodynamics is well known [37]:
$$dU = \mu\, dN + T\, dS - P\, dV, \qquad (11)$$

where $\mu$ denotes the chemical potential, $T$ the temperature, $P$ the pressure, and $S$ the thermodynamic entropy.
By comparing equations (10) and (11), we obtain:
$$\mu = -\frac{\alpha}{\beta}, \qquad (12)$$

$$T = \frac{1}{k_B \beta}. \qquad (13)$$
Substituting equations (12) and (13) into the Gibbs entropy formula (9) yields:
$$S_G = k_B N - \frac{\mu N}{T} + \frac{U}{T}, \qquad (14)$$

which expresses the Gibbs entropy in terms of four macro-thermodynamic variables: $N$, $U$, $T$, and $\mu$.
The mathematical rigor of the derivation of equation (14) is established in [10]. In this paper, we refer to equation (14) as the Gibbs entropy picture. Under this picture, the internal energy $U = U_G$ is given by:

$$U_G = \mu N + \left( S - k_B N \right) T. \qquad (15)$$
Furthermore, using equations (12) and (13), the Boltzmann distribution (1) under the Gibbs entropy picture takes the well-known form:
$$a_i = g_i e^{-\frac{\varepsilon_i - \mu}{k_B T}}, \qquad (16)$$

where $i = 1, 2, \ldots, n$.

4. Boltzmann Entropy Picture

To express the Boltzmann entropy in terms of macro-thermodynamic variables, we begin by reformulating equation (7) as follows:
$$dU = -\frac{\alpha}{\beta}\, dN - \frac{1}{\beta}\, d\!\left( \ln N! \right) + \frac{1}{\beta}\, d\left( \ln N! + N + \alpha N + \beta U \right) + \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i} \frac{d\varepsilon_i}{dV}\, dV. \qquad (17)$$
Applying Stirling's approximation $\ln N! \approx N \ln N - N$ to equation (17) yields:

$$dU = -\frac{\alpha + \ln N}{\beta}\, dN + \frac{1}{\beta}\, d\left( N \ln N + \alpha N + \beta U \right) + \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i} \frac{d\varepsilon_i}{dV}\, dV. \qquad (18)$$
Using equation (8) and the Stirling approximation, we can rewrite the Boltzmann entropy formula (3) as:
$$S_B = k_B \left( N \ln N + \alpha N + \beta U \right). \qquad (19)$$
Substituting the Boltzmann entropy formula (19) into equation (18) gives:
$$dU = -\frac{\alpha + \ln N}{\beta}\, dN + \frac{1}{k_B \beta}\, dS_B + \sum_{i=1}^{n} g_i e^{-\alpha - \beta \varepsilon_i} \frac{d\varepsilon_i}{dV}\, dV. \qquad (20)$$
Comparing equations (11) and (20), we obtain:
$$\mu = -\frac{\alpha + \ln N}{\beta}, \qquad (21)$$

$$T = \frac{1}{k_B \beta}. \qquad (22)$$
Substituting equations (21) and (22) into the Boltzmann entropy formula (19) yields:
$$S_B = -\frac{\mu N}{T} + \frac{U}{T}, \qquad (23)$$

which expresses the Boltzmann entropy in terms of four macro-thermodynamic variables: $N$, $U$, $T$, and $\mu$.
The mathematical rigor of the derivation of equation (23) is established in [10]. In this paper, we refer to equation (23) as the Boltzmann entropy picture. Under this picture, the internal energy $U = U_B$ is given by:

$$U_B = \mu N + S T. \qquad (24)$$
Furthermore, using equations (21) and (22), the Boltzmann distribution (1) under the Boltzmann entropy picture takes the particular form:
$$a_i = N g_i e^{-\frac{\varepsilon_i - \mu}{k_B T}}, \qquad (25)$$

where $i = 1, 2, \ldots, n$.
The Boltzmann distribution (25) under the Boltzmann entropy picture differs from the Boltzmann distribution (16) under the Gibbs entropy picture by a factor of $N$. This distinction is crucial, as it highlights the different assumptions underlying each entropy formulation. Later, we will demonstrate that the Boltzmann distribution (25) under the Boltzmann entropy picture describes distinguishable particles that are "localized", i.e., restricted to specific positions.
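The factor of $N$ can be traced directly to equation (21): substituting $\alpha = -\beta \mu - \ln N$ and $\beta = 1/(k_B T)$ into the distribution (1) gives

$$a_i = g_i e^{-\alpha - \beta \varepsilon_i} = g_i e^{\ln N + \beta \mu - \beta \varepsilon_i} = N g_i e^{-\frac{\varepsilon_i - \mu}{k_B T}},$$

which is exactly equation (25).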

5. Entropic Force Arising from the Correction Factor ln N!

By comparing equations (15) and (24), we observe that the correction factor $\ln N!$ introduces a contribution of $-k_B N T$ to the internal energy $U_G$ under the Gibbs entropy picture. In this section, we demonstrate that this term fundamentally arises from an entropic force.
As a macroscopic effective force, the entropic force is characterized by two features [14]:
(A). The entropic force points in the direction of increasing entropy.
(B). The entropic force is proportional to the temperature.¹
To elucidate how the correction factor $\ln N!$ induces such an entropic force, we analyze the system's free energy:

$$F = U - T S. \qquad (26)$$
In thermodynamic equilibrium, a system minimizes its free energy. We evaluate the free energy under both the Boltzmann and Gibbs entropy formulations.
Under the Boltzmann entropy picture, the free energy is expressed as:
$$F_B = U_B - T S, \qquad (27)$$

where $U_B$ is defined by equation (24).
Under the Gibbs entropy picture, the free energy becomes:
$$F_G = U_G - T S, \qquad (28)$$

with $U_G$ given by equation (15).
From equations (15) and (24), we derive the relationship:
$$U_G = U_B - k_B N T, \qquad (29)$$
allowing equation (28) to be rewritten as:
$$F_G = U_B - T S_{\mathrm{eff}}, \qquad (30)$$
where
$$S_{\mathrm{eff}} = S + k_B N \qquad (31)$$
represents an effective entropy.
Since the term $k_B N T$ in equation (29) arises from the $\ln N!$ correction, a comparison between equations (27) and (30) reveals that including $\ln N!$ effectively increases the entropy by $k_B N$. This increase in entropy, which we refer to as an "effective entropy enhancement," reduces the free energy $F_G$ relative to $F_B$, with the magnitude of the reduction proportional to the temperature $T$. These observations align with the defining features of entropic forces, particularly their temperature dependence and role in increasing entropy, strongly suggesting that the $k_B N T$ term represents the work done by an entropic force. By lowering $F_G$, this force drives the system toward a more thermodynamically stable state with minimized free energy.
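Written out explicitly, equations (27)-(29) give

$$F_G = U_G - T S = \left( U_B - k_B N T \right) - T S = F_B - k_B N T,$$

so at any temperature $T > 0$ the Gibbs-picture free energy lies below the Boltzmann-picture value by exactly $k_B N T$.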
Here, we demonstrate that, for an ideal gas, the entropic force induced by the correction factor $\ln N!$ manifests as a repulsive interaction among particles. An ideal gas is an extensive system, meaning that [37]

$$U(\lambda N, \lambda S, \lambda V) = \lambda\, U(N, S, V). \qquad (32)$$
By Euler’s theorem on homogeneous functions, equation (32) transforms the fundamental thermodynamic equation (11) into:
$$U = \mu N + T S - P V, \qquad (33)$$

where $\mu = \partial U(N, S, V)/\partial N$, $T = \partial U(N, S, V)/\partial S$, and $P = -\partial U(N, S, V)/\partial V$.
Substituting equation (33) into equation (15) yields:
$$P V = k_B N T, \qquad (34)$$
which is the ideal gas law.
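Spelling out the substitution: equating the Euler form (33) with the Gibbs-picture internal energy (15),

$$\mu N + T S - P V = \mu N + \left( S - k_B N \right) T \quad \Longrightarrow \quad P V = k_B N T.$$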
Equation (34) reveals that the term $k_B N T$ generates a non-zero pressure among ideal gas particles at $T \neq 0$. This pressure, which emerges from the $\ln N!$ correction, represents a purely statistical entropic force that is fundamentally different from conventional interaction-based pressures. The critical role of this correction becomes evident when considering the Boltzmann entropy picture: omitting $\ln N!$ reduces the internal energy to that given by equation (24), which, when combined with equation (33), yields:
$$P V = 0. \qquad (35)$$
This null result confirms that the pressure in equation (34) is indeed an entropic force arising from the $\ln N!$ term.
At absolute zero temperature, $T = 0$, equations (34) and (35) coincide, implying that the Gibbs entropy (2) and the Boltzmann entropy (3) have the same effect in this context. Absolute zero temperature represents the complete absence of thermal motion and, in the classical sense, can be understood as particles being in a state of absolute rest. Therefore, the Boltzmann entropy (3) essentially describes classical particles, albeit classical particles that are "localized", i.e., restricted to specific positions. Hence, we propose using the Boltzmann distribution under the Boltzmann entropy picture, i.e., equation (25), to describe localized, distinguishable particles.
The above analysis assumes ideal gases. We now extend our discussion to colloidal systems.

6. Colloidal Boltzmann Machine

In Section 5, we established that the term $k_B N T$ originates from an entropic force associated with the correction factor $\ln N!$. For an ideal gas, this entropic force manifests as a repulsive interaction among particles, implying the incompressibility of the gas. This observation further suggests the presence of a zero-point energy in the system. To verify that the correction factor $\ln N!$ indeed leads to a zero-point energy, we reformulate equation (15) under the Gibbs entropy picture as a partial differential equation:
$$N \frac{\partial U(N, S, V)}{\partial N} + \left( S - k_B N \right) \frac{\partial U(N, S, V)}{\partial S} = U(N, S, V). \qquad (36)$$
Recognizing that $\mu = \partial U(N, S, V)/\partial N$ and $T = \partial U(N, S, V)/\partial S$, equation (36) becomes equivalent to equation (15). Here, the term $k_B N$ explicitly encodes the contribution of $\ln N!$.
The general solution to the partial differential equation (36) is given by:
$$\Phi\!\left( \frac{U}{N},\; \frac{S}{k_B N} + \ln N \right) = 0, \qquad (37)$$

where $\Phi(x, y)$ is a smooth function. The mathematical details for deriving this solution can be found in references [33,34,35,36]; a sketch via the method of characteristics is given below.
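Equation (36) is a first-order linear partial differential equation whose characteristic equations read

$$\frac{dN}{N} = \frac{dS}{S - k_B N} = \frac{dU}{U}.$$

Two independent first integrals of these equations are $U/N$ and $S/(k_B N) + \ln N$; indeed, along the characteristics, $d\!\left( \frac{S}{k_B N} + \ln N \right) = \frac{dS}{k_B N} - \frac{S\, dN}{k_B N^{2}} + \frac{dN}{N} = 0$ and $d\!\left( \frac{U}{N} \right) = 0$. Any smooth relation between these two invariants therefore solves equation (36), which is precisely the statement of equation (37).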
We now demonstrate that equation (37) ensures the existence of zero-point energy.

Proof. 

Zero-point energy requires $U \neq 0$. Assuming $U = 0$, equation (37) yields $S = \mathrm{const} \cdot k_B N - k_B N \ln N$. For $N \gg 1$, this expression becomes negative, violating the non-negativity of the entropy ($S \geq 0$). This contradiction confirms the necessity of zero-point energy. ■
Without the correction factor $\ln N!$, however, the zero-point energy would vanish.² The proof above merely demonstrates the existence of the zero-point energy. Later, we will show that for systems at constant temperature, the zero-point energy can be derived.
To explore the physical significance of the zero-point energy, we consider a Boltzmann machine composed of classically distinguishable colloidal particles, where each particle acts as a neuron. In this framework, the neural network's internal energy $U = U(\mathbf{z})$ is expressed as a function of the colloidal state variables. Let $\mathbf{z} = (z_1, z_2, \ldots, z_N)$ represent the state vector of $N$ colloidal particles, where $z_i = 1$ denotes an "active" colloid and $z_i = 0$ indicates an "inactive" state.³ Within this representation, the internal energy $U(\mathbf{z})$ can be expanded as a Taylor series:
$$U(\mathbf{z}) = -\sum_{i=1}^{N} c_i z_i - \sum_{i=1}^{N} \sum_{j=1}^{N} \omega_{ij} z_i z_j, \qquad (38)$$

where $\omega_{ij}$ represents the weights and $c_i$ the biases. Higher-order terms have been omitted in equation (38).
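As a concrete illustration (not part of the original model specification; the weights and biases below are arbitrary placeholders), the energy (38) of a given binary state vector can be evaluated directly:

```python
import numpy as np

def colloidal_energy(z, w, c):
    """Internal energy U(z) = -sum_i c_i z_i - sum_{i,j} w_ij z_i z_j, cf. equation (38)."""
    z = np.asarray(z, dtype=float)
    return -c @ z - z @ w @ z

# Hypothetical example with N = 4 colloidal "neurons" and random placeholder parameters.
rng = np.random.default_rng(0)
N = 4
w = rng.normal(size=(N, N))
w = 0.5 * (w + w.T)        # symmetric couplings (an assumption, not stated in the text)
np.fill_diagonal(w, 0.0)   # no self-coupling
c = rng.normal(size=N)
print(colloidal_energy([1, 0, 1, 0], w, c))
```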
Prior work [31] has employed atomic spin systems to emulate Boltzmann machines. Here, we propose a colloidal version in which colloidal particles are classically distinguishable, thereby serving as observable neurons. For instance, colloidal particles with sizes on the micrometer scale can be individually tracked. Thus, the activity of colloidal neurons can be directly monitored in experiments.
Since colloidal systems with negligible interactions obey the Boltzmann distribution (16), the joint probability distribution of $N$ colloidal particles in the state vector $\mathbf{z} = (z_1, z_2, \ldots, z_N)$ can be expressed as [33,36]:
$$P(\mathbf{z}) = \exp\!\left( -\frac{S(\mathbf{z})}{k_B} \right), \qquad (39)$$

where the entropy $S(\mathbf{z})$ is determined by the general solution (37).
The derivation of equation (39) is detailed in references [33,34,35], and its mathematical rigor is established in [36]. To elucidate its physical significance, this equation can be interpreted as the information entropy formula $S(\mathbf{z}) = -k_B \ln P(\mathbf{z})$, which quantifies the uncertainty associated with the microscopic state $\mathbf{z}$ of the system.
We consider a colloidal system in thermodynamic equilibrium, which necessitates a constant temperature $T$. Under this condition, the general solution (37) reduces to the following form⁴ [33,36]:
$$S = \frac{U}{T} - k_B N \ln N. \qquad (40)$$
Applying Stirling's approximation $\ln N! \approx N \ln N - N$, equation (40) becomes:

$$S = \frac{U}{T} - k_B \ln N!. \qquad (41)$$
Since the entropy $S$ is non-negative, equation (41) directly leads to the inequality:

$$U \geq k_B T \ln N!, \qquad (42)$$
where the equality condition defines the zero-point energy of the constant-temperature system:
$$U_0 = k_B T \ln N!. \qquad (43)$$

Equation (43) highlights that the zero-point energy arises from the collective behavior of the colloidal particles. For instance, with a single particle ($N = 1$), $U_0 = k_B T \ln 1! = 0$, confirming its dependence on system size.
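For a rough sense of scale (an illustrative calculation with assumed values of $N$ and $T$, not taken from the paper), the zero-point energy (43) can be evaluated numerically:

```python
import math

k_B = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                                 # assumed temperature, K (illustrative)
for N in (1, 10, 100):
    U0 = k_B * T * math.lgamma(N + 1)     # lgamma(N + 1) = ln(N!), so U0 = k_B * T * ln(N!)
    print(f"N = {N:3d}:  U_0 ≈ {U0:.2e} J")
# N = 1 gives U_0 = 0, as noted above; for large N, U_0 grows roughly like k_B * T * N * ln(N).
```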
Substituting equation (41) into (39) yields a Boltzmann machine model⁵ [33]:

$$P(\mathbf{z}) = \frac{1}{Z} \exp\!\left( -\frac{U(\mathbf{z})}{k_B T} \right), \qquad (44)$$
where the partition function $Z$ is given by:

$$Z = \sum_{\mathbf{z}} \exp\!\left( -\frac{U(\mathbf{z})}{k_B T} \right) = \frac{1}{N!}. \qquad (45)$$

Here $\sum_{\mathbf{z}} (\cdot)$ denotes summation over all possible configurations of $\mathbf{z}$.
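To make the substitution explicit: inserting equation (41) into equation (39) gives

$$P(\mathbf{z}) = \exp\!\left( -\frac{U(\mathbf{z})}{k_B T} + \ln N! \right) = N!\, \exp\!\left( -\frac{U(\mathbf{z})}{k_B T} \right),$$

which matches equation (44) with $1/Z = N!$; the normalization $\sum_{\mathbf{z}} P(\mathbf{z}) = 1$ then enforces the constraint $\sum_{\mathbf{z}} \exp\!\left( -U(\mathbf{z})/k_B T \right) = 1/N!$ stated in equation (45).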
From equation (44), the conditional probabilities for particle states follow [41]:
$$P(z_i = 1 \mid \mathbf{z}_{-i}) = \frac{1}{1 + \exp\!\left( -\dfrac{\sum_{j=1}^{N} \omega_{ij} z_j + c_i}{k_B T} \right)}, \qquad (46)$$

$$P(z_i = 0 \mid \mathbf{z}_{-i}) = 1 - P(z_i = 1 \mid \mathbf{z}_{-i}), \qquad (47)$$

where $\mathbf{z}_{-i}$ represents the states of all particles except the $i$-th one.
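A minimal sketch of this stochastic update rule (assuming, as in a standard Boltzmann machine, a symmetric weight matrix with zero diagonal; all names and parameter values are illustrative):

```python
import numpy as np

def gibbs_update(z, w, c, k_B_T, rng):
    """One asynchronous sweep of the stochastic update rule (46)-(47).

    Each colloid i is set to 1 with probability
    1 / (1 + exp(-(sum_j w_ij z_j + c_i) / (k_B T))).
    """
    N = len(z)
    for i in rng.permutation(N):
        field = w[i] @ z + c[i]                       # local field acting on colloid i
        p_on = 1.0 / (1.0 + np.exp(-field / k_B_T))   # equation (46)
        z[i] = 1 if rng.random() < p_on else 0        # equation (47) covers the complement
    return z

# Example usage with arbitrary placeholder parameters.
rng = np.random.default_rng(1)
N = 6
w = rng.normal(size=(N, N)); w = 0.5 * (w + w.T); np.fill_diagonal(w, 0.0)
c = rng.normal(size=N)
z = rng.integers(0, 2, size=N)
z = gibbs_update(z, w, c, k_B_T=1.0, rng=rng)
```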
In the low-temperature limit ($T \to 0$), equation (46) reduces to a step function: $P(z_i = 1 \mid \mathbf{z}_{-i}) \to 1$ when $\sum_{j=1}^{N} \omega_{ij} z_j + c_i \geq 0$, and $P(z_i = 1 \mid \mathbf{z}_{-i}) \to 0$ otherwise. In this limit, the Boltzmann machine (44) converges to a Hopfield network, in which the state of the $i$-th particle is governed by the McCulloch-Pitts rule:
$$z_i = \begin{cases} 1, & \sum_{j=1}^{N} \omega_{ij} z_j + c_i \geq 0, \\[4pt] 0, & \sum_{j=1}^{N} \omega_{ij} z_j + c_i < 0. \end{cases} \qquad (48)$$
Even though⁶ $T \to 0$ (with $T \neq 0$), for $N \gg 1$ the zero-point energy $U_0$ given by equation (43) remains finite. Like pollen grains, colloidal particles are subject to random fluctuations [21,22], such as Brownian motion. Brownian motion thus introduces stochastic state changes, but the Hopfield network ensures that each update monotonically decreases the system's energy [42]. This process halts when $U$ approaches $U_0$, as mandated by equation (42). Consequently, the system settles into an equilibrium vector $(z_1^*, z_2^*, \ldots, z_N^*)$ that prevents $U < U_0$. Crucially, $U \geq U_0$ requires at least one particle to remain active ($z_i^* = 1$); otherwise $U = 0$, violating equation (42). Thus, the scenario in which $z_i^* = 0$ for all $i = 1, \ldots, N$ is forbidden. This robustness is analogous to the incompressibility observed in ideal gases, which is due to entropic forces and requires $T > 0$. For example, when $T = 0$, the equilibrium vector $(z_1^*, z_2^*, \ldots, z_N^*)$ would become trivial, with $z_i^* = 0$ for all $i = 1, \ldots, N$, according to equation (43).
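The following is a minimal simulation sketch of the scenario described above, with arbitrary placeholder weights and a crude model of Brownian perturbations as random bit flips; it merely illustrates the alternation between noise and deterministic settling under rule (48), not a quantitative colloidal simulation:

```python
import numpy as np

def hopfield_sweep(z, w, c):
    """One asynchronous sweep of the deterministic McCulloch-Pitts rule, equation (48)."""
    for i in range(len(z)):
        z[i] = 1 if w[i] @ z + c[i] >= 0 else 0
    return z

def relax_with_noise(z, w, c, rng, n_sweeps=200, flip_prob=0.05):
    """Alternate random flips (a stand-in for Brownian kicks) with deterministic settling."""
    energies = []
    for _ in range(n_sweeps):
        flips = rng.random(len(z)) < flip_prob    # random "Brownian" perturbation
        z = np.where(flips, 1 - z, z)
        z = hopfield_sweep(z, w, c)               # settle under equation (48)
        energies.append(-c @ z - z @ w @ z)       # energy U(z) of equation (38)
    return z, energies

# Illustrative run with arbitrary symmetric placeholder weights (not fitted to any data).
rng = np.random.default_rng(0)
N = 8
w = rng.normal(size=(N, N)); w = 0.5 * (w + w.T); np.fill_diagonal(w, 0.0)
c = rng.normal(size=N)
z_final, energy_trace = relax_with_noise(rng.integers(0, 2, size=N), w, c, rng)
```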
The convergence of the state vector $(z_1, z_2, \ldots, z_N)$ illustrates a self-organization process⁷: under random fluctuations (such as Brownian motion), the system evolves from disorder (high-energy states) to order (a low-energy equilibrium) with at least one active colloid. This mechanism can be applied to various fields of information technology, including information storage, intelligent coding, and cryptography. For example, owing to the robustness of the equilibrium vector $(z_1^*, z_2^*, \ldots, z_N^*)$, stored information remains intact under random shocks. By contrast, if the zero-point energy $U_0$ vanished, the equilibrium vector $(z_1^*, z_2^*, \ldots, z_N^*)$ would become trivial, with $z_i^* = 0$ for all $i = 1, \ldots, N$, and such applications would become infeasible.

7. Conclusion

Our analysis demonstrates that incorporating the correction factor $\ln N!$ gives rise to the emergence of a zero-point energy, which may induce a distinct type of entropic force. In ideal gases, this entropic force manifests as an effective repulsive interaction between particles, accounting for their incompressibility. Building on this insight, we have proposed a novel Boltzmann machine composed of classically distinguishable colloidal particles. Through this model, we have demonstrated that the zero-point energy induced by the factor $\ln N!$ may serve as a mechanism driving the self-organizing evolution of colloidal systems in low-temperature regimes when subjected to random fluctuations, such as Brownian motion. This mechanism can be applied to various fields of information technology, including information storage, intelligent coding, and cryptography.

Funding

This research was supported by the Research Project on Education and Teaching Reform at Southwest University (Grant No. 2021JY045). Correspondence to: taoyingyong@swu.edu.cn.
1. The entropic force typically increases with temperature. However, an inverse dependence has also been reported; see, for example, [38].

2. When the correction factor $\ln N!$ is absent, equation (24) holds. It is expressed as $N \frac{\partial U(N,S,V)}{\partial N} + S \frac{\partial U(N,S,V)}{\partial S} = U(N,S,V)$, which has the general solution $\Phi\!\left( \frac{U}{N}, \frac{S}{N} \right) = 0$. In this scenario, we may have $U = 0$, so that $S = \mathrm{const} \cdot N$; that is, the zero-point energy vanishes.

3. The binary states $z_i \in \{0, 1\}$ may be physically realized in colloidal systems through, for example, active switching between two distinct interaction states, such as size or conformational changes, as demonstrated experimentally and theoretically in [39,40].

4. This indicates that $\partial U / \partial S = T$ is a constant. Substituting this condition into equation (37) yields equation (40).

5. Equation (44) is referred to as the "self-referential Boltzmann machine" [33], which is used to emulate self-motivated systems in a biological context. Owing to the constraint condition (45), the self-referential Boltzmann machine differs from the traditional Boltzmann machine.

6. Here, the condition $T \to 0$ is primarily a requirement of the theoretical analysis. In practical applications, a finite temperature can ensure that the McCulloch-Pitts rule (48) is approximately satisfied.

7. This behavior parallels the self-assembly mechanisms observed in colloidal molecules [18,19,20].

References

1. Schrödinger, E. What is Life? With Mind and Matter and Autobiographical Sketches; Cambridge University Press, 1992.
2. Frenkel, D. Entropy-driven phase transitions. Physica A 1999, 263, 26–38.
3. Frenkel, D. Order through entropy. Nature Materials 2015, 14, 9–12.
4. Cates, M.E. Entropy stabilizes open crystals. Nature Materials 2013, 12, 179–180.
5. Gibbs, J.W. Elementary Principles in Statistical Mechanics; Charles Scribner's Sons: New York, 1902.
6. Jaynes, E.T. The Gibbs Paradox. In Maximum Entropy and Bayesian Methods; Smith, C.R., Erickson, G.J., Neudorfer, P.O., Eds.; Kluwer Academic Publishers: Dordrecht, 1992.
7. Planck, M. Zur Frage der Quantelung einatomiger Gase. Sitz.ber. Preuss. Akad. Wiss. (Berlin) 1925, 49.
8. Swendsen, R.H. Statistical Mechanics of Classical Systems with Distinguishable Particles. Journal of Statistical Physics 2002, 107, 1143–1166.
9. Peters, H. Statistics of Distinguishable Particles and Resolution of the Gibbs Paradox of the First Kind. Journal of Statistical Physics 2010, 141, 785–828.
10. Tao, Y. Gibbs Paradox and Thermodynamics of Colloids. Physics Letters A 2025, 547, 130531.
11. Frenkel, D. Why colloidal systems can be described by statistical mechanics: Some not very original comments on the Gibbs paradox. Molecular Physics 2014, 112, 2325.
12. Cates, M.E.; Manoharan, V.N. Celebrating Soft Matter's 10th anniversary: Testing the foundations of classical entropy: Colloid experiments. Soft Matter 2015, 11, 6538.
13. Manoharan, V.N. Colloidal matter: Packing, geometry, and entropy. Science 2015, 349, 1253751.
14. Verlinde, E. On the origin of gravity and the laws of Newton. Journal of High Energy Physics 2011, 2011, 29.
15. Dussi, S.; Dijkstra, M. Entropy-driven formation of chiral nematic phases by computer simulations. Nature Communications 2016, 7, 11175.
16. Zhang, R.; et al. Entropy-driven segregation of polymer-grafted nanoparticles under confinement. Proceedings of the National Academy of Sciences 2017, 114, 2462–2467.
17. Alberstein, R.; et al. Engineering the entropy-driven free-energy landscape of a dynamic nanoporous protein assembly. Nature Chemistry 2018, 10, 732–739.
18. Lee, S.; et al. Entropy compartmentalization stabilizes open host-guest colloidal clathrates. Nature Chemistry 2023, 15, 905–912.
19. Zhuang, Y.; Charbonneau, P. Recent Advances in the Theory and Simulation of Model Colloidal Microphase Formers. Journal of Physical Chemistry B 2016, 120, 7775–7782.
20. Araújo, N.A.M.; et al. Nonequilibrium self-organization of colloidal particles on substrates: adsorption, relaxation, and annealing. Journal of Physics: Condensed Matter 2017, 29, 014001.
21. Speck, T.; et al. Focus on Active Colloids and Nanoparticles. New Journal of Physics 2020, 22, 060201.
22. Sesé-Sansa, E.; et al. Phase separation of self-propelled disks with ferromagnetic and nematic alignment. Physical Review E 2021, 104, 054611.
23. Couzin, I. Collective minds. Nature 2007, 445, 715.
24. Yukalov, V.I.; Sornette, D. Self-organization in complex systems as decision making. Advances in Complex Systems 2014, 17, 1450016.
25. Mahmoodi, K.; et al. Self-Organized Temporal Criticality: Bottom-Up Resilience versus Top-Down Vulnerability. Complexity 2018, 2018, 8139058.
26. Perc, M. The social physics collective. Scientific Reports 2019, 9, 16549.
27. Martyushev, L.M.; Seleznev, V.D. Maximum entropy production principle in physics, chemistry and biology. Physics Reports 2006, 426, 1–45.
28. Kleidon, A. The second law of thermodynamics, life and Earth's planetary machinery revisited. Physics of Life Reviews 2024, 51, 382–389.
29. Wissner-Gross, A.D.; Freer, C.E. Causal entropic forces. Physical Review Letters 2013, 110, 168702.
30. Tao, Y. Swarm intelligence in humans: A perspective of emergent evolution. Physica A 2018, 502, 436–446.
31. Kiraly, B.; et al. An atomic Boltzmann machine capable of self-adaption. Nature Nanotechnology 2021, 16, 414–420.
32. Linder, B. Thermodynamics and Introductory Statistical Mechanics; John Wiley & Sons, Inc., 2004.
33. Tao, Y. Self-referential Boltzmann machine. Physica A 2020, 545, 123775.
34. Tao, Y.; et al. Emerging social brain: a collective self-motivated Boltzmann machine. Chaos, Solitons & Fractals 2021, 143, 110543.
35. Tao, Y. Life as a self-referential deep learning system: A quantum-like Boltzmann machine model. Biosystems 2021, 204, 104394.
36. Tao, Y. From Malthusian Stagnation to Modern Economic Growth: A swarm-intelligence perspective. Journal of Physics: Complexity 2024, 5, 025028.
37. Callen, H.B. Thermodynamics and an Introduction to Thermostatistics, 2nd ed.; Wiley: New York, 1985.
38. Bormashenko, E. Magnetic Entropic Forces Emerging in the System of Elementary Magnets Exposed to the Magnetic Field. Entropy 2022, 24, 299.
39. Diba, F.S.; et al. Binary colloidal crystals (BCCs): Interactions, fabrication, and applications. Advances in Colloid and Interface Science 2018, 261, 102–127.
40. Bley, M.; et al. Active binary switching of soft colloids: stability and structural properties. Soft Matter 2021, 17, 7682.
41. Ackley, D.H.; Hinton, G.E.; Sejnowski, T.J. A learning algorithm for Boltzmann machines. Cognitive Science 1985, 9, 147–169.
42. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences 1982, 79, 2554–2558.