1. Introduction
The concept of entropy has long served as a cornerstone for understanding the transition from disorder to order in both physical and biological systems [1,2,3,4]. Historically, two pivotal challenges—Maxwell’s demon and the Gibbs paradox—have profoundly shaped the foundations of thermodynamics by probing the limits of classical entropy interpretations. While Maxwell’s demon exposed the interplay between information and thermodynamics, the Gibbs paradox revealed a critical inconsistency in entropy calculations during the mixing of ideal gases [5,6]. Traditional resolutions attributed the paradox to the quantum indistinguishability of identical particles [7], yet this explanation falters when extended to classical systems [8,9,10], such as colloidal suspensions [11,12], where particles remain microscopically distinguishable. Consider a typical colloidal suspension [12]: the system comprises approximately $10^{12}$ micrometer-sized particles, each composed of roughly $10^{12}$ atoms. Due to differences in shape and size [13], no two colloidal particles are exactly the same, making them classically distinguishable. Yet, surprisingly, the experimentally observed phase behavior matches perfectly with the statistical-mechanics predictions based on the assumption of “identical particles” [11,12]. This apparent contradiction is particularly evident in colloid crystallization experiments. When the volume fraction of a monodisperse hard-sphere colloidal suspension reaches a critical value, the system spontaneously crystallizes into an ordered structure. It has been argued that an entropy expression incorporating the correction factor $1/N!$ can describe this self-organization process [12].
In the present study, we further demonstrate that the inclusion of $1/N!$ gives rise to the emergence of a zero-point energy, which may induce a distinct type of entropic force. An entropic force is not a fundamental interaction, but rather a macroscopic force of statistical significance [14], arising from the maximization of entropy within a system. Since the entropic force points in the direction of maximum entropy, it exhibits self-organizing characteristics when acting on macroscopic systems. In this regard, it has already been well established as a key driver of self-organization in soft matter systems [2,3,4,15,16,17,18]. Unlike hard matter, soft matter exhibits more diverse and prevalent self-organization phenomena [19,20,21,22], extending even to social and biological systems [23,24,25,26]. In this context, entropic forces have been invoked to explain emergent properties of life [27,28] and intelligence [29,30]. Building on these insights, we propose a Boltzmann machine consisting of classically distinguishable colloidal particles. Through this model, we demonstrate that the zero-point energy induced by the factor $1/N!$ may serve as a mechanism driving the self-organizing evolution of colloidal systems in an environment of random fluctuations. In the existing literature [31], an atomic Boltzmann machine has been realized. By contrast, we propose a colloidal version in which the colloidal particles are classically distinguishable, thereby serving as observable neurons.
The remainder of this paper is organized as follows. Section 2 introduces the statistical-mechanical entropy framework, distinguishing between the Gibbs and Boltzmann entropy formulations. Section 3 expresses the Gibbs entropy in terms of macro-thermodynamic variables. Section 4 expresses the Boltzmann entropy in terms of macro-thermodynamic variables. Section 5 demonstrates how $1/N!$ induces an entropic force and examines its manifestations in ideal gases. Section 6 proposes a colloidal version of the Boltzmann machine. Finally, Section 7 concludes the paper.
2. Statistical-Mechanical Entropy
We begin by considering classically distinguishable particles obeying the Boltzmann distribution (see page 140 in [32]):

$$n_l = g_l e^{-\alpha - \beta \varepsilon_l}, \qquad (1)$$

with $N = \sum_l n_l$ and $U = \sum_l n_l \varepsilon_l$, where $n_l$ represents the occupation number of the $l$-th energy level $\varepsilon_l$, $g_l$ denotes the degeneracy of the $l$-th energy level, and $\alpha$ and $\beta$ are thermodynamic parameters.
In classical statistical mechanics, two fundamental entropy formulations emerge: the Gibbs entropy $S_G$ and the Boltzmann entropy $S_B$. Using the Boltzmann distribution (1), these entropies can be expressed as [32]:

$$S_G = k_B \ln \frac{W}{N!}, \qquad (2)$$

and

$$S_B = k_B \ln W, \qquad (3)$$

where the microstate count $W$ is given by:

$$W = N! \prod_l \frac{g_l^{n_l}}{n_l!}. \qquad (4)$$

The key difference between these two expressions lies in the $\ln N!$ term, which serves as the correction factor resolving the Gibbs paradox. Historically, following Planck’s seminal work [7], this term was interpreted as accounting for the indistinguishability of identical particles. This interpretation suggests that the Gibbs entropy (2) describes indistinguishable particles, whereas the Boltzmann entropy (3) applies to distinguishable particles. However, recent research advances in colloid science have revealed that the correction factor $1/N!$ remains crucial even for systems of classically distinguishable particles [6,8,9,10,11,12]. In the present study, we demonstrate that this correction factor gives rise to a distinct entropic force. To elucidate this, we need to express both entropy formulations in terms of four macroscopic thermodynamic variables: particle number, internal energy, temperature, and chemical potential. This can be accomplished using the Boltzmann distribution (1). Under this distribution, the particle number $N$ and the internal energy $U$ are given by:

$$N = \sum_l g_l e^{-\alpha - \beta \varepsilon_l}, \qquad (5)$$

$$U = \sum_l \varepsilon_l\, g_l e^{-\alpha - \beta \varepsilon_l}. \qquad (6)$$
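As a quick numerical illustration (ours, not part of the original derivation), the gap between the two entropy formulations (2) and (3) can be checked directly from the microstate count (4): for any occupation pattern, $S_B - S_G = k_B \ln N!$, and Stirling's approximation $\ln N! \approx N \ln N - N$ becomes accurate for large $N$. A minimal Python sketch, with $k_B = 1$ and an arbitrarily assumed occupation pattern:

```python
import math

def ln_W(n, g):
    """ln W for occupations n_l and degeneracies g_l, from W = N! * prod_l g_l^n_l / n_l!."""
    N = sum(n)
    return (math.lgamma(N + 1)
            + sum(nl * math.log(gl) - math.lgamma(nl + 1) for nl, gl in zip(n, g)))

# Assumed toy occupation pattern over three levels (k_B = 1).
n = [40, 35, 25]
g = [2, 3, 5]
N = sum(n)

S_B = ln_W(n, g)                       # Boltzmann entropy (3)
S_G = ln_W(n, g) - math.lgamma(N + 1)  # Gibbs entropy (2): subtract ln N!

# The difference is exactly ln N!, the Gibbs-paradox correction term.
print(S_B - S_G, math.lgamma(N + 1))

# Stirling's approximation ln N! ~ N ln N - N, used to reach equation (8).
print(math.lgamma(N + 1), N * math.log(N) - N)
```

Here `math.lgamma(N + 1)` evaluates $\ln N!$ without overflow, which is why it is used instead of `math.factorial`.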
3. Gibbs Entropy Picture
We first express the Gibbs entropy formula (2) in terms of macro-thermodynamic variables. Without loss of generality, we assume that the energy levels $\varepsilon_l$ depend on the volume $V$; that is, $\varepsilon_l = \varepsilon_l(V)$ for $l = 1, 2, \ldots$. Based on this assumption, we combine equations (5) and (6) to obtain [10,30,33,34,35,36]:

$$dU = \frac{1}{\beta}\, d\left(N + \alpha N + \beta U\right) - \frac{\alpha}{\beta}\, dN - p\, dV, \qquad (7)$$

where $p = -\sum_l n_l\, \partial \varepsilon_l / \partial V$ denotes the pressure.
A detailed derivation of equation (7) is provided in Appendix A of [10].
On the other hand, when $n_l \gg 1$, applying Stirling’s approximation $\ln n! \approx n \ln n - n$ to equation (4) yields [10]:

$$\ln W = \ln N! + N + \alpha N + \beta U. \qquad (8)$$

The detailed derivation of equation (8) is provided in Appendix B of [10].
Equation (8) allows us to rewrite the Gibbs entropy formula (2) equivalently as:

$$S_G = k_B \left(N + \alpha N + \beta U\right). \qquad (9)$$

Substituting the Gibbs entropy formula (9) into equation (7) gives:

$$dU = \frac{1}{k_B \beta}\, dS_G - \frac{\alpha}{\beta}\, dN - p\, dV. \qquad (10)$$
The fundamental thermodynamic equation is well known [37]:

$$dU = T\, dS + \mu\, dN - p\, dV, \qquad (11)$$

where $\mu$ denotes the chemical potential, $T$ denotes the temperature, $p$ denotes the pressure, and $S$ denotes the thermodynamic entropy.
By comparing equations (10) and (11), we obtain:

$$\beta = \frac{1}{k_B T}, \qquad (12)$$

$$\alpha = -\frac{\mu}{k_B T}. \qquad (13)$$

Substituting equations (12) and (13) into the Gibbs entropy formula (9) yields:

$$S_G = \frac{U - \mu N}{T} + N k_B, \qquad (14)$$

which expresses the Gibbs entropy in terms of four macro-thermodynamic variables: $N$, $U$, $T$, and $\mu$.
The mathematical rigor of deriving equation (14) is justified in [10]. In this paper, we refer to equation (14) as the Gibbs entropy picture. Under this picture, the internal energy $U$ is given by:

$$U = T S_G + \mu N - N k_B T. \qquad (15)$$

Furthermore, using equations (12) and (13), the Boltzmann distribution (1) under the Gibbs entropy picture takes the well-known form:

$$n_l = g_l\, e^{(\mu - \varepsilon_l)/k_B T}, \qquad (16)$$

where $k_B$ denotes the Boltzmann constant.
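As a consistency sketch (our illustration, not part of the original derivation), one can pick arbitrary energy levels, map $(\alpha, \beta)$ to $(\mu, T)$ through equations (12) and (13), and verify numerically that the statistical expression (9) and the macro-thermodynamic expression (14) give the same Gibbs entropy. All levels and parameters below are assumptions; $k_B = 1$:

```python
import math

k_B = 1.0
T, mu = 2.0, -0.5                      # assumed temperature and chemical potential
beta, alpha = 1/(k_B*T), -mu/(k_B*T)   # equations (12) and (13)

eps = [0.0, 1.0, 2.5, 4.0]             # assumed energy levels
g   = [1, 2, 2, 1]                     # assumed degeneracies

# Occupations, particle number, and internal energy: equations (1), (5), (6).
n = [gl * math.exp(-alpha - beta*el) for gl, el in zip(g, eps)]
N = sum(n)
U = sum(el * nl for el, nl in zip(eps, n))

S_stat  = k_B * (N + alpha*N + beta*U)   # Gibbs entropy via equation (9)
S_macro = (U - mu*N)/T + N*k_B           # Gibbs entropy via equation (14)
print(S_stat, S_macro)                   # the two expressions agree
```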
5. Entropic Force Arising from the Correction Factor $1/N!$
By comparing equations (15) and (24), we observe that the correction factor introduces a contribution of $-N k_B T$ to the internal energy under the Gibbs entropy picture. In this section, we demonstrate that this term fundamentally arises from an entropic force.
As a macroscopic effective force, the entropic force is characterized by two features [14]:
(A). The entropic force points in the direction of increasing entropy.
(B). The entropic force is proportional to the temperature $T$.¹
To elucidate how the correction factor $1/N!$ induces such an entropic force, we analyze the system’s free energy:

$$F = U - TS. \qquad (26)$$
In thermodynamic equilibrium, a system minimizes its free energy. We evaluate the free energy under both the Boltzmann and Gibbs entropy formulations.
Under the Boltzmann entropy picture, the free energy is expressed as:

$$F_B = U - T S_B = \mu N, \qquad (27)$$

where $U$ is defined by equation (24).
Under the Gibbs entropy picture, the free energy becomes:

$$F_G = U - T S_G = \mu N - N k_B T, \qquad (28)$$

with $U$ given by equation (15).
From equations (15) and (24), we derive the relationship:

$$S_G = S_B + N k_B, \qquad (29)$$

allowing equation (28) to be rewritten as:

$$F_G = U - T \left(S_B + N k_B\right) = U - T S_{\mathrm{eff}}, \qquad (30)$$

where $S_{\mathrm{eff}} = S_B + N k_B$ represents an effective entropy.
Since the $N k_B$ term in equation (29) arises from the $1/N!$ correction, a comparison between equations (27) and (30) reveals that including $1/N!$ effectively increases the entropy by $N k_B$. This increase in entropy, which we refer to as an “effective entropy enhancement,” reduces the free energy relative to $F_B$, with the magnitude of the reduction, $N k_B T$, proportional to the temperature $T$. These observations align with the defining features of entropic forces, particularly their temperature dependence and role in increasing entropy, strongly suggesting that the $N k_B T$ term represents the work done by an entropic force. By lowering $F_G$, this force drives the system toward a more thermodynamically stable state with minimized free energy.
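To make the temperature scaling concrete, here is a small numerical check (our illustration, with $k_B = 1$ and arbitrarily assumed $U$, $N$, and $S_B$): the gap $F_B - F_G$ equals $N k_B T$ and grows linearly with $T$, as an entropic-force contribution should.

```python
k_B = 1.0
N, S_B = 100, 250.0            # assumed particle number and Boltzmann entropy
U = 500.0                      # assumed internal energy

for T in (1.0, 2.0, 4.0):
    F_B = U - T * S_B                 # Boltzmann entropy picture
    F_G = U - T * (S_B + N * k_B)     # Gibbs picture: effective entropy S_B + N k_B
    # The reduction F_B - F_G is N k_B T, proportional to T
    # (the hallmark of an entropic force).
    print(T, F_B - F_G)
```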
Here, we demonstrate that for an ideal gas, the entropic force induced by the correction factor $1/N!$ manifests as repulsive interactions among particles. An ideal gas implies that the system is extensive, meaning that we have [37]:

$$U(\lambda S, \lambda V, \lambda N) = \lambda\, U(S, V, N). \qquad (32)$$

By Euler’s theorem on homogeneous functions, equation (32) transforms the fundamental thermodynamic equation (11) into:

$$U = T S - p V + \mu N, \qquad (33)$$

where $T = \partial U / \partial S$, $p = -\partial U / \partial V$, and $\mu = \partial U / \partial N$.
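The Euler's-theorem step can be illustrated numerically (our sketch): for any first-order homogeneous function $U(S, V, N)$, the identity $U = S\,\partial U/\partial S + V\,\partial U/\partial V + N\,\partial U/\partial N$ holds, which with the standard identifications gives (33). The toy function below is an assumption for illustration, not the paper's ideal-gas expression:

```python
# Toy first-order homogeneous function U(S, V, N), assumed for illustration:
# U(x*S, x*V, x*N) = x * U(S, V, N).
def U(S, V, N):
    return S**2 / N + N**2 / V + 0.5 * S

def partial(f, args, i, h=1e-6):
    """Central finite-difference partial derivative of f at args w.r.t. argument i."""
    a_plus = list(args); a_plus[i] += h
    a_minus = list(args); a_minus[i] -= h
    return (f(*a_plus) - f(*a_minus)) / (2 * h)

S, V, N = 3.0, 2.0, 5.0
lhs = U(S, V, N)
# Euler's theorem for first-order homogeneity: U = S U_S + V U_V + N U_N.
rhs = (S * partial(U, (S, V, N), 0)
       + V * partial(U, (S, V, N), 1)
       + N * partial(U, (S, V, N), 2))
print(lhs, rhs)   # the two sides agree to numerical precision
```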
Substituting equation (33) into equation (15) yields:

$$p V = N k_B T, \qquad (34)$$

which is the ideal gas law.
Equation (34) reveals that the $N k_B T$ term generates a non-zero pressure among ideal gas particles at $T \neq 0$. This pressure, which emerges from the $1/N!$ correction, represents a purely statistical entropic force that is fundamentally different from conventional interaction-based pressures. The critical role of this correction becomes evident when considering the Boltzmann entropy picture: omitting $1/N!$ reduces the internal energy to that given by equation (24), which, when combined with equation (33), yields:

$$p V = 0. \qquad (35)$$

This null result confirms that the pressure in equation (34) is indeed an entropic force arising from the $N k_B T$ term.
At absolute zero temperature ($T = 0$), equations (34) and (35) coincide, implying that the Gibbs entropy (2) and the Boltzmann entropy (3) have the same effect in this context. Absolute zero temperature represents the complete absence of thermal motion, and in the classical sense it can be understood as particles being in a state of absolute rest. Therefore, the Boltzmann entropy (3) essentially describes classical particles, albeit “localized” ones—restricted to specific positions. Hence, we propose using the Boltzmann distribution under the Boltzmann entropy picture, i.e., equation (25), to describe localized, distinguishable particles.
The above analysis assumes ideal gases. We now extend our discussion to colloidal systems.
6. Colloidal Boltzmann Machine
In Section 5, we established that the $N k_B T$ term originates from an entropic force associated with the correction factor $1/N!$. For an ideal gas, the entropic force manifests as repulsive interactions among particles, implying the incompressibility of the gas. This observation further suggests the presence of zero-point energy in the system. To verify that the correction factor $1/N!$ indeed leads to zero-point energy, we reformulate equation (15) under the Gibbs entropy picture as a partial differential equation:

$$U = \left(S - N k_B\right) \frac{\partial U}{\partial S} + N\, \frac{\partial U}{\partial N}. \qquad (36)$$

Recognizing that $T = \partial U / \partial S$ and $\mu = \partial U / \partial N$, equation (36) becomes equivalent to equation (15). Here, the term $-N k_B\, \partial U / \partial S$ explicitly encodes the contribution of $1/N!$.
The general solution to the partial differential equation (36) is given by:

$$U = N\, f\!\left(\frac{S}{N} + k_B \ln N\right), \qquad (37)$$

where $f$ is a smooth function. The mathematical details for deriving this solution can be found in references [33,34,35,36]. We now demonstrate that equation (37) ensures the existence of zero-point energy.
Proof. Zero-point energy requires $U|_{S=0} \neq 0$. Assume, to the contrary, that $U = 0$; then equation (37) requires $f$ to vanish, and if $x_0$ denotes a zero of $f$, the condition $U = 0$ corresponds to $S = N\left(x_0 - k_B \ln N\right)$. For $N > e^{x_0/k_B}$, this expression becomes negative, violating the non-negativity of the entropy ($S \geq 0$). This contradiction confirms the necessity of zero-point energy. ■
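The general solution can be checked numerically (our sketch): for an assumed smooth $f$, finite-difference derivatives of $U = N f(S/N + k_B \ln N)$ satisfy $U = (S - N k_B)\,\partial U/\partial S + N\,\partial U/\partial N$ to discretization accuracy. The particular $f$ below is an arbitrary assumption:

```python
import math

k_B = 1.0

def f(x):
    return math.exp(0.3 * x) + x**2     # assumed smooth function f

def U(S, N):
    """General solution (37): U = N f(S/N + k_B ln N)."""
    return N * f(S / N + k_B * math.log(N))

def dU_dS(S, N, h=1e-6):
    return (U(S + h, N) - U(S - h, N)) / (2 * h)

def dU_dN(S, N, h=1e-6):
    return (U(S, N + h) - U(S, N - h)) / (2 * h)

S, N = 7.0, 20.0
lhs = U(S, N)
# Right-hand side of the PDE (36): (S - N k_B) dU/dS + N dU/dN.
rhs = (S - N * k_B) * dU_dS(S, N) + N * dU_dN(S, N)
print(lhs, rhs)   # the two sides agree to numerical precision
```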
Without the correction factor $1/N!$, however, the zero-point energy would vanish.² The proof above merely demonstrates the existence of the zero-point energy. Later, we will show that for systems at constant temperature, the zero-point energy can be derived explicitly.
To explore the physical significance of zero-point energy, we consider a Boltzmann machine composed of classically distinguishable colloidal particles, where each particle acts as a neuron. In this framework, the neural network’s internal energy $U$ is expressed as a function of the colloidal state variables. Let $\mathbf{s} = (s_1, s_2, \ldots, s_N)$ represent the state vector for $N$ colloidal particles, where $s_i = 1$ denotes an “active” colloid and $s_i = 0$ indicates an “inactive” state.³ Within this representation, the internal energy $U(\mathbf{s})$ can be expanded as a Taylor series:

$$U(\mathbf{s}) = -\sum_{i<j} w_{ij}\, s_i s_j - \sum_i b_i\, s_i, \qquad (38)$$

where $w_{ij}$ represents the weights and $b_i$ the biases. Higher-order terms have been omitted in equation (38).
Prior work [31] has employed atomic spin systems to emulate Boltzmann machines. Here, we propose a colloidal version in which colloidal particles are classically distinguishable, thereby serving as observable neurons. For instance, colloidal particles with sizes on the micrometer scale can be individually tracked. Thus, the activity of colloidal neurons can be directly monitored in experiments.
Since colloidal systems with negligible interactions obey the Boltzmann distribution (16), the joint probability distribution of the $N$ colloidal particles in the state vector $\mathbf{s}$ can be expressed as [33,36]:

$$P(\mathbf{s}) = e^{-S(\mathbf{s})/k_B}, \qquad (39)$$

where the entropy $S$ is determined by the general solution (37).
The derivation of equation (39) is detailed in references [33,34,35], with its mathematical rigor established in [36]. To elucidate its physical significance, this equation can be interpreted as the information entropy formula $S = -k_B \ln P$, which quantifies the uncertainty associated with the microscopic state $\mathbf{s}$ of the system.
We consider a colloidal system in thermodynamic equilibrium, which necessitates a constant temperature $T$. Under this condition, the general solution (37) reduces to the following form⁴ [33,36]:

$$U = T S + k_B T \left(N \ln N - N\right). \qquad (40)$$

Applying Stirling’s approximation $\ln N! \approx N \ln N - N$, equation (40) becomes:

$$U = T S + k_B T \ln N!. \qquad (41)$$

Since the entropy $S$ is non-negative, equation (41) directly leads to the inequality:

$$U \geq k_B T \ln N!, \qquad (42)$$

where the equality condition defines the zero-point energy of the constant-temperature system:

$$U_0 = k_B T \ln N!. \qquad (43)$$

Equation (43) highlights that the zero-point energy arises from the collective behavior of colloidal particles. For instance, with a single particle ($N = 1$), $U_0 = k_B T \ln 1! = 0$, confirming its dependence on system size.
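As a numerical illustration (ours), the zero-point energy (43) can be evaluated with `math.lgamma`, since $\ln N! = \mathrm{lgamma}(N+1)$. With $k_B = 1$ and an assumed temperature, it vanishes for a single particle and grows rapidly with $N$:

```python
import math

k_B, T = 1.0, 0.5   # assumed units and temperature

def U0(N):
    """Zero-point energy of equation (43): U0 = k_B * T * ln N!."""
    return k_B * T * math.lgamma(N + 1)

for N in (1, 2, 10, 100):
    print(N, U0(N))
# N = 1 gives U0 = 0: the zero-point energy is a collective, many-particle effect.
```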
Substituting equation (41) into (39) yields a Boltzmann machine model⁵ [33]:

$$P(\mathbf{s}) = \frac{1}{Z}\, e^{-U(\mathbf{s})/k_B T}, \qquad (44)$$

where the partition function $Z$ is given by:

$$Z = \sum_{\mathbf{s}} e^{-U(\mathbf{s})/k_B T}. \qquad (45)$$

Here $\sum_{\mathbf{s}}$ denotes summation over all possible configurations of $\mathbf{s}$.
From equation (44), the conditional probabilities for the particle states follow [41]:

$$P(s_i = 1 \mid \mathbf{s}_{-i}) = \frac{1}{1 + e^{-\left(\sum_j w_{ij} s_j + b_i\right)/k_B T}}, \qquad (46)$$

where $\mathbf{s}_{-i}$ represents the states of all particles except the $i$-th one.
In the low-temperature limit ($T \to 0$), equation (46) reduces to a step function: $P(s_i = 1 \mid \mathbf{s}_{-i}) = 1$ when $\sum_j w_{ij} s_j + b_i > 0$, and $P(s_i = 1 \mid \mathbf{s}_{-i}) = 0$ otherwise. In this limit, the Boltzmann machine (44) converges to a Hopfield network, where the state of the $i$-th particle is governed by the McCulloch-Pitts rule:

$$s_i = \Theta\!\left(\sum_j w_{ij} s_j + b_i\right), \qquad (47)$$

where $\Theta(\cdot)$ denotes the Heaviside step function.
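The low-temperature limit can be seen numerically (our sketch, $k_B = 1$): as $T$ decreases, the conditional activation probability of the sigmoid form (46) approaches 1 for a positive local field and 0 for a negative one, recovering the step-function behavior of the McCulloch-Pitts rule (47):

```python
import math

k_B = 1.0

def p_active(h, T):
    """Conditional probability (46), with local field h = sum_j w_ij s_j + b_i."""
    return 1.0 / (1.0 + math.exp(-h / (k_B * T)))

for T in (1.0, 0.1, 0.01):
    # Positive field -> activation probability tends to 1; negative field -> 0.
    print(T, p_active(0.5, T), p_active(-0.5, T))
```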
Despite $T \to 0$⁶ (but $T \neq 0$), for $N \gg 1$ the zero-point energy $U_0$, given by equation (43), remains finite. Like pollen grains, colloidal particles are subject to random fluctuations [21,22], such as Brownian motion. Thus, Brownian motion introduces stochastic state changes, but the Hopfield network ensures that each update monotonically decreases the system’s energy [42]. This process halts when $U$ approaches $U_0$, as mandated by equation (42). Consequently, the system reaches an equilibrium vector $\mathbf{s}^*$ that prevents $U < U_0$. Crucially, $U_0 > 0$ requires at least one particle to remain active ($s_i = 1$); otherwise, $U = 0 < U_0$, violating equation (42). Thus, the scenario where $s_i = 0$ for all $i$ is forbidden. This robustness feature is analogous to the incompressibility observed in ideal gases, which is due to entropic forces that require $p V = N k_B T > 0$. For example, when $N = 1$, the equilibrium vector $\mathbf{s}^*$ would become trivial, with $s_i = 0$ for all $i$, since $U_0 = 0$ according to equation (43).
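A minimal simulation (our sketch, not from the original work) illustrates this convergence: asynchronous McCulloch-Pitts updates of the energy (38) never increase $U(\mathbf{s})$, and when at least one bias is positive the all-zero state cannot be a fixed point, so the equilibrium vector retains an active neuron. All weights and biases below are illustrative assumptions:

```python
import random

random.seed(0)
N = 8
# Illustrative symmetric weights (w_ij = w_ji, w_ii = 0) and biases (assumptions).
w = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        w[i][j] = w[j][i] = random.uniform(-1.0, 1.0)
b = [random.uniform(-0.5, 0.5) for _ in range(N)]
b[0] = abs(b[0]) + 0.1   # one positive bias: the all-zero state cannot be stable

def energy(s):
    """Internal energy (38): U = -sum_{i<j} w_ij s_i s_j - sum_i b_i s_i."""
    pair = sum(w[i][j] * s[i] * s[j] for i in range(N) for j in range(i + 1, N))
    return -pair - sum(bi * si for bi, si in zip(b, s))

def update(s, i):
    """Asynchronous McCulloch-Pitts update (47); never increases the energy."""
    h = sum(w[i][j] * s[j] for j in range(N)) + b[i]
    s[i] = 1 if h > 0 else 0

s = [random.randint(0, 1) for _ in range(N)]   # disordered initial state
energies = [energy(s)]
changed = True
while changed:                                  # sweep until a fixed point is reached
    changed = False
    for i in range(N):
        old = s[i]
        update(s, i)
        energies.append(energy(s))
        if s[i] != old:
            changed = True

# Energy is non-increasing along the trajectory, and the equilibrium
# state keeps at least one active neuron.
print(energies[0], energies[-1], sum(s))
```

Convergence is guaranteed by the standard Hopfield argument for symmetric weights with zero diagonal: each state flip strictly lowers the energy, and the state space is finite.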
The convergence of the state vector $\mathbf{s}^*$ illustrates a self-organization process⁷: under random fluctuations (such as Brownian motion), the system evolves from disorder (high-energy states) to order (a low-energy equilibrium), with at least one active colloid. This mechanism can be applied to various fields of information technology, including information storage, intelligent coding, and cryptography. For example, owing to the robustness of the equilibrium vector $\mathbf{s}^*$, stored information remains unaltered and undestroyed under random shocks. By contrast, if the zero-point energy $U_0$ vanished, the equilibrium vector $\mathbf{s}^*$ would become trivial, with $s_i = 0$ for all $i$. In such a case, these applications would become infeasible.