Submitted: 28 May 2025
Posted: 29 May 2025
Abstract
Keywords:
MSC: 37C99, 60A99, 80-01, 82B03
1. Introduction
- Plato: we can know through reason what exists independently of humans. This intelligible world, which exists without any observer, is the objective reality with which science deals.
- Aristotle: what we can know is only the part of the world with which we interact, that is, the part that is observable. This sensible world common to all is the objective reality with which science deals.
2. Thermodynamics and the First Two Principles
2.1. First Principle: Energy, Changes and Equilibrium

2.2. Second Principle: Asymmetry of Changes, Irreversibility and Clausius Entropy
2.3. The Role of the Observer
2.3.1. Free Expansion
1. If the information we have is limited to what we have just said, then states A and A’ are identical. No further operation or additional work is required to return to the initial state, and a thermodynamic cycle is closed (Figure 3, top).
2. If we have the additional information that initially, in state A, the left side of the container was filled with N particles of gas while the right side was empty (this assumes that we have at our disposal a detector capable of quantifying the amount of gas), then A and A’ differ. Additional mechanical work is necessary to compress the gas by a factor of 2 so that it returns into its initial compartment and the system is restored to state A. At constant temperature, this work dissipates at least $NkT\ln 2$ of heat into the surroundings (Figure 3, middle).
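The work balance in the second case can be sketched numerically. Below is a minimal Python sketch (the function name is ours) of the standard ideal-gas result that an isothermal compression by a volume ratio r costs at least $N k_B T \ln r$ of work, all of which is dissipated as heat:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def isothermal_compression_work(n_particles: int, temperature: float,
                                volume_ratio: float = 2.0) -> float:
    """Minimum work (J) to compress an ideal gas isothermally by
    `volume_ratio`; at constant temperature this work is entirely
    dissipated as heat into the surroundings."""
    return n_particles * K_B * temperature * math.log(volume_ratio)

# Recompressing one mole of gas at 300 K by a factor of 2
# dissipates at least about 1.7 kJ of heat.
w = isothermal_compression_work(int(6.02214076e23), 300.0)
```

For a single particle (a Szilard engine, or one bit), the same formula gives $k_B T \ln 2$, the quantity that recurs throughout the rest of the article.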
2.3.2. Gibbs Paradox of Mixing
2.3.3. Incomplete Information
2.3.4. The Observer’s Choice
2.4. Information Is Energy: The Proof by Demons
3. Statistical Mechanics
3.1. The Pragmatic Approach of the Fundamental Postulate
3.2. The Ergodic Hypothesis
3.3. The Link with Thermodynamics
1. Is there a way, free from any a priori connection to thermodynamics, to derive a formula for entropy that allows us to define it?
2. Is there a way, free from any a priori connection to thermodynamics, to predict the time evolution of a dynamical system of particles?
3. Is there a way to define equilibrium not only as stationary (as in thermodynamics) but also as the stable state?
3.3.1. Gibbs and Boltzmann Entropies
3.3.2. Towards Equilibrium and H-Theorem
3.3.3. The Paradox of the Stability of Equilibrium
3.4. Gibbs Paradox of Joining Two Volumes of Gas
- Quantum mechanics solution: according to this point of view, it is not possible to understand classically the difference between Equations (10) and (14); the solution is quantum ([47] p. 141). The numbers of possibilities for the joined and disjoined states are both overestimated because identical particles cannot be distinguished. In the disjoined case, the permutations of a combination of particle-positions must be counted as 1. Equation (12) must be replaced accordingly, and similarly Equation (13), so that these corrections give a corrected count instead of Equation (14). This argument is known as “correct” Boltzmann counting, even though it is no more correct than the following one.
- Classical mechanics solution: the number of possibilities in the disjoined state is actually underestimated in Equation (12) [46,49,50], because when partitioning, a particle can end up in either of the two compartments, leading to as many possibilities, which are not counted in Equation (12). The correct count should be increased by a factor $N!/((N/2)!)^2$ accounting for the number of ways of partitioning N particles into two sets of N/2 each:
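The two resolutions can be checked against each other numerically. The sketch below (function names are ours, and the correspondence with Equations (12)–(14) is our reading of the text) computes the entropy difference between the joined and disjoined states in units of $k_B$, the common $N\ln V$ term cancelling out; both corrections give the same sub-extensive result:

```python
import math

def delta_lnW_quantum(n: int) -> float:
    """Joined-minus-disjoined ln(W) with "correct" Boltzmann counting:
    the joined count is divided by N!, the disjoined one by (N/2)!
    for each of the two compartments (N even)."""
    return n * math.log(2.0) - math.lgamma(n + 1) + 2.0 * math.lgamma(n // 2 + 1)

def delta_lnW_classical(n: int) -> float:
    """Same difference with the classical recount: the disjoined count
    is multiplied by the binomial factor N! / ((N/2)!)^2 for the ways
    of splitting N particles between the two compartments."""
    ln_binom = math.lgamma(n + 1) - 2.0 * math.lgamma(n // 2 + 1)
    return n * math.log(2.0) - ln_binom

# Both routes agree, here for N = 1000; the residual difference
# 0.5*ln(pi*N/2) grows sub-extensively with N.
assert abs(delta_lnW_quantum(1000) - delta_lnW_classical(1000)) < 1e-9
```

This illustrates the article’s point: the quantum division by factorials and the classical binomial recount are two bookkeepings of the very same combinatorial correction.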
4. Shannon’s Information Theory
4.1. Quantity of Information
4.2. Maximum Shannon Entropy Theorem (MaxEnt)
- Maximum Shannon entropy theorem (MaxEnt):
- The best distribution that maximizes the uncertainty on the variable of interest, while accounting for our knowledge, is the one that maximizes Shannon entropy.
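The theorem has a concrete computational counterpart. Below is a minimal sketch (names and the bisection strategy are ours) of the Lagrange-multiplier solution for a distribution constrained to a prescribed mean: the MaxEnt answer is $p_i \propto e^{-\lambda x_i}$, with $\lambda$ fixed by the constraint:

```python
import math

def maxent_distribution(values, mean_target, tol=1e-12):
    """MaxEnt distribution over `values` subject to a prescribed mean:
    the Lagrange-multiplier solution p_i ~ exp(-lam * x_i), with lam
    found by bisection (the constrained mean decreases with lam)."""
    def probs(lam):
        a = [-lam * x for x in values]
        m = max(a)                        # stabilize the exponentials
        w = [math.exp(ai - m) for ai in a]
        z = sum(w)
        return [wi / z for wi in w]
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        mean = sum(x * pi for x, pi in zip(values, probs(mid)))
        if mean > mean_target:
            lo = mid
        else:
            hi = mid
    return probs(0.5 * (lo + hi))

# A die constrained only to its natural mean 3.5: MaxEnt returns the
# uniform distribution, as the principle of indifference demands.
p = maxent_distribution([1, 2, 3, 4, 5, 6], 3.5)
```

With the same six faces constrained to a mean of, say, 2.0, the same routine returns the exponentially tilted (Boltzmann-like) distribution, which is the bridge to statistical mechanics discussed in Section 4.5.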
4.3. Maximum Entropy Principle
- Maximum Shannon entropy principle:
- The equilibrium of a system is the only state that maximizes Shannon entropy of variables whose distributions are similarity-invariant in form.
4.4. Representationalism
4.5. Statistical Mechanics Becomes Probabilistic Mechanics
4.6. Gibbs Paradoxes
- Clausius entropy (Equation (3)) concerns a system about which we have incomplete information.
4.7. Brillouin Negentropy Principle of Information and Demons
5. Landauer Erasure Principle
5.1. Landauer’s Derivation of His Principle
1. The concrete implementation of a bit of memory is necessarily achieved via a physical system which must exhibit two stable thermodynamic equilibrium states, denoted 0 and 1, e.g., a particle in a bistable potential or a particle in a box divided in two.
2. The procedure for erasing a bit of memory (the erasure), say setting the bit-value to state 0, must be independent of its initial value (0 or 1), because it must be able to work for an unknown value.
3. The two different paths that take the bit from state 0 or from state 1 to state 0 necessarily merge into one at a point (called state S) where the value of the bit is undetermined. Before this point, both paths are necessarily uncontrolled and thermodynamically irreversible, because in the reverse direction state S is a bifurcation, which is assumed not to be tractable by a controlled deterministic procedure. The consequence is that neither work nor heat can be obtained from the system during this step between state 0 or 1 and state S.
4.
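The energy bookkeeping implied by these steps, for the particle-in-a-divided-box implementation, can be sketched as follows (a minimal sketch; the function name is ours): merging at state S costs nothing (removing the partition is a free expansion), and the subsequent isothermal compression back to one compartment dissipates $k_B T \ln 2$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_erasure_heat(temperature: float) -> float:
    """Heat (J) dissipated by erasing one bit stored as a particle in a
    box divided in two: removing the partition (0 or 1 -> S) is a free
    expansion costing nothing, then the isothermal compression S -> 0
    halves the volume and dissipates k_B * T * ln(2) into the thermostat."""
    return K_B * temperature * math.log(2.0)

q = landauer_erasure_heat(300.0)  # roughly 2.9e-21 J per bit at room temperature
```

This is the quantitative content of the principle whose status the rest of the article examines.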
5.2. Landauer Erasure vs. Brillouin Negentropy
6. Invalidation of Landauer Erasure Principle
6.1. Piece of Information Versus Data-Bit
6.2. (Ir)Recoverable Loss of Information Versus Thermodynamic (Ir)Reversibility
6.3. Logical Irreversibility: Equivocal Operation or Irrecoverable Loss of Information?
1. At first: “We shall call a device logically irreversible if the output of a device does not uniquely define the inputs” [72]; “[erase] is an example of a logical truth function which we shall call irreversible” [72]. This means that “logically irreversible” functions applied to a particular data-bit are those that are equivocal, regardless of whether or not they cause an irrecoverable loss of information.
2. But in the same paper: “At first sight it may seem that logical reversibility is simply obtained by saving the input in some corner of the machine. We shall, however, label a machine as being logically reversible, if and only if all its individual steps are logically reversible.” [72] And some years later: “Computation that preserves information at every step along the way (and not just by trivially storing the initial data) is called reversible computation” [71]; “It is easy to render any computer reversible in a rather trivial sense, by having it save all the information it would otherwise throw away” [74]. From this we can legitimately conclude that logical irreversibility is linked to an irrecoverable loss of information, which can be avoided by making copies.
6.4. Known Versus Unknown Value of a Data-Bit
6.5. On the Thermodynamic Reversibility of Landauer Erasure
1. From state 0 or 1 to state S: remove the partition.
2. From state S to state 0: perform an isothermal (thermodynamically reversible) compression with a piston.
3. From state 0 to state S: use the piston that is already in position to perform an isothermal expansion (i.e., reverse step 2 of Landauer erasure).
4. From state S to an unknown state: put back the partition, so that the final state is random and unknown.
4′. Alternatively, from state S to the initial state (0 or 1): use a piston to push the particle into the proper compartment.
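Steps 3 and 4 can be sketched as a small simulation (a minimal sketch; the function name and structure are ours): the isothermal expansion recovers $k_B T \ln 2$ of work, while reinserting the partition costs nothing but leaves the bit in a random, unknown state:

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def reverse_then_partition(temperature: float, rng: random.Random):
    """Steps 3 and 4 above: the isothermal expansion 0 -> S recovers
    k_B * T * ln(2) of work; putting the partition back costs no work
    but yields a random, unknown bit value."""
    work_recovered = K_B * temperature * math.log(2.0)  # step 3
    final_bit = rng.randint(0, 1)                       # step 4
    return work_recovered, final_bit

w, bit = reverse_then_partition(300.0, random.Random(42))
```

The recovered work exactly cancels the heat dissipated in step 2, which is why the thermodynamic reversibility of the compression step is not where the problem lies.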
6.6. On the Thermodynamic Reversibility of Alternative Erasures
6.6.1. States Incompletely Defined for Thermodynamics
6.6.2. Managing Two Possibilities with a Single Externally Imposed Procedure
6.7. Supposed Information-Mass Equivalence
1. The idea that data-bits necessarily store information. In reality, information is necessarily stored in the form of data-bits, but data-bits (even those set to a particular value according to a given acquired piece of information) do not necessarily store information. Copies do not. Information is not materialized by data-bits. This point has been developed extensively in Section 6.1.
2. The idea that the Brillouin negentropy principle and the Landauer erasure principle are the same (see Section 5.2), or that the two can be amalgamated: “The mass of a bit of information and the Brillouin’s principle” [80] in 2014 becomes, six years later: “We shall first analyze, in the context of general relativity, the consequences derived from the fact, implied by Landauer principle, that information has mass” [76]. In reality, the Brillouin negentropy principle tells us that a minimum amount of work is needed to reduce the uncertainty we have about the system, that is, to acquire one bit of information; whereas the Landauer erasure principle claims that a data-bit must receive a minimum of energy to be erased. It follows that the two are only compatible in the context of acquiring information about a dynamical system, an acquisition carried out cyclically and involving the Landauer erasure of the data-bits that were used to store outdated information, i.e., those of the previous cycles. Outside this context, the two are incompatible. We currently have no indication that the dynamic behavior of the Universe is cyclic, but if it were, one thing is certain: all the information we have was acquired during the same cycle.
3. The idea that information cannot be outdated (it would then no longer be information), so that a data-bit stores information forever; this is closely related to the idea that a piece of information is a self-sufficient entity and that there is no need to specify information about what: “To test the hypothesis [the mass-energy-information equivalence principle] we propose here an experiment, predicting that the mass of a data storage device would increase by a small amount when is full of digital information relative to its mass in erased state. For 1Tb device the estimated mass change is Kg.” (Vopson [81]). In reality, information about old cycles of a Szilard engine is of no use: it is outdated. It is for this reason that the demon can erase the data-bit used for its storage. Let us imagine a Szilard demon who would store all the information about each cycle until its hard disk is full. According to the Brillouin negentropy principle, the cost of acquiring information is paid cycle-by-cycle, so that the net energy balance between work produced and the cost of information is always zero at the end of each cycle. According to Landauer’s erasure principle, the information cost will never be paid until the hard disk is erased. But here there is a real inconsistency: does the filled hard drive contain more energy and have a greater mass (according to a misunderstanding of the Brillouin negentropy principle), or does it need energy to be erased, so that in the end, once erased, the mass is greater (according to Landauer erasure)? In fact, neither.
4. The idea that the work required to acquire information is stored as energy in the data-bit, just as mechanical work exerted on a body can be stored within it as potential energy, but actually in the form of rest mass [91,92]. In reality, at constant temperature, when a system receives work that decreases its entropy (that reduces the uncertainty we have, or the information we lack, about it), this work is not stored as internal energy: it is dissipated as heat in the surroundings. A volume of gas that can be compressed or expanded isothermally with a piston behaves in exactly the same way as if it were a spring. But the latter stores potential energy (and thus increases its rest mass), while the former does not. Actually, for the spring, the potential energy at the macroscopic scale originates from the interaction potential between particles. In a perfect gas, there is no interaction potential, except that of colliding hard spheres; the force exerted on the piston is an emergent property.
Conclusion
Statements
References
- Descartes, R. Meditations on First Philosophy; Oxford World’s Classics Ser., Oxford University Press USA - OSO: Oxford, 2008.
- Carter, R. Descartes’ medical philosophy: the organic solution to the mind-body problem; The Johns Hopkins University, 1931.
- Mach, E. The Science of Mechanics; The Open Court Publishing Company, 1919.
- Duhem, P. The Aim and Structure of Physical Theory; Princeton University Press, 2021. [CrossRef]
- Einstein, A. On the method of theoretical physics. Philosophy of Science 1934, 1, 163–169. [CrossRef]
- Einstein, A.; Podolsky, B.; Rosen, N. Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? Physical Review 1935, 47, 777–780. [CrossRef]
- Maxwell, J.C. Theory of heat, 3rd ed.; Longmans, Green and Co.: London, 1872.
- Planck, M. Treatise of thermodynamics; Longmans, Green and Co., 1903.
- Maxwell, J.C. Diffusion. Encyclopedia Britannica, reproduced in Scientific papers 1878, 2, 625–646. [CrossRef]
- Joule, J. On the mechanical equivalent of heat. Philosophical Transactions of the Royal Society of London 1850, 140, 61–82.
- Feynman, R.P.; Leighton, R.B.; Sands, M. The Feynman lectures on physics; Addison-Wesley, Reading, MA, 1966; chapter 4.
- Hecht, E. Understanding energy as a subtle concept: A model for teaching and learning energy. American Journal of Physics 2019, 87, 495–503. [CrossRef]
- Clausius, R. The mechanical theory of heat; Macmillan & Co, London, UK, 1879.
- Jaynes, E.T. The Gibbs Paradox. In Maximum Entropy and Bayesian Methods; Springer Netherlands, 1992; pp. 1–21. [CrossRef]
- Quine, W.V. The ways of paradox, and other essays; Harvard University Press: Cambridge, Massachusetts, 1976.
- Gibbs, J.W. On the equilibrium of heterogeneous substances: first [-second] part; Connecticut academy of arts and sciences, 1874. [CrossRef]
- Lairez, D. Thermostatistics, Information, Subjectivity: Why Is This Association So Disturbing? Mathematics 2024, 12, 1498. [CrossRef]
- van Kampen, N. The Gibbs Paradox. In Essays in Theoretical Physics; Elsevier, 1984; pp. 303–312. [CrossRef]
- Szilard, L. On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. Behavioral Science 1964, 9, 301–310. [CrossRef]
- Darrigol, O. Boltzmann’s reply to the Loschmidt paradox: a commented translation. The European Physical Journal H 2021, 46. [CrossRef]
- Bennett, C.H. Demons, Engines and the Second Law. Scientific American 1987, 257, 108–116. [CrossRef]
- Liboff, R.L. Maxwell’s demon and the second law of thermodynamics. Foundations of Physics Letters 1997, 10, 89–92. [CrossRef]
- Leff, H.S. Maxwell’s Demon and the Second Law. In Proceedings of the AIP Conference Proceedings. AIP, 2002, Vol. 643, pp. 408–419. [CrossRef]
- Kieu, T.D. The Second Law, Maxwell’s Demon, and Work Derivable from Quantum Heat Engines. Physical Review Letters 2004, 93, 140403. [CrossRef]
- Radhakrishnamurty, P. Maxwell’s demon and the second law of thermodynamics. Resonance 2010, 15, 548–560. [CrossRef]
- Ciliberto, S., Landauer’s Bound and Maxwell’s Demon. In Information Theory; Springer International Publishing, 2021; pp. 87–112. [CrossRef]
- Fontana, P.W. Hidden Dissipation and Irreversibility in Maxwell’s Demon. Entropy 2022, 24, 93. [CrossRef]
- Gibbs, J. Elementary principles in statistical mechanics; Charles Scribner’s sons, 1902.
- Balian, R. From microphysics to macrophysics; Springer Berlin Heidelberg, 1991. [CrossRef]
- Dubs, H. The principle of insufficient reason. Philosophy of Science 1942, 9, 123–131. [CrossRef]
- Keynes, J. A treatise on probability; Macmillian, 1921.
- Ellis, R. Remarks on an alleged proof of the “Method of least squares,” contained in a late number of the Edinburgh review. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 1850, 37, 321–328. [CrossRef]
- Lairez, D. A short derivation of Boltzmann distribution and Gibbs entropy formula from the fundamental postulate. arXiv 2022, [2211.02455v3].
- Tien, C.L.; Lienhard, J. Statistical thermodynamics; Hemisphere Pub. Corp., 1979.
- Sekerka, R. Thermal Physics; Elsevier, 2015. [CrossRef]
- Boltzmann, L. Lectures on gas theory; Dover ed., New York, NY, USA, 1964.
- Maxwell, J.C. On the dynamical theory of gases. Philosophical Transactions of the Royal Society of London 1867, 157, 49–88. [CrossRef]
- Cercignani, C. The Boltzmann Equation and Its Applications; Springer New York, 1988. [CrossRef]
- Villani, C. H-Theorem and beyond: Boltzmann’s entropy in today’s mathematics. In Boltzmann’s Legacy; EMS Press, 2008; pp. 129–143. [CrossRef]
- Weaver, C. In Praise of Clausius Entropy: Reassessing the Foundations of Boltzmannian Statistical Mechanics. Foundations of Physics 2021, 51. [CrossRef]
- Weaver, C. Poincaré, Poincaré recurrence and the H-theorem: A continued reassessment of Boltzmannian statistical mechanics. International Journal of Modern Physics B 2022, 36. [CrossRef]
- Villani, C. Chap. 2 - A review of mathematical topics in collisional kinetic theory. In Handbook of Mathematical Fluid Dynamics; Friedlander, S.; Serre, D., Eds.; North-Holland, 2002; Vol. 1, Handbook of Mathematical Fluid Dynamics, pp. 71–74. [CrossRef]
- Uffink, J. Compendium of the foundations of classical statistical physics. In Philosophy of Physics; Butterfield, J.; Earman, J., Eds.; Handbook of the Philosophy of Science, North-Holland: Amsterdam, 2007; pp. 923–1074. [CrossRef]
- Callen, H.B. Thermodynamics and an introduction to thermostatistics, 2nd ed.; J. Wiley & sons, 1985.
- Villani, C. (Ir)reversibility and Entropy. In Time; Springer Basel, 2012; pp. 19–79. [CrossRef]
- Ray, J.R. Correct Boltzmann counting. European Journal of Physics 1984, 5, 219–224. [CrossRef]
- Huang, K. Statistical Mechanics, 2nd ed.; J. Wiley & sons, 1991.
- Nagle, J.F. Regarding the Entropy of Distinguishable Particles. Journal of Statistical Physics 2004, 117, 1047–1062. [CrossRef]
- Cheng, C.H. Thermodynamics of the System of Distinguishable Particles. Entropy 2009, 11, 326–333. [CrossRef]
- Versteegh, M.A.M.; Dieks, D. The Gibbs paradox and the distinguishability of identical particles. American Journal of Physics 2011, 79, 741–746. [CrossRef]
- Frenkel, D. Why colloidal systems can be described by statistical mechanics: some not very original comments on the Gibbs paradox. Molecular Physics 2014, 112, 2325–2329. [CrossRef]
- Dieks, D. The Gibbs paradox and particle individuality. Entropy 2018, 20, 466. [CrossRef]
- Saunders, S. The Gibbs Paradox. Entropy 2018, 20, 552. [CrossRef]
- van Lith, J. The Gibbs paradox: lessons from thermodynamics. Entropy 2018, 20, 328. [CrossRef]
- Lairez, D. Plea for the use of the exact Stirling formula in statistical mechanics. SciPost Physics Lecture Notes 2023. [CrossRef]
- Crocker, J.C.; Hoffman, B.D., Multiple-Particle Tracking and Two-Point Microrheology in Cells. In Cell Mechanics; Elsevier, 2007; pp. 141–178. [CrossRef]
- Pinto, S.C.; Vickers, N.A.; Sharifi, F.; Andersson, S.B. Tracking Multiple Diffusing Particles Using Information Optimal Control. In Proceedings of the 2021 American Control Conference (ACC). IEEE, 2021, pp. 4033–4038. [CrossRef]
- Xu, X.; Wei, J.; Sang, S. Deep learning-based multiple particle tracking in complex system. AIP Advances 2024, 14. [CrossRef]
- Peters, H. Demonstration and resolution of the Gibbs paradox of the first kind. European Journal of Physics 2013, 35, 015023. [CrossRef]
- Shannon, C.E. A mathematical theory of communication. The Bell System Technical Journal 1948, 27, 379–423. [CrossRef]
- Hartley, R.V.L. Transmission of information. Bell System Technical Journal 1928, 7, 535–563. [CrossRef]
- Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630. [CrossRef]
- Shore, J.; Johnson, R. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory 1980, 26, 26–37. [CrossRef]
- Jaynes, E.T. The well-posed problem. Foundations of Physics 1973, 3, 477–492. [CrossRef]
- Brillouin, L. The Negentropy Principle of Information. Journal of Applied Physics 1953, 24, 1152–1163. [CrossRef]
- Brillouin, L. Science and Information Theory; Dover Publications: Mineola, N.Y., 1956.
- Ciliberto, S. Landauer’s bound and Maxwell’s demon. Séminaire Poincaré 2018, L’Information, XXIII, 79–102.
- Penrose, O. Foundations of statistical mechanics. Reports on Progress in Physics 1979, 42, 1937–2006. [CrossRef]
- Denbigh, K.; Denbigh, J. Entropy in Relation to Incomplete Knowledge; Cambridge University Press, 1985. [CrossRef]
- Lavis, D.A., Frontiers in Fundamental Physics. In Frontiers in Fundamental Physics; Sidarth, B.G., Ed.; Universities Press, India, 2007; Vol. 3, chapter Equilibrium and (Ir)reversibility in Classical Statistical Mechanics.
- Landauer, R. Information is Physical. Physics Today 1991, 44, 23–29. [CrossRef]
- Landauer, R. Irreversibility and Heat Generation in the Computing Process. IBM Journal of Research and Development 1961, 5, 183–191. [CrossRef]
- Landauer, R. The physical nature of information. Physics Letters A 1996, 217, 188–193. [CrossRef]
- Bennett, C.H. The thermodynamics of computation - a review. International Journal of Theoretical Physics 1982, 21, 905–940. [CrossRef]
- Bennett, C.H. Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 2003, 34, 501–510. [CrossRef]
- Herrera, L. Landauer Principle and General Relativity. Entropy 2020, 22, 340. [CrossRef]
- Lutz, E.; Ciliberto, S. Information: from Maxwell’s demon to Landauer’s eraser. Physics Today 2015, 68, 30–35. [CrossRef]
- Bormashenko, E. The Landauer Principle: Re-Formulation of the Second Thermodynamics Law or a Step to Great Unification? Entropy 2019, 21, 918. [CrossRef]
- Witkowski, C.; Brown, S.; Truong, K. On the Precise Link between Energy and Information. Entropy 2024, 26, 203. [CrossRef]
- Herrera, L. The mass of a bit of information and the Brillouin’s principle. Fluctuation and Noise Letters 2014, 13, 1450002. [CrossRef]
- Vopson, M.M. The mass-energy-information equivalence principle. AIP Advances 2019, 9, 095206. [CrossRef]
- Vopson, M.M. The information catastrophe. AIP Advances 2020, 10. [CrossRef]
- Džaferović-Mašić, E. Missing information in the Universe as a dark matter candidate based on the mass-energy-information equivalence principle. Journal of Physics: Conference Series 2021, 1814, 012006. [CrossRef]
- Vopson, M.M. Experimental protocol for testing the mass-energy-information equivalence principle. AIP Advances 2022, 12, 035311. [CrossRef]
- Ciliberto, S.; Lutz, E. The physics of information: from Maxwell to Landauer. In Energy Limits in Computation; Springer International Publishing, 2018; pp. 155–175. [CrossRef]
- Lairez, D. The Fundamental Difference Between Boolean Logic and Thermodynamic Irreversibilities, or, Why Landauer’s Result Cannot Be a Physical Principle. Symmetry 2024, 16, 1594. [CrossRef]
- Plenio, M.B.; Vitelli, V. The physics of forgetting: Landauer’s erasure principle and information theory. Contemporary Physics 2001, 42, 25–60. [CrossRef]
- Maroney, O. The (absence of a) relationship between thermodynamic and logical reversibility. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 2005, 36, 355–374. [CrossRef]
- Lairez, D. Thermodynamical versus Logical Irreversibility: A Concrete Objection to Landauer’s Principle. Entropy 2023, 25, 1155. [CrossRef]
- Lairez, D. On the Supposed Mass of Entropy and That of Information. Entropy 2024, 26, 337. [CrossRef]
- Brillouin, L. The actual mass of potential energy, a correction to classical relativity. PNAS 1965, 53, 475–482. [CrossRef]
- Brillouin, L. The actual mass of potential energy II. PNAS 1965, 53, 1280–1284. [CrossRef]
- Rosenblum, B.; Kuttner, F. The observer in the quantum experiment. Foundations of Physics 2002, 32, 1273–1293. [CrossRef]
- Feynman, R. The Feynman lectures on physics vol. I, chap. 15: The special theory of relativity, 1963.
- Bini, D., Observers, Observables and Measurements in General Relativity. In General Relativity, Cosmology and Astrophysics; Springer International Publishing, 2014; pp. 67–90. [CrossRef]
[Table: fixed-length versus variable-length encoding]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).