Szilard’s Demon: Information as a Physical Quantity?


Abstract

The historical context of Szilard’s thought experiment is considered: Maxwell’s demon, Brownian motion, and the naturalization of Maxwell’s demon by Smoluchowski. After that, the discussions of Szilard’s thought experiment in the second half of the 20th century are described: the penetration of information into statistical mechanics and the works of Brillouin, Bennett (Landauer’s principle), and Zurek. The second part of the paper is devoted to a critique of the thermodynamics of information. The critique of Earman and Norton is extended by considering levels of organization, and the problem of coordination with respect to information is discussed.


Introduction

Leo Szilard is known for his thought experiment, which played a large role in physicists’ attempts to tie information to physical processes in the second half of the 20th century. The device proposed by Szilard can be called Maxwell’s information demon; in this paper, it is called Szilard’s demon. The expression for information entropy in Shannon’s theory turned out to be similar to that for thermodynamic entropy in statistical mechanics, and Szilard’s demon provided an additional reason for asserting the equivalence of information and thermodynamic entropy. In this way, information came to be considered a physical quantity.
Szilard’s idea was a source of inspiration for several generations of physicists. Moreover, these ideas played a role in setting up experiments with Brownian particles in the 21st century [1]. However, the stochastic thermodynamics formalism created to describe these experiments differs significantly from previous discussions of Szilard’s demon. There are also good arguments from Earman and Norton [2,3] against information as a physical quantity; in this paper, they are extended by considering levels of organization. The claim that a physical device is needed to transmit information and perform computations is trivial. Nevertheless, the question remains whether it is possible to combine the information level with the physical one.
In the first part, the historical context of Szilard’s thought experiment and its discussions in the second half of the 20th century are considered. The second part is devoted to a critical analysis of these ideas.

History of Thermodynamics of Information

Maxwell’s Demon and Brownian Motion

In a letter to Peter Tait in 1867, Maxwell described a device containing a tiny creature that William Thomson (Lord Kelvin) dubbed a demon [4]. Maxwell wanted to show that it is impossible to deduce the second law from the laws of mechanics. He expressed this idea explicitly in an 1870 letter to John Strutt (Lord Rayleigh), where he emphasized that the laws of mechanics are time-symmetric: it is thus entirely possible to transfer heat from a cold body to a hot one by reversing time in the laws of mechanics. Since such a reversal could not be implemented in practice, the thought experiment with the demon made the argument more plausible.
Maxwell’s demon appeared publicly in 1871 in the book ‘Theory of Heat’. Maxwell’s ideas were supported by Ludwig Boltzmann’s statistical foundation of the second law in 1877. However, in the 19th century, no one thought to fight Maxwell’s demon: the second law is of a statistical nature, hence a chance violation of it is possible.
The study of Brownian motion changed the attitude of physicists to Maxwell’s demon. After the theory developed by Einstein, Smoluchowski and Langevin, experiments were carried out that confirmed the presence of fluctuations. During the study of Brownian motion, the possibility of a second-kind perpetual motion machine based on fluctuations was discussed. Here is a quote from Gelfer’s book [5]:
‘There was a debate about Thomson’s statement (“a second-kind perpetual motion machine is impossible”). Some physicists, such as Lippmann, Svedberg, and Ostwald, believed that fluctuation phenomena allowed the possibility of a second-kind perpetual motion machine, at least in principle. It seemed that Maxwell’s idea of demons sorting molecules based on their velocities found theoretical support in Brownian motion. Gouy had once suggested that if the Brownian motion could be organized in some way, it would open the possibility of obtaining free energy. While Gouy’s suggestion was a hypothetical proposal, Ostwald explicitly stated in 1906 that the second law could be disproved:
“It seems to us that Maxwell’s ‘demons’, which in the molecular domain could be considered harmless, have an open field for experimental refutation of the second law in the finite domain of visible phenomena.”
Specific schemes for implementing a second-kind perpetual motion machine were even proposed, most of which were based on one or another variant of Maxwell’s demons.’

Smoluchowski: Naturalization of Maxwell’s Demon

Let us consider the position of Marian Smoluchowski [2], whose paper titles were quite provocative: ‘Experimentally Demonstrable Molecular Phenomena that Contradict Ordinary Thermodynamics’ (1912) and ‘The Limits of the Second Law of Thermodynamics’ (1914).
In classical thermodynamics, strictly speaking, fluctuations are impossible. In this sense, Brownian motion and the presence of fluctuations contradict the second law of thermodynamics. However, after discussing this fact, Smoluchowski naturalized Maxwell’s demon [2]. The demon is declared to be a device that obeys the laws of physics and is subject to fluctuations. With this, Smoluchowski demonstrated the problematic nature of proposed second-kind perpetual motion machines, as the presence of fluctuations in Maxwell’s demon makes it impossible for these machines to operate continuously. To preserve the second law, the formulation must be slightly modified to consider fluctuations [2]:
‘There can be no automatic device that would produce continuously usable work at the expense of the lowest temperature.’
Smoluchowski’s solution was successful. A perpetual motion machine of the second kind that operates for an extended period has not been created yet. Let me quote from a book with the expressive title ‘Challenges to the Second Law of Thermodynamics’ [6], published in 2005; it examines many proposals for a perpetual motion machine of the second kind:
‘In this volume we will attempt to remain clear on this point; that is, while the second law might be potentially violable, it has not been violated in practice.’

Szilard’s Demon

In 1925, Szilard wrote a paper on a phenomenological theory of fluctuations [2]. A description of the development of these ideas by Szilard can be found in the book by Yu. G. Rudoi, ‘Mathematical Structure of Equilibrium Thermodynamics and Statistical Mechanics’ [7]. The devices discussed by Smoluchowski, which are subject to Szilard’s fluctuation formalism, can be referred to as mechanical demons. They, however, did not contain information in explicit form, and thus the possibility of correlations between fluctuations in the demon and in the system was not explored.
This seems to have been the motivation for the 1929 paper ‘On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings’. Szilard considered a thought experiment that influenced the development of physics in the second half of the 20th century. He reduced the system in question to a single molecule and introduced a being that, after detecting the molecule’s location in the left or right half of the volume, used this information to obtain useful work. During one cycle of the controlled subsystem, heat is converted into work and thus the entropy of the subsystem is reduced. This showed the potential role of information about fluctuations in generating work.
The mechanical demon is thus transformed into the information demon: measurement, information processing, and action. In this paper, the term Szilard’s demon is used to emphasize the difference between this device and Maxwell’s mechanical demons; Szilard’s demon is an information-based version of Maxwell’s demon. Szilard assumed that the second law of thermodynamics would not be violated in the whole system due to the increase in entropy during the measurement of the molecule’s position. Thus, thermodynamic entropy became associated with the process of obtaining information about a subsystem during measurement.

Transfer of Information from Shannon’s Theory to Statistical Mechanics

Shannon’s theory of information is concerned with the problem of efficiently transmitting a text consisting of characters over a noisy communication channel. The text contains information, but the amount of information is related not to the content but to the length of the text. The characters from a given alphabet are encoded in binary, and this leads to the concept of a bit, the unit of measurement for information.
Information entropy has played a major role in solving the problems of telecommunications: the choice of the most effective encoding of a symbol, the creation of noise-resistant codes, and text compression for more efficient storage and transmission. In this respect, the ambiguity of the relationship between information entropy and the amount of information should be mentioned. On the one hand, entropy is associated with ignorance (the greater the entropy, the less information); on the other hand, in compressed files, the maximum amount of information is achieved in the limit of maximum entropy. The paper ‘The many faces of Shannon information’ [8] gives a good introduction to Shannon’s information theory, and its section ‘About the concept of Shannon information’ perfectly conveys the content of Shannon’s theory.
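To make the notion concrete, here is a minimal Python sketch (an illustration only; the function name and sample strings are chosen for this example and do not come from Shannon’s paper) that computes the entropy of the symbol distribution of a text, in bits per symbol:

from collections import Counter
from math import log2

def shannon_entropy(text):
    # H = -sum_i p_i * log2(p_i), measured in bits per symbol
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("aaaa"))      # 0.0: a fully predictable source
print(shannon_entropy("abab"))      # 1.0: two equiprobable symbols, one bit
print(shannon_entropy("abcdabcd"))  # 2.0: four equiprobable symbols, two bits

The maximum, the base-2 logarithm of the alphabet size, is reached for equiprobable symbols; this is the sense in which a well-compressed file sits at the limit of maximum entropy.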
The mathematical expression for information entropy is similar to that for the Gibbs statistical entropy, and many physicists have concluded that this similarity indicates a relationship between thermodynamic entropy and information entropy (see the paper [8] after the section ‘About the concept of Shannon information’). Three people played a major role in spreading ideas about the relationship between entropy and information (the information below is from Anta’s thesis [9]): Norbert Wiener, John von Neumann, and Warren Weaver.
In 1948, in the influential book ‘Cybernetics’, Wiener connected the concepts of information and entropy. A cybernetic agent used the information received to predict the behavior of a system, which in turn was associated with order or disorder; predicting the behavior of an ordered system is easier. Wiener drew on Schrödinger’s ‘negative entropy’ from the book ‘What is Life?’, as well as on the arguments of the chemist Gilbert Lewis, who had argued back in 1930: ‘Gain in entropy always means loss of information, and nothing more.’
Thus, Wiener introduced a connection between the information possessed by a cybernetic agent and the entropy of the system that the agent controls:
‘The notion of the amount of information attaches itself very naturally to a classical notion in statistical mechanics: that of entropy. Just as the amount of information in a system is a measure of its degree of organization, so the entropy of a system is a measure of its degree of disorganization; and the one is simply the negative of the other (…) We have said that amount of information, being the negative logarithm of a quantity [μ(ΓM)] which we may consider as a probability, is essentially a negative entropy.’
In 1932, John von Neumann discussed in his book ‘Mathematical Foundations of Quantum Mechanics’ the solution proposed by Leo Szilard in his thought experiment. Von Neumann believed that such a solution allows us to combine the physical state of the system with the epistemic state of the agent. It may be that, in this way, von Neumann hoped to find an interpretation of measurement in quantum mechanics. After the works of Shannon and Wiener, von Neumann actively promoted the combination of formal logic and statistical mechanics by means of the connection between information and entropy. Von Neumann’s authority in academic circles contributed to the spread of this idea.
Warren Weaver worked together with Shannon, but Weaver, unlike Shannon, wanted to popularize information theory in a way accessible to an educated public. Weaver extended the meaning of information in Shannon’s theory to the semantic and pragmatic levels and connected information entropy with thermodynamic entropy. Weaver’s administrative resources ensured the success of his venture.

Brillouin, Landauer and Thermodynamics of Computation

The first physicist to unify Shannon’s information theory and statistical mechanics was Léon Brillouin (1889-1969). In a series of papers from 1951, he analyzed Maxwell’s information demons, including Szilard’s thought experiment [3,9], and in 1956, he published the book ‘Science and Information Theory’ [10].
Brillouin believed that the best expression for the degradation of energy would be a negative value of entropy, which he called negentropy. In his book, negentropy is defined formally as the negative of entropy:
‘Negentropy (N = −S) represents the quality or grade of energy, and must always decrease.’
Next, Brillouin introduces the negentropy principle of information. He equates the information change with the change in the number of microstates in the Boltzmann equation. From this he concludes:
‘bound information = decrease in entropy S = increase in negentropy N’
Technically, Brillouin’s consideration concerned only changes in entropy, but the change in entropy was sometimes called the change in negentropy and sometimes bound information. Brillouin used the negentropy principle of information to analyze Szilard’s demon. In his analysis, he confirmed the role of measurements: the entropy decrease of the controlled subsystem during the demon’s operation is compensated by the entropy increase in the whole system, including the demon, during the measurement process. Brillouin’s papers and book established a consensus among physicists for the next couple of decades. The acquisition of new information during measurements is associated with a change in thermodynamic entropy, and this was considered the solution that expelled Szilard’s demon.
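In modern notation, this bookkeeping can be reconstructed as follows (a reconstruction for the reader’s convenience, not a quotation from Brillouin). With Boltzmann’s entropy S = k \ln P and negentropy N = −S, acquiring information that reduces the number of microstates from P_0 to P_1 gives

I_b = k \ln P_0 - k \ln P_1 = S_0 - S_1 = N_1 - N_0 ,

so the bound information I_b equals the decrease in entropy and, equivalently, the increase in negentropy.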
In parallel, there was a discussion about the minimum costs associated with computation [3,9]. For certain logical operations, it is impossible to return to the initial state, and these operations are referred to as irreversible. In 1961, Rolf Landauer connected logical irreversibility with physical irreversibility and analyzed the operation of writing a single bit to memory. In this process, there is no verification of the memory state, as that would require additional costs. Therefore, the operation of writing a bit is irreversible, as it makes the previous content unknown (memory erasure). As a result, Landauer proposed the principle that such an operation must be accompanied by the release of a minimum amount of heat associated with a change in thermodynamic entropy.
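For orientation, the bound in question is Q_min = kT ln 2 per erased bit. A minimal Python sketch evaluates it at room temperature (the temperature value is an arbitrary choice for illustration):

from math import log

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K

q_min = k_B * T * log(2)    # minimum heat per erased bit, Q = k*T*ln(2)
print(f"Landauer bound at {T:.0f} K: {q_min:.2e} J per bit")  # ~2.87e-21 J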
Later, Charles Bennett [3,9] and others showed that it is possible to run computations by means of reversible logical operations only. Thus, memory erasure remained the only logically irreversible operation. As a result, in the early 1980s, Bennett proposed a new analysis of Szilard’s demon, concluding that measurements can be performed without a change in entropy and associating the necessary increase in entropy with memory erasure instead. According to Bennett, the demon has to write the measurement result to memory, and at the end of the cycle the memory cell has to be reinitialized by erasure. The new interpretation of Szilard’s demon prevailed, although there was additional discussion about the reasons for the mistake made by the previous generation of physicists, led by Brillouin.
In the late 1980s, Wojciech Zurek provided the final touch to this story. Zurek pointed out that Bennett’s entropy balance is only established at the end of the process, and he suggested expanding Bennett’s analysis to include algorithmic entropy. Zurek proposed that Szilard’s demon uses a specific algorithm based on reversible computations to perform its task. By incorporating algorithmic entropy into the analysis, Zurek showed that the entropy balance remains constant throughout the process.

Critique of Thermodynamics of Information

Szilard did not discuss the structure of the demon, and later the demon came to be considered a cybernetic device (measurement, processing of information, action). The device is physical, and hence this opens up the possibility of finding a place for information among physical quantities. It is assumed that information is related to entropy, and this leads to the thermodynamics of information.
For simplicity, let us assume that Szilard’s demon is a mechanical system. The Fredkin-Toffoli billiard computer (ballistic reversible computing) is often used as an example of reversible computation, so a mechanical demon should be a plausible approach to discuss this issue.
Thus, from the viewpoint of statistical mechanics, the demon and the controlled subsystem can be considered as a single Hamiltonian system, whose dynamics includes information processing and calculations inside the demon as well as the behavior of the subsystem controlled by the demon. The goal is related to the second law: the entropy of an isolated system cannot decrease, or, equivalently, a perpetual motion machine of the second kind is impossible.

What Does Szilard’s Demon Prove?

Maxwell wanted to demonstrate that it is impossible to find a strict mathematical proof of the second law in the kinetic theory. The laws of mechanics are time-symmetric, and the statistical interpretation of the second law does not prohibit the existence of Maxwell’s demon. Smoluchowski’s weak form of the impossibility of a perpetual motion machine of the second kind reflects this statistical interpretation. Thus, the question is what is new in the analysis of Szilard’s demon as compared with Smoluchowski’s conclusion [3].
In the information analysis of Szilard’s demon, the second law is taken as the initial postulate, and it is assumed that the second law must hold in a single engine cycle. Thus, instead of Smoluchowski’s weak form, the strong form of the second law from classical thermodynamics is used. Otherwise, it would be impossible to obtain an equality between information and thermodynamic entropy. The second law is considered true a priori, and the subsequent analysis comes down to a heuristic search for the missing part of the entropy balance. It is unclear whether such an approach can be considered a proof that Szilard’s demon does not break the second law [3].
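A standard reconstruction of the bookkeeping in question (a textbook-style summary, not Szilard’s original notation) runs as follows. In the isothermal expansion of the one-molecule gas from V/2 to V,

W = \int_{V/2}^{V} \frac{kT}{V'} \, dV' = kT \ln 2, \qquad \Delta S_{\mathrm{reservoir}} = -\frac{W}{T} = -k \ln 2 ,

and the heuristic search then looks for a compensating term of +k ln 2 elsewhere in the cycle: in the measurement according to Brillouin, in memory erasure according to Bennett.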
In Brillouin’s analysis, the internal structure of the demon is ignored, and the missing part of the entropy balance is associated with the interaction between the demon and the controlled subsystem; this interaction is identified as the measurement. The entropy change is divided into two parts, and it is postulated that the increase in one part compensates for the decrease in the other. This conclusion is based on an a priori belief in the correctness of the second law in the strong form.
The introduction of Landauer’s principle improves the structure of the analysis in general, since there is at least an independent principle associated with energy dissipation. Nevertheless, the logic remains unchanged. The second law in the strong form is accepted a priori, and the main difference from Brillouin’s analysis is the possibility that processes in the demon, as well as the interaction of the demon with the controlled subsystem, take place without dissipation. Thus, in the end, the necessary increase in entropy is associated only with erasing one bit of information in the demon.
There is an additional question: what, at the level of statistical mechanics, determines the dynamics of Szilard’s demon? In Bennett’s analysis, there are two possibilities: the laws of physics, or a computational algorithm that must be executed by Szilard’s demon. This division of responsibility between physics and computation is considered in the next sections.

Landauer’s Principle, Reversible Computing, and Fluctuations

In discussing Landauer’s principle and reversible computing, it is necessary to pay attention to two different meanings of the term reversible. On the one hand, the term refers to the laws of mechanics, which are symmetric in time. In mechanical devices such as the Fredkin-Toffoli billiard computer, this corresponds to an ideal frictionless mechanism. The laws of frictionless mechanics contain time explicitly, so such a conceptual model describes motion in time.
On the other hand, in classical thermodynamics, reversible processes are introduced in the ideal Carnot cycle. The goal is to convert heat into work without dissipation, that is, without increasing the entropy of the whole system. As a result, there is no time in the reversible processes of classical thermodynamics; see the discussion in [11].
Let us return to the viewpoint of statistical mechanics with respect to the whole system: Szilard’s demon plus the device controlled by it. The laws of mechanics are time-symmetric; there is no friction or energy dissipation at the level of particle motion. At the same time, the discussion of increasing entropy requires energy dissipation, that is, the statistical interpretation of entropy. Therefore, it is important to distinguish the two meanings of the term reversible, as it is unlikely that we could discuss Szilard’s demon meaningfully without time in explicit form.
Let us look at Landauer’s principle from this viewpoint. The discussion of the miniaturization of computation implicitly suggests that we are talking about the motion of particles obeying the time-symmetric laws of mechanics. Therefore, the question is how energy dissipation occurs in this case.
It is impossible to assume that particles cease to obey the laws of mechanics under Landauer’s principle. Therefore, it follows that Landauer’s principle is based on a statistical interpretation of the second law. In turn, this raises the question of fluctuations; after all, it was the study of fluctuations in Brownian motion that forced scientists to take Maxwell’s demon seriously. However, in the analysis of Szilard’s demon based on reversible computing and Landauer’s principle, fluctuations in the demon are ignored.
The issue of fluctuations is especially important for Landauer’s principle. The dissipation of energy is necessarily connected with the statistical interpretation of the second law, but introducing fluctuations into the minimum memory cell (kT ln 2) makes it unworkable.
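To see why, consider a rough illustration. Suppose a bit is stored in a double-well potential with barrier E_b and flips spontaneously at an Arrhenius-type rate f_0 exp(−E_b/kT); this model and the attempt frequency below are assumptions made only for this sketch:

from math import exp

f0 = 1e9                                   # assumed attempt frequency, Hz

for barrier_in_kT in (0.693, 10.0, 40.0):  # 0.693 = ln 2, the Landauer scale
    rate = f0 * exp(-barrier_in_kT)        # expected spontaneous flips per second
    print(f"E_b = {barrier_in_kT:6.3f} kT -> ~{rate:.2e} flips/s")

In this estimate, a cell whose barrier sits at the Landauer scale of kT ln 2 flips about half a billion times per second, whereas stable storage requires barriers of tens of kT, far above the supposed minimum.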

The Inadequacy of Szilard’s Thought Experiment

The purpose of a thought experiment in physics is to isolate certain features of a physical system that make it possible to formulate general conclusions. For example, the ideal Carnot cycle with reversible thermodynamic processes could be considered as a thought experiment to find the maximum efficiency of a heat engine. To this end, all losses have been eliminated, and the ideal reversible process of using heat to produce work was achieved [11].
This idealization is the basis of classical thermodynamics, and it was later generalized to other processes: chemical, electromagnetic, electrochemical, etc. Yet the ideal Carnot cycle could be used to treat real systems, and thus the development of classical thermodynamics was successful. However, there was a price: time is absent in classical thermodynamics, and all real processes are irreversible. They are necessarily accompanied by the dissipation of energy, and hence the entropy of an isolated system increases in a real spontaneous process.
The goal of Szilard’s demon was apparently to create an idealized model for the study of information processes as physical ones. That is, the goal was to demonstrate the relationship between the change in thermodynamic entropy (k ln 2) and the information entropy associated with the choice between two possibilities (log2 2 = 1 bit). In other words, the purpose of the thought experiment was to relate a single bit of information to a change in thermodynamic entropy; in this way, information would become a physical quantity. Let us consider whether such an expectation is justified.
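For reference, the dictionary that such an identification presupposes is simple: with S = k \ln W and H = \log_2 W,

S = (k \ln 2) \, H, \qquad k \ln 2 \approx 9.57 \times 10^{-24} \ \mathrm{J/K} \ \text{per bit},

so one bit of information entropy would correspond to a fixed amount of thermodynamic entropy.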
In Szilard’s idealization, the behavior of a single molecule is analogous to a huge fluctuation in the system, but possible fluctuations in the housing, in the piston, and in the heat source are ignored. Moreover, the idealization includes elements from the ideal Carnot cycle: the movement of the partition without work and friction, and the reversible expansion process after the fluctuation. This, however, means that the amount of work in such a thought experiment is overestimated.
In Szilard’s thought experiment, it is assumed that most processes can be performed as reversible thermodynamic processes without energy dissipation. At the same time, statistical mechanics comes into play, since the relationship between information and thermodynamic entropy can be established at this level only. Therefore, it is unclear how such an idealization agrees with the statistical interpretation of entropy, which underlies the whole argument.
A more realistic version of such a thought experiment would involve a system of reduced size. It is impossible to neglect fluctuations in any of its parts, since the whole idea is associated with the use of fluctuations to do useful work, and the use of reversible thermodynamic processes for individual parts cannot be justified. This is exactly the path taken in modern experiments with Brownian particles [1], but the treatment of the new experiments required the development of stochastic thermodynamics. In this respect, Szilard’s thought experiment was useful as an inspiration but not as a way to develop the correct formalism.

Levels of Organization: Informational vs. Physical

Let us consider the issue from a more general point of view. It has already been mentioned that it is trivial to state that a physical device is required to transmit information and perform calculations. However, in the thermodynamics of information, it is assumed that information or a logical operation can be identified with a change in the thermodynamic entropy of a physical system.
Let us look again at the dynamics of the whole system from the viewpoint of statistical mechanics. The Hamiltonian corresponds to the laws of physics; hence the dynamics is completely determined by the laws of physics. Thus, information processes are associated with the dynamics, but, strictly speaking, they cannot be considered a causal explanation of the observed dynamics. To discuss this issue, we have to take another view.
For example, we know that this part of the system is the minimum memory cell, so we associate the energy dissipation of kT ln 2 with Landauer’s erasure of a bit. Another example is the Fredkin-Toffoli billiard computer. We know that the movement of the balls is related to computing and therefore, the final position of the balls is interpreted as the result of calculations. In other words, we have the dynamics of a physical system, we know that such dynamics is related to information or logical operations, and this allows us to associate thermodynamic and informational entropy.
Now let us discuss the inverse problem [12,13]. Smoluchowski, as described above, considered Maxwell’s mechanical demons. They have their own dynamics in statistical mechanics, but without information processes. This suggests some difference in the dynamics of physical processes in Szilard’s information demon. Thus, the inverse problem can be stated as follows: the dynamics of an arbitrary system is observed without a priori knowledge about information processes, and the goal is a general procedure to differentiate processes with information and/or computation from purely mechanical processes.
However, there is no such procedure [12,13]. As an example, consider again the billiard computer. Could we say from the dynamics of the movement of the balls alone whether a calculation is currently being carried out? The calculation is associated with conventions that are not included in the dynamics as such, and without knowledge of these conventions, it is impossible to interpret the dynamics of the balls as a computation; see the sketch below. In general, the situation is even more complicated: in the end, by considering any physical process, one could always come up with conventions to interpret it as some particular calculation. A more detailed discussion of this issue is given in [12,13].
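A toy Python sketch of this point (the detector sites and both conventions are invented for the example): the same final physical state, read under two different encoding conventions, yields two different ‘results’, so the dynamics alone does not determine which computation, if any, was performed.

final_state = [1, 0, 1, 1]   # ball present (1) or absent (0) at four detector sites

def convention_a(state):
    # read sites left to right; the presence of a ball encodes 1
    return int("".join(str(b) for b in state), 2)

def convention_b(state):
    # read sites right to left; the absence of a ball encodes 1
    return int("".join(str(1 - b) for b in reversed(state)), 2)

print(convention_a(final_state))  # 11
print(convention_b(final_state))  # 2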
A similar problem occurs when trying to determine whether there is information in an arbitrarily selected state of a physical system. For example, take a blank sheet of paper and look at it through a magnifying glass. Irregularities are visible: is this not information? Thus, to connect thermodynamic and information entropy, some additional knowledge about a physical system is needed, knowledge that cannot be obtained directly at the level of the laws of physics.
Moreover, thermodynamic entropy is a property of a substance: its value depends on external parameters, and by means of derivatives it is related to other thermodynamic properties [14]. Let us take as an example the Millipede memory device once developed by IBM; it fits well as a mechanical system in statistical mechanics. In this device, a memory cell corresponds to a hole in a polymer film (the presence of a hole is one, the absence is zero).
Let us consider the thermodynamic entropy of such a polymer film with information. The change in thermodynamic entropy is associated with the redistribution of substance in the process of hole formation and with the formation of surface energy. If we neglect the change in thermodynamic entropy in the bulk of the polymer film due to deformation (the mass of the film remains constant), then the change in thermodynamic entropy is proportional to the surface area of the holes.
Thus, in this case, it is impossible to associate thermodynamic entropy with information or algorithmic entropy, since the latter is determined by the locations of the holes, which do not enter the thermodynamic entropy. Also, thermodynamic entropy is a function of temperature and pressure: changes in these lead to a change in thermodynamic entropy. At the same time, provided that the holes are preserved during such a change, the information or algorithmic entropy remains constant.
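A toy Python model of this point (the per-hole entropy constant and the hole patterns are placeholders invented for the example): two films with the same number of holes have the same surface-area-proportional change in thermodynamic entropy, while their information content, here proxied by compressibility, differs.

import random
import zlib

random.seed(0)
bits_regular = [1, 0] * 512          # 512 holes in a strictly periodic layout
bits_shuffled = bits_regular[:]      # the same number of holes...
random.shuffle(bits_shuffled)        # ...scattered over the film

DELTA_S_PER_HOLE = 1.0               # placeholder constant, arbitrary units

for name, bits in (("regular", bits_regular), ("shuffled", bits_shuffled)):
    delta_s = sum(bits) * DELTA_S_PER_HOLE   # depends on the hole count only
    size = len(zlib.compress(bytes(bits)))   # proxy for algorithmic content
    print(f"{name}: delta_S = {delta_s}, compressed size = {size} bytes")

Both patterns give the same delta_S, while the periodic pattern compresses to a few dozen bytes and the shuffled one does not.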

Discussion

In my view, in discussing the miniaturization of computing devices, an important requirement related to the reliability of the device is overlooked. At the level of a slogan: robustness separates the physical and informational levels of organization. Robustness is not about physics per se; robustness belongs to the engineering sciences. For example, a minimum memory cell must retain information for a given amount of time. This requirement, in turn, assumes that during this time the state of the cell is stable with respect to fluctuations; the discussion of the minimum memory cell would be meaningless without this.
An engineer relies on the laws of physics to find the necessary solution, but the final solution is a compromise involving many factors. This does not allow us to consider information or a logical operation in a working device as a physical quantity. In the development of modern devices, the V-model is often employed, and the different levels of organization are clearly visible in it. From this point of view, an attempt to identify thermodynamic and information entropy would produce non-working devices.
On the other hand, the terms ‘information’ and ‘measurement’ have many meanings. For example, let us take the thermodynamic tables ‘NIST-JANAF Thermochemical Tables’ [15]. They contain information about the thermodynamic properties of substances obtained by processing the measurement results of a huge number of experiments. However, the meaning of the terms ‘information’ and ‘measurement’ in this case does not coincide with their meaning in Szilard’s demon.
Physical theory is expressed by mathematical equations. Bas van Fraassen introduced the problem of coordination [16] to discuss the relationship between these equations and the world. Basically, there are two related questions: 1) What counts as a measurement of (physical quantity) X? 2) What is (that physical quantity) X? The physical theory answers the question of what a physical quantity is, and it introduces a conceptual model of an ideal experiment that serves as a blueprint for real measurements. This procedure is discussed for temperature in [17], and for entropy and other thermodynamic properties in [14]. The problem of coordination for information, however, remains unsolved in Szilard’s demon. In this case, it is impossible to answer the following questions: 1) What counts as a measurement of information? 2) What is information?
In modern experiments with Brownian particles, information appears in the form of mutual information [1]. A detailed analysis is beyond the scope of this paper, since it would require a discussion of stochastic thermodynamics. I only note that information in [1] appears due to a very simplified model of the controller, which plays the role of Szilard’s demon. The description of the controller in the mathematical formalism is reduced to the signal that the controller produces based on observations of the Brownian particle. It is unclear whether one could solve the problem of coordination for information in this way.
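For reference, the generalized Jarzynski equality with feedback is commonly written as

\left\langle e^{-\beta (W - \Delta F) - I} \right\rangle = 1, \qquad \beta = \frac{1}{kT},

where I is the mutual information between the state of the particle and the measurement outcome. By Jensen’s inequality, it implies \langle W \rangle \ge \Delta F - kT \langle I \rangle, so the work that can be extracted beyond the free-energy change is bounded by the information gained.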
I conclude with a few quotes from Landauer’s last paper, in which he moves from the physical nature of information to the physical execution of physical laws [18]:
‘Our view that mathematics is part of the physical world can be used to resolve a longstanding supposed puzzle. There are many references to Wigner’s remarks about “the unreasonable success of mathematics”. That success, in characterizing the physical world, is indeed a puzzle if we start from the presumption that mathematics existed before and exists apart from the physical universe. Instead, mathematics uses the physical universe, and is a part of physics.’
‘Our accepted laws of physics invoke continuum mathematics, which is, in turn, based on the notion that any required degree of precision can be obtained by invoking enough successive operations. But our real universe is unlikely to allow an unlimited sequence of totally reliable operations. The memory size is likely to be limited, perhaps, because the universe is limited.’
‘There is a tendency to think of mathematics as a tool which somehow existed before and outside of our physical world. ... Here, instead, we emphasize that information handling has to be done in the real physical world, and the laws of physics exist as instructions for information handling in that real world. It, therefore, makes no sense to invoke operations, in the laws of physics, which are not executable, at least in principle, in our real physical world.’
‘One possibility relates to the ultimate source of irreversibility and of fluctuations in the real world, a world where we can readily observe departure from Hamiltonian behavior. The lack of precision in the laws of physics is, essentially, a noise source. ... Limited precision may, furthermore, be the cause of the apparent classical behavior so conspicuously displayed around us.’
This is a good example where the desire to avoid idealism at one level (information as physical) leads to a view of the world as a kind of computer. It also demonstrates that the discussion about mathematics, physics, and the world can easily drift into the realm of eternal philosophical problems.

References

  1. Shoichi Toyabe, Takahiro Sagawa, Masahito Ueda, Eiro Muneyuki, and Masaki Sano. Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality. Nature Physics 6, no. 12 (2010): 988-992.
  2. John Earman and John D. Norton. EXORCIST XIV: The wrath of Maxwell’s demon. Part I. From Maxwell to Szilard. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 29, no. 4 (1998): 435-471.
  3. John Earman and John D. Norton. EXORCIST XIV: The wrath of Maxwell’s demon. Part II. From Szilard to Landauer and beyond. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 30, no. 1 (1999): 1-40.
  4. Martin J. Klein. Maxwell, his demon, and the second law of thermodynamics: Maxwell saw the second law as statistical, illustrated this with his demon, but never developed its theory. American Scientist 58, no. 1 (1970): 84-97.
  5. Ya. M. Gelfer. History and Methodology of Thermodynamics and Statistical Physics (in Russian), 2nd ed., 1981, Chapter 12: Discovery and Study of Brownian Motion. Further Development of Boltzmann’s Statistical Theory.
  6. Vladislav Capek and Daniel P. Sheehan. Challenges to the Second Law of Thermodynamics, 2005.
  7. Yu. G. Rudoi. Mathematical Structure of Equilibrium Thermodynamics and Statistical Mechanics (in Russian), 2013.
  8. Olimpia Lombardi and Cristian López. The many faces of Shannon information. In Information and the History of Philosophy, pp. 324-340. Routledge, 2021.
  9. Javier Anta Pulido. Historical and Conceptual Foundations of Information Physics. PhD Thesis, 2021.
  10. Léon Brillouin. Science and Information Theory, 1956.
  11. Evgenii Rudnyi. Reversible Processes in Classical Thermodynamics, 2025, preprint.
  12. O. J. Maroney and C. G. Timpson. How is there a physics of information? On characterising physical evolution as information processing. In Physical Perspectives on Computation, Computational Perspectives on Physics, edited by Michael E. Cuffaro and Samuel C. Fletcher, 2018, pp. 103-126.
  13. Gualtiero Piccinini and Corey Maley. Computation in Physical Systems. The Stanford Encyclopedia of Philosophy, 2021.
  14. Evgenii Rudnyi. The Problem of Coordination: Entropy as a Physical Quantity in Classical Thermodynamics, 2025, preprint.
  15. NIST-JANAF Thermochemical Tables, Fourth Edition, Monograph No. 9, Journal of Physical and Chemical Reference Data, 1998.
  16. Bas C. van Fraassen. Scientific Representation: Paradoxes of Perspective, 2008, Part II: Windows, Engines, and Measurement.
  17. Evgenii Rudnyi. The Problem of Coordination: Temperature as a Physical Quantity, 2025, preprint.
  18. R. Landauer. Information is a physical entity. Physica A: Statistical Mechanics and its Applications 263, no. 1-4 (1999): 63-67.