Preprint (not peer-reviewed)

Burning Candle and Levels of Organization

Submitted: 24 February 2026. Posted: 03 March 2026.

Abstract
The burning candle discussed in Faraday’s lectures is used as an example to discuss the relationships between physical theories at different levels of organization: continuum mechanics at the macro level and statistical and quantum mechanics at the micro level. The first part of the paper examines the connections between theoretical and experimental physics. Physical theory serves as the foundation of a research program, while experimental research measures its ongoing development. Reasonable extrapolationism denotes a situation where the ideas of physical theory contribute to the development of experimental research; radical extrapolationism, by contrast, denotes a situation where the ongoing discussion goes far beyond experimental physics. In the second part of the paper, the arrows of explanation between continuum mechanics and statistical mechanics are considered and classified within the framework of the proposed terminology. The estimation of the properties of a substance from molecular constants and the derivation of the equations of continuum mechanics from statistical mechanics are considered. Qualitative explanations and emergence are also discussed.

Introduction

The title is based on Michael Faraday’s 1861 book, ‘The Chemical History of a Candle’ [1]. The book contains public lectures with a series of experiments to understand the processes of candle burning. Faraday noted the following:
‘There is not a law under which any part of this universe is governed which does not come into play and is touched upon in these phenomena. There is no better, there is no more open door by which you can enter into the study of natural philosophy than by considering the physical phenomena of a candle.’
Physics has advanced since then, but Faraday’s words are still valid in many ways. The heat from the flame causes the candle material (usually a mixture of paraffin and stearin) to melt and form a cup of liquid at the top of the candle. The liquid rises along the wick by capillary action and evaporates. The vapors react with oxygen from the air; different flame zones correspond to different degrees of oxidation of the evaporation products. Chemical reactions release light and heat, the latter being used, among other things, to maintain the flow of fuel from the candle. At the same time, natural convection creates an air current along the candle, forming the flame body and supplying oxygen for combustion.
In Faraday’s book, the process of candle combustion was considered without the use of atomic concepts. Currently, this corresponds to an analysis by continuum mechanics at the macro level. At the same time, the development of physics since Faraday now allows for a discussion of candle burning using statistical mechanics at the micro level. Thus, candle combustion in this paper is used as an example to discuss the relationships between theories of physics used to describe different levels of organization. In this case, the theory of relativity, nuclear physics, and quantum field theory are not required, and this significantly simplifies the discussion.
In physics, the laws of physics at the micro level are considered fundamental, while theories at the macro level are often considered phenomenological. It is generally assumed that the phenomenological theories of physics are already explained within the framework of fundamental theories of physics. As an example, I take Einstein’s 1936 paper ‘Physics and Reality’ [2], in which he classifies continuum mechanics as a phenomenological theory:
‘Herein we find the hydrodynamic theory, and the theory of elasticity of solid bodies. These theories avoid the explicit introduction of material points by fictions which, in the light of the foundation of classical mechanics, can only have an approximate significance. … These two modes of application of mechanics belong to the so-called “phenomenological” physics. It is characteristic of this kind of physics that it makes as much use as possible of concepts which are close to experience but which, for this reason, have to give up, to a large degree, unity in the foundations. Heat, electricity and light are described by special variables of state and constants of matter other than the mechanical state; and to determine all of these variables in their relative dependence was a rather empirical task. Many contemporaries of Maxwell saw in such a manner of presentation the ultimate aim of physics, which they thought could be obtained purely inductively from experience on account of the relative closeness of the concepts used to the experience.’
At the same time, Einstein emphasizes that the transition to the micro level allows us to explain the processes at the macro level:
‘According to my belief, the greatest achievement of Newton’s mechanics lies in the fact that its consistent application has led beyond this phenomenological representation, particularly in the field of heat phenomena. This occurred in the kinetic theory of gases and, in a general way, in statistical mechanics. The former connected the equation of state of the ideal gases, viscosity, diffusion and heat conductivity of gases and radiometric phenomena of gases, and gave the logical connection of phenomena which, from the point of view of direct experience, had nothing whatever to do with one another. The latter gave a mechanical interpretation of the thermodynamic ideas and laws as well as the discovery of the limit of applicability of the notions and laws to the classical theory of heat. This kinetic theory which surpassed, by far, the phenomenological physics as regards the logical unity of its foundations, produced moreover definite values for the true magnitudes of atoms and molecules which resulted from several independent methods and were thus placed beyond the realm of reasonable doubt.’
Without a doubt, Einstein is correct in principle, and my discussion does not question the fundamental physical theories. Nevertheless, it is best not to rush when considering the relationship between theories at the macro and micro levels. It is better to examine more carefully the meaning of statements such as ‘reducing one theory to another’, ‘explaining one theory by means of another’, and so on.
In his book ‘The Nature of the Physical World’ [3], Arthur Eddington notes the symbolism of the scientific world and introduces an interesting metaphor: ‘The external world of physics has thus become a world of shadows.’ Eddington compares the micro and macro worlds, but he uses two tables as an example. One table belongs to everyday life, and the other to the fundamental theories of physics:
‘Table No. 2 is my scientific table. ... My scientific table is mostly emptiness. Sparsely scattered in that emptiness are numerous electric charges rushing about with great speed; but their combined bulk amounts to less than a billionth of the bulk of the table itself.’
The difference from Eddington’s original discussion is as follows. I believe that in his example, Eddington should have additionally introduced the table at the level of continuum mechanics. The theory of continuum mechanics, like the fundamental theories of physics, is written in the form of mathematical equations and, in this respect, is also symbolic. In this sense, a discussion is needed of how a world of shadows of continuum mechanics and a world of shadows of statistical mechanics are related to each other and to the burning candle.
In the first part of the paper, my interpretation of Eddington’s world of shadows is presented. The theory of physics is expressed by mathematical equations, and thus a special discussion of the connection between mathematics and the world through physics theory is required. The theory of physics is closely linked to the experiments being conducted, and this establishes a connection between a world of shadows and the real world. However, it is also possible to identify the level of experimental physics, which defines the scope of application of the theory in question. This approach allows for a different interpretation of Lakatos’s term research program in the case of the theory of physics.
Physical theory defines a research program in which the theory is assumed to be universally valid. Universality takes us beyond the limits of experimental physics, and the term extrapolationism is used to emphasize this. Reasonable extrapolationism is associated with the current level of development of experimental physics within a research program, when the concepts of physical theory contribute to the development of experimental research. The transition to a discussion of equations that does not influence the development of experimental research takes us beyond experimental physics, and the term radical extrapolationism is used to describe this situation.
The second part of the paper examines the relationships between theories of physics at different levels of organization using the metaphor of arrows of explanation from Steven Weinberg’s book ‘Dreams of a Final Theory’ [4]:
‘We search for universal truths about nature, and, when we find them, we attempt to explain them by showing how they can be deduced from deeper truths. Think of the space of scientific principles as being filled with arrows, pointing toward each principle and away from the others by which it is explained. These arrows of explanation have already revealed a remarkable pattern: they do not form separate disconnected clumps, representing independent sciences, and they do not wander aimlessly - rather they are all connected, and if followed backward they all seem to flow from a common starting point. This starting point, to which all explanations may be traced, is what I mean by a final theory.’
Thus, in the second part of the paper, the arrows of explanation between continuum mechanics and statistical mechanics are considered, and they are classified according to the proposed terminology as reasonable or radical extrapolationism.

Physics, Mathematics and World

The discussion of the levels of organization in candle burning is conducted from the perspective of physicists who consider the theories of physics alongside the experiments conducted. This allows us to reduce philosophical questions about the nature of reality to a minimum. With this approach, there is no doubt about the reality of physicists and the burning candle, on the one hand, and the reality of the theory of physics and the mathematical equations in it, on the other. Clearly, these two realities have different modes of existence, but this approach eliminates the need for a more detailed discussion of the meaning of the words ‘reality’ and ‘exist’.
It is proposed to treat Eddington’s world of shadows as conceptual models of physical objects based on mathematical equations of physical theory. When physicists discuss combustion processes, they talk about such objects. During these discussions, conceptual models are projected into the world. For example, a physicist points to a candle flame and talks about the temperature field, the concentration field, and the chemical reactions that occur during candle combustion. Such concepts present no particular problems during discussion of the real candle flame due to the clarity of visual images. On the other hand, there are concepts for which creating a visual image and connecting it to the world is much more difficult. For example, the change in entropy, enthalpy, and Gibbs energy during chemical reactions, or the discussion of phase space in statistical mechanics.
First, the relationship between physical quantities and the world during real experiments is examined. The close connection between physical theory and experiments is discussed. Theory introduces physical quantities and defines conceptual models of an ideal measuring device and an ideal experiment. The use of real devices necessitates corrections, as well as the presence of measurement errors.
Next, the level of experimental physics is introduced, which can be distinguished despite the close relationship between physical theory and experiments. This step is necessary for discussing situations where theory is applied far beyond the scope of the experiments conducted. Taking the development of kinetic theory in the 19th century as an example, the term extrapolationism is introduced. The research program for kinetic theory proved successful, but further developments in physics led to rejection of the basic assumption of this theory. Finally, a distinction is made between reasonable and radical extrapolationism, where the discussion goes far beyond experimental physics.

Theory and Experiment in Physics

At the end of the 19th century, Pierre Duhem examined the relationship between the mathematical equations of physics and experiments in his book ‘The Aim and Structure of Physical Theory’ [5], but after that this issue received almost no attention in the philosophy of physics. A return to this problem appeared in Bas van Fraassen’s book [6], where he introduced the problem of coordination. The essence of the problem is expressed by two joint questions about a physical quantity:
  • What counts as a measurement of (physical quantity) X?
  • What is (that physical quantity) X?
Physical theory answers the question of what a physical quantity is, and it also leads to a conceptual model of an ideal measuring instrument, which serves as the basis for constructing a real measuring instrument. I have used Duhem and van Fraassen’s approach to examine temperature [7] and thermodynamic properties [8]. Below are several quotes from Duhem’s book to illustrate this. First, the difference between an ideal and a real instrument:
‘Hence, when a physicist does an experiment, two very distinct representations of the instrument on which he is working fill his mind: one is the image of the concrete instrument that he manipulates in reality; the other is a schematic model of the same instrument, constructed with the aid of symbols supplied by theories; and it is on this ideal and symbolic instrument that he does his reasoning, and it is to it that he applies the laws and formulas of physics.’
Experimental measurements rely on metrology; that is, the introduction of standardized measurement scales. Instrument calibration procedures are developed to ensure the reproducibility of experiments conducted by different groups of physicists. The difference between a real and an ideal instrument necessitates the introduction of measurement corrections, which are related to the discrepancy between the conceptual model and the actual device [5]:
‘If an experiment in physics were merely the observation of a fact, it would be absurd to bring in corrections. ... The logical role of corrections, on the other hand, is very well understood when it is remembered that a physical experiment is not simply the observation of a group of facts but also the translation of these facts into a symbolic language with the aid of rules borrowed from physical theories. Indeed, a result of this is that the physicist constantly compares two instruments, the real one that he manipulates and the ideal, symbolic one on which he reasons.’
This approach can be extended to all properties of a substance in continuum mechanics, including the measurement of rate constants of chemical reactions. For statistical mechanics, there are new experiments to determine fundamental constants and conduct spectroscopic measurements. Duhem’s approach also applies to these experiments.
Duhem’s book draws attention to measurement errors. These include reading errors, systematic errors when no correction can be made, and random errors. Measurement errors are an important part of experiments. In rare cases, errors can be reduced to the level of thermal fluctuations, but typically experimental errors remain as factors that lie outside the underlying physical theory and cannot be explained by it.
It is important to note that increasing the accuracy of measurements leads to a shift in our understanding of the processes taking place. For example, Mars’s orbit appears elliptical only within certain measurement errors. As Poincaré correctly noted [9]:
‘Long ago it was said: If Tycho had had instruments ten times as precise, we would never have had a Kepler, or a Newton, or Astronomy. It is a misfortune for a science to be born too late, when the means of observation have become too perfect.’

Level of Experimental Physics

The previous section emphasized the importance of physical theory in organizing and conducting experiments. However, experiments are conducted in the real world, and their results are related to the behavior of real devices. As a result, the relationship between theoretical and experimental physics can be balanced by introducing a level of experimental physics. This refers to the scope of application of a theory, which is determined by the experiments performed and established metrology. For example, the successful introduction of standardization for measurements of a physical quantity demonstrates the validity of the corresponding physical theory within the scope of the measurements performed.
The philosophy of science discusses the incommensurability of theories after a scientific revolution and paradigm shift. Below are several examples to demonstrate that, in the case of established metrology, the existing measurements are preserved even after a scientific revolution. In a certain sense, the level of experimental physics remains invariant with respect to scientific revolutions.
Let me start with general considerations related to a burning candle outside the realm of physics. In the last quarter of the 18th century, the phlogiston theory was replaced by Lavoisier’s oxygen theory of combustion. A paradigm shift was evident, but it is impossible to say that proponents of different theories saw the burning candle differently. Both sides offered different explanations for the experiments, but they agreed on the descriptions of these experiments. Thus, proponents of different theories were able to engage in a common discussion of the observed phenomena and were able to translate the explanation from one theory to an alternative theory.
Let us now turn to calorimetry, which was developed within the framework of caloric theory. The arrival of thermodynamics (the mutual conversion of heat and work, the thermal equivalent of work) led to the rejection of caloric theory [10,11]. The change in physical theory led to a revision of the conceptual model of the experiments being conducted. This made it possible to explain experiments on the adiabatic expansion of gases (for example, Gay-Lussac’s 1807 experiment [10]), which could not be satisfactorily explained by caloric theory.
On the other hand, most experiments in calorimetry were conducted at constant pressure, and in this case, only the meaning of the measured quantity changed. Since heat ceased to be a function of state, the results of measurements in a calorimeter at constant pressure began to refer to the change in enthalpy. The experimental procedure itself remained the same, and the transfer of previous measurements was reduced to renaming the obtained results.
The case with calorimetry demonstrates the possibility of separating experimental physics from theoretical physics. Theory provides meaning to the experiments conducted, but it is also possible that experiments go beyond the expectations of the theory. For example, experiments on the adiabatic expansion of gases in the case of caloric theory demonstrated the need to expand or replace the original theory. At the same time, the old numerical values of the measurements simply receive a new interpretation within the framework of the new theory.
The next section examines the development of kinetic theory. Similarly, in this case, the level of experimental physics can be separated from theoretical physics. In this section, I only give the development of spectroscopy in the 19th century as an example [12]. The conceptual model for measuring a spectrum developed from simple ideas about the interaction of light and electromagnetic waves with substances. The theories of 19th-century physics, including kinetic theory, could not explain the measurements performed in spectroscopy. An explanation of the results (the Balmer series and the Rydberg formula) became possible only after the development of quantum mechanics. It should be noted that one of the goals of the development of quantum mechanics was precisely the search for an explanation for the spectroscopic experiments of the 19th century [13]. This once again emphasizes the relative independence of experimental physics from the development of physical theory.
In conclusion, consider another paradigmatic example of a scientific revolution: the transition from the geocentric to the heliocentric system. This transition did not affect existing measuring instruments or results on planetary positions. Proponents of the heliocentric system used the measurement results obtained by previous generations of astronomers. Thus, this radical shift in worldview did not affect the measurements of planetary positions made within the previous worldview; the old measurements simply obtained a new meaning. A similar situation occurred when Kepler’s laws were replaced by Newton’s laws. The introduction of mutual attraction between the planets formally results in a non-elliptical orbit of Mars, but the difference falls within the limits of measurement error (see Poincaré’s quote [9] in the previous section).
The examples above demonstrate the validity of identifying the level of experimental physics. In the case of established metrology, the predictions of new theories of physics should agree with old measurements within the measurement error. The history of physics also provides examples of experiments whose results went beyond the expectations of existing theories, thus demonstrating the need to expand or replace those theories.

Research Program as Extrapolationism

Physics theory provides meaning to experiments, and experimental physics defines the current scope of application of physics theory. At the same time, physics theory is given universality and is considered to hold true in all cases. This step is a part of the normal development of physics and, in this paper, is called extrapolationism. Imre Lakatos coined the term research program. From this perspective, a research program in physics can be interpreted as extrapolationism.
Physics theory provides a general framework and thus serves as a source of ideas for new experiments. New experiments expand the scope of the theory or necessitate reconsidering certain assumptions. At the same time, the development of metrology and new technologies expands the possibilities for new experiments and increases the accuracy of measurements.
Let us consider this idea using the development of kinetic theory in the 19th century as an example. After the proof of the conversion of heat into work and work into heat, and the rejection of caloric theory, a new explanation of heat was needed. To achieve this, it was proposed to use the laws of classical mechanics at the atomic level. This idea gained widespread acceptance after the work of Rudolf Clausius [11,14], although it had been proposed earlier. As a result, classical mechanics began to be used outside its scope, and the term extrapolationism vividly captures the essence of the development of kinetic theory.
The development of kinetic theory did not affect the experiments, the conceptual models of which remained within the framework of continuum mechanics. The goal of kinetic theory was to explain the results of these experiments based on atomic-molecular concepts. Thus, the research program consisted of developing the necessary mathematical formalism that would allow one to translate the laws of classical mechanics for moving atoms into experimental quantities in continuum mechanics.
Clausius demonstrated that the equation of state of the ideal gas corresponds to the absence of interactions between atoms, and that in this case, the gas temperature is related to the average kinetic energy of the atoms. This proved to be an important theoretical result for understanding the differences between the behavior of real gases and the ideal gas equation. Clausius’s conclusion fit well with the experimental results of the French physicist Henri Regnault, which demonstrated differences in the equations of state of real gases.
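In modern notation, these two results of Clausius read

\[ pV = N k_{\mathrm{B}} T, \qquad \tfrac{3}{2} k_{\mathrm{B}} T = \left\langle \tfrac{1}{2} m v^{2} \right\rangle, \]

where N is the number of non-interacting point particles and the angle brackets denote the ensemble average; the second relation identifies temperature with the mean kinetic energy of translational motion.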
James Maxwell, using molecular kinetic concepts, demonstrated that the viscosity of the ideal gas is independent of density [14]. This facilitated the correct interpretation of experiments measuring gas viscosity. After that, the concept of the mean free path, the distribution of atomic velocities, and other interesting results appeared. Ludwig Boltzmann’s work on the arrow of time using the time-symmetric equations of classical mechanics led to a new understanding of thermodynamic entropy [11,14].
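Maxwell’s result follows from the elementary mean-free-path estimate

\[ \eta \approx \tfrac{1}{3}\, n m \bar{v} \lambda, \qquad \lambda = \frac{1}{\sqrt{2}\, n \sigma}, \]

where n is the number density, m the molecular mass, \bar{v} the mean speed, and \sigma the collision cross-section: the density cancels, and the viscosity \eta \approx m \bar{v} / (3 \sqrt{2}\, \sigma) is independent of n.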
The development of the equipartition theorem led to theoretical values for the heat capacities of gases. However, there was a difference between the theoretical and experimental heat capacities that could not be resolved within kinetic theory. Thus, experiments measuring the heat capacities of gases pointed to inherent problems in the basic postulates of kinetic theory. In a famous lecture in 1900, Lord Kelvin called this the second cloud over classical physics [15].
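In modern notation, the equipartition theorem gives the molar heat capacity

\[ C_{V} = \tfrac{f}{2} R, \]

where f is the number of quadratic degrees of freedom. For a diatomic gas, counting translation (3), rotation (2), and vibration (2) gives C_V = (7/2)R, whereas the measured value near room temperature is close to (5/2)R; this is the discrepancy in question.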
The development of kinetic theory in the 19th century did not lead to the development of new experiments specific to this theory. Experimental physics in the 19th century remained within the framework of continuum mechanics, and therefore some physicists opposed kinetic theory. It is interesting to note that in 1883, Ernst Mach, on general grounds, doubted the universality of the laws of classical mechanics (The Science of Mechanics) [16]:
‘The view that makes mechanics the basis of the remaining branches of physics, and explains all physical phenomena by mechanical ideas, is in our judgment a prejudice. Knowledge, which is historically first, is not necessarily the foundation of all that is subsequently gained. As more and more facts are discovered and classified, entirely new ideas of general scope can be formed. We have no means of knowing, as yet, which of the physical phenomena go deepest, whether the mechanical phenomena are perhaps not the most superficial of all, or whether all do not go equally deep. Even in mechanics we no longer regard the oldest law, the law of the lever, as the foundation of all the other principles. The mechanical theory of nature is, undoubtedly, in an historical view, both intelligible and pardonable; and it may also, for a time, have been of much value. But, upon the whole, it is an artificial conception.’
Further developments proved Mach’s point. Classical mechanics proved unsuitable for describing motion at the level of atoms and molecules, and the atomism of the 19th century, in the form of kinetic theory, gave way to quantum mechanics. Nevertheless, the success of this research program must be acknowledged. Kinetic theory ultimately led to new experiments in the 20th century, such as those related to Brownian motion [17] and highly rarefied gases (the mean free path is greater than the characteristic size of the system). Furthermore, the introduction of probabilities in the development of the formalism of kinetic theory prepared physicists for probabilities in quantum mechanics [14]. However, in any case, the success of experimental research was associated with the rejection of the foundations that had been used in the development of kinetic theory.

Reasonable and Radical Extrapolationism

The development of kinetic theory provides a good example for discussing the interaction between theoretical and experimental physics. Physical theory, in this case classical mechanics, is assumed to be universally valid, and in my terminology, this is extrapolationism. This section introduces a gray boundary between reasonable extrapolationism, which supports the development of a research program, and radical extrapolationism, which lies far beyond the scope of experimental physics. For example, the discussion of Laplace’s demon lies outside the realm of experimental research.
The proposed boundary is fuzzy and changes over time, as experimental and computing techniques advance. Physics is a quantitative science, and as the systems in question in experimental studies become more complex, it is necessary to perform advanced calculations. This includes solving partial differential equations in continuum mechanics, solving the Schrödinger equation in quantum chemistry, and estimating the configuration integral in statistical mechanics. In this sense, the development of experimental physics is complemented by the development of efficient numerical algorithms for solving the necessary equations, the latter being connected to the advancement of available computing power.
Thus, the proposed boundary cannot be drawn a priori, but the history of physics provides grounds for a distinction a posteriori. It is important to note that not every extrapolationism in physics has led to the establishment of a successfully developing research program. Thus, in the 19th century, Laplace’s molecular mechanics and Kelvin’s vortex atom theory can be considered failed research programs. In the book by the Danish historian Helge Kragh, ‘Higher Speculations: Grand Theories and Failed Revolutions in Physics and Cosmology’ [18], many such cases from the history of physics up to the present day are reviewed. This in no way means that extrapolationism is useless, since there is no other way for physics to develop.
Nevertheless, this allows an a posteriori estimate of the proposed boundary: one monitors whether the theoretical promises of a given research program translate into the development of experimental research and computational algorithms. When no such development takes place, certain limits have been reached, and theoretical promises fail to materialize in the form of new experiments.
On the other hand, extrapolationism in a discussion can drift so far from the level of experimental physics that the very concept of a research program becomes impossible. This corresponds to considering the written equations ‘in principle’, when the ongoing discussion cannot be translated into the language of experimental research. For example, considering Laplace’s demon in any form cannot lead to the design of new experimental studies.

Arrows of Explanation

In the case of physicists discussing physical theories and experiments, considering the arrows of explanation is equivalent to examining the connections between conceptual models at different levels of organization. In this case, theories are compared with each other at the level of mathematical formalism. Within Eddington’s metaphor, a world of shadows of continuum mechanics and a world of shadows of statistical mechanics are both related to the burning of a real candle, but the discussion is limited to comparing the two worlds of shadows with each other.
This section examines several arrows of explanation. First, it examines the thriving research program of determining the properties of a substance from molecular constants. In this case, the equations of continuum mechanics remain unchanged, but it is assumed that the numerical values of the properties of a substance can be determined from atomic-molecular concepts. The success of statistical mechanics is largely due to this development. In some cases, first-principles calculations have been carried out with good accuracy, but in general, there is a hierarchy of approximations, which is connected to ongoing experimental research.
The next arrow of explanation is related to the research program of deriving the equations of continuum mechanics from the fundamental equations of quantum and statistical mechanics. In this case, success has been limited, and the connection to experimental research is rather indirect.
After that, concepts from the popular science literature are considered, where it is assumed that, in the burning candle example, the entire process can be represented as follows: the system moves from one state to the next according to the fundamental laws of physics. Such concepts translate into qualitative explanations, in which the emergence of new properties at the level of continuum mechanics is discussed. A look at the concept of emergence from the viewpoint of the mathematical formalism of theories at different levels of organization reveals the problematic nature of this concept.

Determining Properties of a Substance from Molecular Constants

In continuum mechanics, each substance requires a set of properties that must be determined experimentally. The theory defines a conceptual model of an ideal experiment, based on which the properties of a substance are measured in real experiments. These experimentally determined properties are then used to solve practical problems. Statistical mechanics offers a way to estimate the properties of a substance from its molecular properties and, in this sense, explains their relationship with molecular constants. This was the main point in Einstein’s quote [2] in the introduction. The use of atomic-molecular concepts allowed us to unify the various properties of a substance.
This arrow of explanation initially appeared in kinetic theory, where it was possible to relate the interaction potential between atoms to the equation of state. However, kinetic theory lacked conceptual models that could be used for the experimental study of interaction potentials. The interaction potentials must be known, but the theory was unable to provide experimental means for their determination.
In practice, interaction potentials could only be found from the values of macroproperties by solving the inverse problem. This led to the development of a corresponding research program: a functional form of the interaction potential with unknown parameters was chosen, and the parameters were then determined from experimental values of macroproperties by means of the inverse problem. A good description of the history of such research is given in Rowlinson’s book, ‘Cohesion: A Scientific History of Intermolecular Forces’ [19].
The development of quantum mechanics changed the situation, since the interaction potential could be determined by solving the electronic Schrödinger equation under the Born-Oppenheimer approximation. This led to a wave of enthusiasm among physicists; for example, in 1929, Paul Dirac wrote [20]:
‘The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble.’
Extrapolationism in this context means a transition to the universality of a solution after initial successes. Physicists’ optimism led to a successful research program of developing algorithms for the numerical solution of the electronic Schrödinger equation and thus to the creation of computational chemistry [21]. Reasonable extrapolationism in this case is associated with an understanding of the limits of what is possible in this field.
Estimating the properties of a substance involves two steps. The first is solving the Schrödinger equation and finding the interaction potential. This step relates to experimental spectroscopic studies, which allow us to verify the reliability of the calculations. The spectra and interaction potentials are used in the second step to estimate the properties of a substance. In general, the intermolecular interaction potential leads to the appearance of a configuration integral, but in the case of a polyatomic ideal gas, the intramolecular interaction potential alone is sufficient.
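For example, in the harmonic approximation each normal vibrational mode of frequency \nu contributes the factor

\[ z_{\mathrm{vib}} = \frac{1}{1 - e^{-h\nu / k_{\mathrm{B}} T}} \]

(with energies counted from the ground vibrational level) to the molecular partition function of the polyatomic ideal gas; in this way, frequencies obtained in the first step translate directly into thermodynamic properties in the second.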
For relatively simple systems, ab initio calculations can achieve an accuracy exceeding that of experiment [22], but for more complex substances, a hierarchy of approximations appears [23]. Ab initio calculations are replaced by semi-empirical methods, in which simplifications are introduced and, as a result, some quantities are determined from experiments. Next come molecular mechanics and molecular dynamics methods with empirical force fields, where, figuratively speaking, one approximation is adjusted by another as the system becomes more complex. The use of approximations is combined with parallel experimental studies, which allow us to select the appropriate level of approximation for solving a given problem.
Estimation of the configuration integral generally involves molecular dynamics and Monte Carlo simulations. A good example in this respect is the 2019 paper, ‘Thermophysical properties of the Lennard-Jones fluid: Database and data assessment’ [24]. This paper deals with a hypothetical substance consisting of point masses with the Lennard-Jones potential between them. This thought model is fully expressed by the mathematical equations of classical mechanics; therefore, it must correspond to a well-defined equation of state for gas and liquid. The paper compares the results of many studies to determine the equation of state and other thermodynamic properties:
‘The mutual agreement of these data sets is approximately ±1% for the vapor pressure, ±0.2% for the saturated liquid density, ±1% for the saturated vapor density, and ±0.75% for the enthalpy of vaporization – excluding the region close to the critical point.’
This is not a comparison with experimental results, but rather a comparison among the computer simulation results of different groups. Hence, it demonstrates the accuracy of existing numerical algorithms for a relatively simple system.
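To make the object of such comparisons concrete, below is a minimal sketch in Python of a Metropolis Monte Carlo simulation of the Lennard-Jones fluid in reduced units. It is not the methodology of [24]: the number of particles, density, temperature, and step size are arbitrary illustrative choices, and production codes add potential cutoffs with corrections, much larger systems, and careful error estimation.

# A minimal illustrative sketch: Metropolis Monte Carlo for the
# Lennard-Jones fluid in reduced units (epsilon = sigma = kB = 1),
# with periodic boundary conditions and no potential cutoff.
import numpy as np

rng = np.random.default_rng(0)

N, rho, T = 27, 0.5, 1.5        # particles (a perfect cube), density, temperature
L = (N / rho) ** (1.0 / 3.0)    # side length of the cubic simulation box

def pair_energy(i, pos):
    """Lennard-Jones energy of particle i with all others (minimum image)."""
    d = pos - pos[i]
    d -= L * np.round(d / L)    # minimum-image convention
    r2 = np.einsum('ij,ij->i', d, d)
    r2[i] = np.inf              # exclude self-interaction
    inv6 = 1.0 / r2 ** 3
    return np.sum(4.0 * (inv6 ** 2 - inv6))

# start from a simple cubic lattice to avoid initial overlaps
g = (np.arange(3) + 0.5) * (L / 3.0)
pos = np.array([[x, y, z] for x in g for y in g for z in g])

samples = []
for sweep in range(3000):
    for i in range(N):
        saved = pos[i].copy()
        old = pair_energy(i, pos)
        pos[i] = saved + rng.uniform(-0.15, 0.15, size=3)
        dE = pair_energy(i, pos) - old
        if dE > 0.0 and rng.random() >= np.exp(-dE / T):
            pos[i] = saved      # reject the trial move
    if sweep >= 1000:           # discard equilibration sweeps
        U = 0.5 * sum(pair_energy(i, pos) for i in range(N))
        samples.append(U / N)

print(f'mean potential energy per particle: {np.mean(samples):.3f}')

Even such a toy code illustrates the point of the comparison in [24]: once the pair potential is fixed, the thermodynamic properties of the hypothetical substance are well defined, and independent implementations must agree on them within statistical error.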
Thus, reasonable extrapolationism consists of establishing a realistic boundary for the current capabilities to estimate the properties of a substance from molecular constants, together with realistic forecasts for future development. No doubt, since Dirac’s statement, significant progress has been made in the development of numerical algorithms and in the advancement of computing power. Hence the boundary of what is possible shifts, but many open problems remain. First and foremost, the transition to more complex systems is blocked by the exponential growth of computing power requirements. It should also be noted that for some properties, such as rate constants, the very principles of ab initio calculation are still in their infancy.
Now let me return to Dirac’s statement in a literal sense. In this case, the claim that the theory is complete for absolutely all chemical systems is radical extrapolationism. Verifying this assertion requires calculations and comparison of the results with experiments. Without this, it is impossible to claim that all problems are due solely to a lack of computing power. However, such verification is impossible at present and in the foreseeable future. In other words, such an assertion goes far beyond experimental research and does nothing to advance it.

Deriving the Equations of Continuum Mechanics from Statistical Mechanics

Mathematics allows us to prove general theorems even in cases where equations cannot be solved, and thus new arrows of explanation appear. The task is to derive the equations of continuum mechanics from fundamental theories of physics in a general form. Interestingly, such a program constitutes the essence of Hilbert’s sixth problem [25], ‘Mathematical treatment of the axioms of physics’:
‘The investigations on the foundations of geometry suggest the problem: To treat in the same manner, by means of axioms, those physical sciences in which mathematics plays an important part; in the first rank are the theory of probabilities and mechanics.’
‘As to the axioms of the theory of probabilities, it seems to me desirable that their logical investigation should be accompanied by a rigorous and satisfactory development of the method of mean values in mathematical physics, and in particular in the kinetic theory of gases.’
‘Important investigations by physicists on the foundations of mechanics are at hand; I refer to the writings of Mach, Hertz, Boltzmann and Volkmann. It is therefore very desirable that the discussion of the foundations of mechanics be taken up by mathematicians also. Thus Boltzmann’s work on the principles of mechanics suggests the problem of developing mathematically the limiting processes, there merely indicated, which lead from the atomistic view to the laws of motion of continua.’
In equilibrium statistical mechanics, there is a derivation of the fundamental equation of classical thermodynamics [26]. It is used, among other things, to prove the relationship between changes in the Helmholtz energy and the partition function. This allows us to speak of a certain arrow of explanation of thermodynamics based on statistical mechanics. However, this derivation falls short of Hilbert’s ideal, since it uses a number of additional assumptions not contained in the original equations of classical mechanics: the equal a priori probability of microstates in the microcanonical ensemble, the postulate of the arrow of time, and the requirement to determine the numerical value of the Boltzmann constant from experiments [27]. Moreover, this derivation is limited to considering only the expansion work.
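In standard notation, the relationship mentioned above is

\[ F = -k_{\mathrm{B}} T \ln Z, \]

where Z is the canonical partition function; the remaining thermodynamic quantities follow by differentiation, for example S = -(\partial F / \partial T)_{V} and p = -(\partial F / \partial V)_{T}.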
The situation is significantly worse in the case of proving the Clausius inequality and finding the arrow of time in non-equilibrium statistical mechanics. A reasonable arrow of explanation exists only when considering a monatomic ideal gas in Boltzmann’s statistical interpretation of the second law. Gibbs’s generalization of entropy runs, in the general case, into the peculiarities of Liouville’s theorem: the entropy of a system formally remains constant during an irreversible process. Attempts to transfer Boltzmann’s ideas to Γ-space remain at the level of qualitative explanations [28].
There is no time in classical thermodynamics, but the equations of continuum mechanics explicitly include time, and they are time-asymmetric. Practical work in non-equilibrium statistical mechanics always contains additional postulates that lead to the arrow of time. The question of the arrow of time in statistical mechanics has been discussed for a long time, but a satisfactory solution remains elusive. Moreover, this discussion has no connection with experimental research.

Burning Candle at the Level of Fundamental Equations of Physics

Certain advances of statistical mechanics in estimating the properties of a substance from molecular constants lead to radical extrapolationism, in the form of assertions that the entire process of candle burning can be described directly at the level of fundamental theories of physics. Let me take Sean Carroll’s book, ‘From Eternity to Here: The Quest for the Ultimate Theory of Time’ [29] as an example. It states that the world passes from one state to the next according to the laws of physics. A couple of quotes from the book:
‘The laws of physics can be thought of as a machine that tells us, given what the world is like right now, what it will evolve into a moment later.’
‘That’s a standard way of thinking about the laws of physics ... You tell me what is going on in the world (say, the position and velocity of every single particle in the universe) at one moment of time, and the laws of physics are a black box that tells us what the world will evolve into just one moment later.’
Carroll speaks of the world, but, for the sake of clarity, I limit myself to the process of candle burning in an isolated system. Let us consider the concept of a local Laplace’s demon to describe such a system.
The development of numerical methods based on finite elements and finite volumes, coupled with increased computing power, has led to the creation of software that makes continuum mechanics accessible to engineers. The modeling process is conducted in a user-friendly graphical interface, where a 3D model is discretized using mesh generators, and thus the job is converted to a computational problem. A complete calculation of a burning candle at this level is already possible, although engineers prefer to break the problem down into smaller parts to find more efficient solutions to practical problems.
In this case, Carroll’s statement accurately conveys the essence of such software, but Carroll was referring to the fundamental laws of physics. At that level, there is a system of equations that cannot be solved in principle, since including so many particles in the analysis is unthinkable. Even writing out such a system of equations in full is impossible, let alone solving it. This type of discussion is an unmistakable sign of radical extrapolationism, in the spirit of the discussion of Laplace’s demon. In this form, the connection with experimental research is completely lost.
Moreover, there are serious problems even at the conceptual level, since it becomes unclear which laws should be used to estimate the transition of an isolated system with a burning candle from one state to the next. Carroll’s statement applies to classical statistical mechanics, which underlies numerical algorithms for molecular dynamics. However, this level of approximation is too crude. The process requires considering chemical reactions, but the conceptual problem can be illustrated by a much simpler problem: accounting for vibrational motions in polyatomic molecules.
As already mentioned, in the 19th century, the discrepancy between the experimental heat capacities of diatomic gases and the predictions of kinetic theory was one of the first signs of the inapplicability of classical mechanics to describing molecular motion. Including vibrational motion in the analysis led to an overestimated heat capacity of diatomic gases, and subsequent experiments showed that the heat capacity depends on temperature.
A figure from the Wikipedia article on the Morse potential [30] helps to better understand the problem of vibrational motion:

[Figure: the Morse potential (blue) compared with the harmonic approximation (green)]
It shows potential energy as a function of the distance between atoms. The green line represents the harmonic oscillator approximation, and the blue line represents the actual curve, for which increasing distance leads to dissociation of the molecule. Such a potential curve can be calculated by solving the electronic Schrödinger equation. Vibrational energy is discrete, and the difference between vibrational levels at room temperature exceeds the available thermal energy.
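For reference, the Morse potential shown in the figure has the standard form

\[ V(r) = D_{e} \left( 1 - e^{-a (r - r_{e})} \right)^{2}, \]

where D_e is the well depth, r_e the equilibrium bond distance, and a a width parameter; near r_e it reduces to the harmonic approximation V(r) \approx D_e a^2 (r - r_e)^2.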
As the temperature increases, a sufficient number of vibrational levels are excited, corresponding to the activation of a vibrational degree of freedom in classical theory. At low temperatures, practically all molecules are at the ground vibrational level, and vibrational motion is effectively switched off. This explains how the term “frozen degree of freedom” originated: the molecule has degrees of freedom that cease to be active at low temperatures. This behavior cannot be described within classical mechanics.
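A short calculation makes the freezing out quantitative. The sketch below uses the standard harmonic-oscillator expression for the vibrational heat capacity, C_vib/R = x² eˣ / (eˣ - 1)² with x = hν/k_B T, and takes the vibrational temperature of N2 (approximately 3395 K) as an illustrative choice:

# Vibrational heat capacity of a single harmonic mode, illustrating how
# the vibrational degree of freedom "freezes out" at low temperature.
import math

THETA_VIB = 3395.0  # vibrational temperature h*nu/kB of N2, in kelvin

def c_vib_over_R(T):
    """Vibrational heat capacity of one harmonic mode in units of R."""
    x = THETA_VIB / T
    if x > 700.0:                  # avoid overflow deep in the frozen regime
        return 0.0
    ex = math.exp(x)
    return x * x * ex / (ex - 1.0) ** 2

for T in (100, 300, 1000, 3000, 10000):
    print(f'T = {T:6d} K   C_vib/R = {c_vib_over_R(T):.3f}')

At room temperature the contribution is nearly zero (the degree of freedom is frozen), while at high temperature it approaches the classical equipartition value of R per mole.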
It is unclear which laws should be used to correctly account for the vibrational motions of molecules during the transition of the entire system from one state to the next. Quantum mechanics of vibrational states deals with the wave function, but it is unclear how to combine the vibrational wave function with classical mechanics so as to be consistent with Carroll’s description. Taking into account the evolution of the wave function of the entire system, such as a burning candle, makes the situation even worse, since it becomes unclear how to obtain the combustion process itself. This is one of the problems with quantum mechanics: it lacks a smooth transition from quantum-mechanical phenomena to classical ones.
As already mentioned, a hierarchy of approximations is used in practice, but in this case, Carroll’s coherent picture is disrupted. Selecting the correct level of approximation requires understanding the specific conditions of the problem in question. A return to experimental research is only possible by rejecting the language of radical extrapolationism.

Arrows of Explanation and Emergence

Let us consider emergence, which is often discussed when different levels of organization are at play. Typically, the topic of emergence arises in the qualitative discussion of the previous section about the transition of the world from one state to the next according to the laws of physics. In such a discussion, emergence concerns the macroproperties of a substance with respect to the properties of atoms, but the discussion is exclusively qualitative. It is argued that the properties of a substance must somehow emerge from the movement of atoms, and then the question is considered whether the new emergent entity at a higher level of organization can influence the behavior of a lower level.
Below is a brief overview of this problem within the framework adopted in this paper. Let us imagine physicists discussing the mathematical formalism of physical theories at the macro and micro levels. In this case, it becomes unclear how qualitative discussion of emergence can be connected to physical theories. Let us consider several examples.
Currently, there are debates in the philosophy of chemistry about the Born-Oppenheimer approximation as an example of emergence in chemistry [31]. The idea is that chemistry is based on a concept of molecular structure that cannot be found in the equations before the Born-Oppenheimer approximation is made. This discussion of the emergent nature of molecular structure raises many questions, since the statement ‘the Born-Oppenheimer approximation emerges’ does not seem meaningful. In this case, it is more accurate to say that there is no rigorous transition from the equations of fundamental physics to molecular structures in chemistry.
The relationship between the properties of a substance in continuum mechanics and molecular constants was discussed in the first section. Typically, the Born-Oppenheimer approximation is employed, but the relationship remains valid even without it. Dropping the approximation significantly complicates the calculations, making them currently practical only for extremely simple systems. Thus, from this perspective, the typical question arises again about the limits of applicability of the original equations in quantum mechanical calculations.
Similarly, it is impossible to speak about emergence in the case of the arrow of time in continuum mechanics. A more accurate statement is that certain elements of continuum mechanics cannot be found in statistical mechanics without additional assumptions. In other words, one should not say that the arrow of time emerges when considering mathematical equations; it’s more accurate to speak of the limitations of the mathematical proofs available.
The emergence of temperature should be considered in the same way. In equilibrium statistical mechanics, there is an equivalent of thermodynamic temperature in the form of a parameter of the canonical Gibbs ensemble distribution. In non-equilibrium statistical mechanics, there are states in which there is no temperature, but in these cases, there are relaxation processes that lead to the establishment of local thermal equilibrium. Thus, temperature is related to relaxation processes, not to emergence.
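In standard notation, temperature appears here as the parameter T of the canonical distribution

\[ p_{i} = \frac{e^{-E_{i}/k_{\mathrm{B}}T}}{Z}, \qquad Z = \sum_{i} e^{-E_{i}/k_{\mathrm{B}}T}, \]

which is defined only for an equilibrium ensemble; away from equilibrium there is, in general, no such parameter until relaxation establishes local thermal equilibrium.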

Discussion

There are interesting and useful results that connect theories of physics in continuum mechanics and at the atomic-molecular level. However, given the current state of physics, it is impossible to rigorously derive the equations of continuum mechanics from the equations of quantum and statistical mechanics. Hilbert’s sixth problem remains unsolved, and thus it is impossible to say that all fundamental questions have been resolved. To assert that these problems have been resolved in principle takes the discussion beyond the realm of physics as a natural science.
The proposed terminology allows us to more clearly characterize the current status of discussions about various levels of organization in modern physics. For example, examining the current status of the research program for determining macroproperties from molecular constants allows us to distinguish between reasonable and radical extrapolationism. If the discussion ultimately leads to the development and expansion of experimental or computational capabilities, then it belongs to reasonable extrapolationism within the current research program. Otherwise, there is a transition to radical extrapolationism, which further leads the discussion to one or another philosophical position. Such discussions are also useful, but in this case, a gap with experimental physics must be acknowledged.
The proposed boundary between reasonable and radical extrapolationism is not absolute, as it changes over time. Nevertheless, such a distinction allows for a more precise tracking of the development of physics and the dynamics of this development — what changes have occurred over the past ten, twenty, thirty, and so on years. This, in turn, allows for more meaningful predictions about future developments. Otherwise, discussion of the position that quantum mechanics already encompasses all of chemistry is reduced to the unconstructive dialogue below:
  • Pro: ‘The advances of quantum mechanics are limited solely by the lack of computing power.’
  • Con: ‘Will you be able to compute a real chemical system until the heat death of the universe?’
In conclusion, let me give an example where technological advances made interesting experiments at the mesoscale possible [32]. Describing them requires explicit consideration of fluctuations, which in turn led to the creation of the formalism of stochastic thermodynamics. New experimental results and the new formalism stimulated discussion of the relationship between continuum mechanics and statistical mechanics. Thus, experimental research at the mesoscale influenced the further development of statistical mechanics. This emphasizes the validity of considering the level of experimental physics and the need to include it in discussions in the philosophy of physics.

References

  1. Faraday, M. The Chemical History of a Candle; 1861.
  2. Einstein, A. Physics and Reality. Journal of the Franklin Institute 1936, 221(3), 349–382.
  3. Eddington, A.S. The Nature of the Physical World; 1928; Introduction.
  4. Weinberg, S. Dreams of a Final Theory: The Scientist’s Search for the Ultimate Laws of Nature; 1993.
  5. Duhem, P. The Aim and Structure of Physical Theory; 1954 (first published in French in 1906).
  6. van Fraassen, B.C. Scientific Representation: Paradoxes of Perspective; 2008; Part II: Windows, Engines, and Measurement.
  7. Rudnyi, E. The Problem of Coordination: Temperature as a Physical Quantity; preprint, 2025.
  8. Rudnyi, E. The Problem of Coordination: Entropy as a Physical Quantity in Classical Thermodynamics; preprint, 2025.
  9. Poincaré, H. Science and Hypothesis; 1905.
  10. Krichevskii, I.R. The Concept and Fundamentals of Thermodynamics; 1970 (in Russian).
  11. Gelfer, Y.M. History and Methodology of Thermodynamics and Statistical Physics, 2nd ed.; 1981 (in Russian).
  12. Kudryavtsev, P.S. History of Physics, Vol. 1: From Antiquity to Mendeleev; 1956 (in Russian).
  13. Arabatzis, T. Representing Electrons; 2005.
  14. Brush, S. The Kind of Motion We Call Heat; 1976.
  15. Kelvin, Lord. Nineteenth Century Clouds over the Dynamical Theory of Heat and Light. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 1901, 2(7), 1–40.
  16. Mach, E. The Science of Mechanics: A Critical and Historical Account of Its Development; 1919 (first edition in German, 1883).
  17. Smith, G.E.; Seth, R. Brownian Motion and Molecular Reality; 2020.
  18. Kragh, H. Higher Speculations: Grand Theories and Failed Revolutions in Physics and Cosmology; 2011.
  19. Rowlinson, J.S. Cohesion: A Scientific History of Intermolecular Forces; 2002.
  20. Dirac, P.A.M. Quantum Mechanics of Many-Electron Systems. Proceedings of the Royal Society of London A 1929, 123, 714–733.
  21. Kampouridis, S. Bytes as Test Tubes: The Emergence of Computational Quantum Chemistry. PhD dissertation, 2022.
  22. Garberoglio, G.; Gaiser, C.; Gavioso, R.M.; et al. Ab initio calculation of fluid properties for precision metrology. Journal of Physical and Chemical Reference Data 2023, 52(3).
  23. Leach, A.R. Molecular Modelling: Principles and Applications; 2009.
  24. Stephan, S.; Thol, M.; Vrabec, J.; Hasse, H. Thermophysical properties of the Lennard-Jones fluid: Database and data assessment. Journal of Chemical Information and Modeling 2019, 59(10), 4248–4265.
  25. Hilbert, D. Mathematical Problems. Bulletin of the American Mathematical Society 1902, 8(10), 437–479.
  26. Borshchevsky, A.Y. Physical Chemistry, Vol. 2: Statistical Thermodynamics; 2023 (in Russian).
  27. Nesterenko, V.V. The Role of Gibbs Ensembles in Statistical Thermodynamics; JINR-R-17-2009-6; Bogoliubov Laboratory of Theoretical Physics, 2009.
  28. Rudnyi, E. Entropy in Statistical Mechanics: Back to Boltzmann; preprint, 2026.
  29. Carroll, S. From Eternity to Here: The Quest for the Ultimate Theory of Time; 2010.
  30. Wikipedia, Morse potential; accessed 20 February 2026.
  31. Philosophical Perspectives in Quantum Chemistry; 2022.
  32. Seifert, U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Reports on Progress in Physics 2012, 75, 126001.