Introduction:
Entropy, as traditionally defined in statistical mechanics, is a measure of disorder based on the logarithm of the number of accessible microstates. While this formulation has been foundational in thermodynamics, it often lacks the resolution and intuitiveness needed to describe the step-by-step evolution of entropy in physical systems. The classical approach focuses primarily on macroscopic end states and provides limited insight into the irreversible dynamics of spontaneous entropy increase.
In contrast, the proposed energy product entropy — or multiplicative entropy — offers a fundamentally new perspective. By defining entropy as the product of energy values across all system units, this approach explicitly tracks the microscopic redistribution of energy at each stage of evolution. It provides a high-resolution, path-dependent description of entropy growth, naturally encoding time’s arrow through local energy flows. Furthermore, this formulation supports the development of analytic quantum thermodynamics, enabling precise simulation and mapping of entropy changes even at the Planck time scale.
This paper demonstrates how multiplicative entropy overcomes the limitations of traditional entropy measures and opens new avenues for understanding thermodynamic processes from a dynamic, computable perspective at the quantum scale.
Multiplicative Entropy (Energy Product Entropy): A New Analytical Framework for Understanding Irreversible Entropy Increase
If we adopt the concept of multiplicative entropy, it becomes much easier to understand why entropy increases spontaneously and irreversibly under the constraints of the second law of thermodynamics.
1. Definition of Multiplicative Entropy (Energy Product Entropy)
I define multiplicative entropy as the product of the energies carried by each unit in a closed system. This definition of entropy is computable and can also serve as an entropy coordinate for simulating the evolution of physical systems. When the logarithm is taken, this multiplicative form reduces to the traditional additive entropy expression used in statistical mechanics.
Under the constraint of the second law of thermodynamics, energy always flows from units with higher energy to those with lower energy in a closed system. The total energy of all units remains constant (energy conservation), but the product of their energies keeps increasing — that is, the more uniform the energy distribution becomes, the larger the value of the multiplicative entropy.
When all units carry exactly the same amount of energy, the system reaches its maximum possible entropy — a state known as heat death.
This definition provides a much clearer and more intuitive description of entropy increase than the classical statistical formulation.
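As a concrete illustration of this computability, the following minimal Python sketch (my own illustration; the function names are arbitrary) evaluates the multiplicative entropy of a list of unit energies together with its logarithmic, additive counterpart:

```python
from math import prod, log

def multiplicative_entropy(energies):
    """Multiplicative entropy: the product of the energies of all units."""
    return prod(energies)

def additive_entropy(energies):
    """Logarithm of the product, i.e. the traditional additive form."""
    return sum(log(m) for m in energies)

# Two distributions with the same total energy (12): the more uniform one
# has the larger product, i.e. the higher multiplicative entropy.
print(multiplicative_entropy([3, 1, 5, 3]))   # 45
print(multiplicative_entropy([3, 3, 3, 3]))   # 81
```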
2.1. Quantum Thermodynamics Perspective
From the perspective of quantum thermodynamics, if we assume that the quantum is the fundamental building block of space, then within the framework of a quantized space network all states become analytic rather than statistical, in particular at the Planck time scale.
Let’s revisit the rule defined by the second law of thermodynamics: energy only flows from high-energy regions to low-energy ones. In essence, the process described by the second law — entropy increase — is a process of energy homogenization.
2.2. Comparison Between Multiplicative Entropy and Classical Statistical Entropy
| Dimension | Multiplicative Entropy | Traditional Statistical Entropy |
| --- | --- | --- |
| Process Explicitness | ✅ Yes, every step has a clear change | No, only describes macroscopic end states |
| Preserves Microscopic Details | ✅ Yes, path-dependent | No, only concerns probability distributions |
| Temporal Directionality | ✅ Yes, defined by local energy flow | No, requires an extra assumption for the arrow of time |
| Suitable for Simulations | ✅ Yes, well suited to numerical modeling | No, mainly used for theoretical analysis |
| Analytical Functional Form | ✅ Yes, not based on statistical averaging | No, relies on ensemble averages |
Why Is Multiplicative Entropy More Suitable for Describing Energy Homogenization?
Consider, for example, a system composed of N basic quanta. According to the second law, energy transfers from high-energy quanta to low-energy ones, gradually making the overall energy distribution more uniform.
Assume each quantum carries energy mᵢ. At any given transformation state, the total energy of the system is:
∑ mᵢ = constant, which expresses energy conservation in a closed system.
How do we characterize the entropy change during this evolution? My proposed multiplicative entropy offers a more intuitive and precise analytical tool:
The entropy S of the system at a certain transformation state is defined as:
S = ∏ mᵢ
Under the constraint of energy conservation and directional energy transfer (from high to low), the more uniform the energy distribution becomes, the larger the product becomes.
When all quanta carry equal energy, the system reaches its maximum entropy — heat death occurs.
After applying the logarithm, this formula transforms into the classical additive entropy form. However, I argue that the multiplicative form more accurately captures the spontaneous and irreversible nature of entropy increase, offering greater clarity and visual intuition.
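Written out explicitly (a standard identity, included here only for completeness), the logarithmic relation reads

\[
\ln S = \ln \prod_{i=1}^{N} m_i = \sum_{i=1}^{N} \ln m_i ,
\]

so maximizing the product under the constraint ∑ mᵢ = constant is equivalent to maximizing the familiar additive (logarithmic) expression.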
2.3. Computable Definition of Multiplicative Entropy: Time–Entropy Mapping
In this entropy definition, the entropy value of a closed system at a given moment (i.e., during a specific state transition) is calculated as the product of the energy norms of all space elementary quanta (SEQ) involved in the spatial transformation at that moment.
3. Analytical Formulation of Entropy
3.1. Entropy of the closed system: S = ∏mᵢ, i = 1, …, N ①
Total energy of the closed system: E = ∑mᵢ = constant, i = 1, …, N ②
3.2. Maximum entropy: S ≤ Sₘₐₓ = (E/N)ᴺ, attained when all mᵢ are equal or differ only by one quantum h
(Here mᵢ denotes the energy carried by the i-th SEQ during a single transformation of the closed system; each energy value is an integer multiple of Planck's constant h, mᵢ = nᵢh, nᵢ ∈ ℕ.)
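The bound in 3.2 follows directly from the inequality of arithmetic and geometric means (a standard result, restated here in the notation of this section):

\[
S = \prod_{i=1}^{N} m_i \le \left(\frac{1}{N}\sum_{i=1}^{N} m_i\right)^{N} = \left(\frac{E}{N}\right)^{N},
\]

with equality exactly when all mᵢ are equal; for discrete energies mᵢ = nᵢh, the attainable maximum corresponds to values that are equal or differ by at most one quantum h.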
3.3. Energy transfer rules and triggering conditions:
Energy exchange occurs between adjacent SEQ (i, j) if and only if the thermodynamic gradient mᵢ > mⱼ + h exists. Energy is transferred only in discrete quanta of Planck's constant h: mᵢ → mᵢ − h, mⱼ → mⱼ + h.
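A minimal Python sketch of this triggering condition, with units chosen so that h = 1 (an illustrative convention of mine, not part of the formal model):

```python
H = 1  # one quantum of energy in these illustrative units

def transfer(m, i, j):
    """Move one quantum from SEQ i to adjacent SEQ j if the gradient allows it."""
    if m[i] > m[j] + H:                 # triggering condition: m_i > m_j + h
        m[i], m[j] = m[i] - H, m[j] + H
        return True
    return False                        # gradient too small: no transfer

m = [3, 1, 5, 3]
print(transfer(m, 2, 3), m)  # True [3, 1, 4, 4], the first step of path A in Table 1 below
```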
Numerical Example: System States and Entropy Evolution
Table 1. Simplified Entropy Increase Demonstration.
| System State | SEQ Energy Distribution (∑mᵢ = 12) | Entropy S = ∏mᵢ | Remarks |
| --- | --- | --- | --- |
| Initial non-equilibrium state | [3, 1, 5, 3] | 45 | – |
| Intermediate state | Path A: [3, 1, 4, 4]; Path B: [3, 2, 4, 3] | A: 48; B: 72 | – |
| Final state | Path A: [3, 2, 3, 4]; Path B: [3, 3, 3, 3] | A: 72; B: 81 | Because energy moves only between adjacent SEQ in minimal quanta h, the system cannot reach maximum entropy along path A |
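The energy and entropy values in Table 1 can be checked directly with a few lines of Python (again taking h = 1):

```python
from math import prod

states = {
    "initial":        [3, 1, 5, 3],
    "intermediate A": [3, 1, 4, 4],
    "intermediate B": [3, 2, 4, 3],
    "final A":        [3, 2, 3, 4],
    "final B":        [3, 3, 3, 3],
}
for name, m in states.items():
    # Every state conserves the total energy (12) while the product grows along each path.
    print(f"{name:15s} E = {sum(m):2d}  S = {prod(m)}")
```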
3.4. Logarithmic Relation:
After logarithmic transformation, lnS aligns with the conventional Boltzmann entropy form, while the multiplicative formulation naturally suits discrete systems.
3.5. Proof of Spontaneous Entropy Increase
Spontaneity Theorem of entropy increase (Second Law of Thermodynamics):
For every possible energy transfer process, the total entropy change satisfies ΔS≥0.
Proof Outline: Let the pre-transfer energies be mᵢ = a and mⱼ = b, with a > b + h. After the transfer, mᵢ = a − h and mⱼ = b + h, while all other energies are unchanged. Writing Sₜ₁ for the pre-transfer entropy and Sₜ₂ for the post-transfer entropy, the ratio is:
Sₜ₂/Sₜ₁ = (a − h)(b + h)/(ab) = 1 + h(a − b − h)/(ab) > 1, since a − b − h > 0.
Every allowed transfer therefore strictly increases S; ΔS = 0 holds only when no transfer is possible, so ΔS ≥ 0 throughout.
This demonstrates how entropy increases along various paths, reflecting the irreversibility and path dependence of thermodynamic processes in real systems.
Moreover, each system state can be assigned a unique entropy value — an "entropy coordinate" — which provides a powerful tool for computer simulations at the quantum level.
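As a sketch of how such an entropy coordinate might be used in a simulation, the following Python fragment performs a random walk over allowed transfers and records the entropy coordinate after every step; the random selection among allowed moves and the one-dimensional chain of neighbours are my own simplifying assumptions, not part of the model. By the argument in 3.5, the recorded sequence is strictly increasing until the system gets stuck:

```python
import random
from math import prod

H = 1  # one quantum, in illustrative units

def allowed_moves(m):
    """All ordered adjacent pairs (i, j) with m[i] > m[j] + H."""
    moves = []
    for i in range(len(m) - 1):
        if m[i] > m[i + 1] + H:
            moves.append((i, i + 1))
        if m[i + 1] > m[i] + H:
            moves.append((i + 1, i))
    return moves

def simulate(m, seed=0):
    """Random walk over allowed transfers, recording the entropy coordinate."""
    rng = random.Random(seed)
    path = [prod(m)]
    while (moves := allowed_moves(m)):
        i, j = rng.choice(moves)        # pick any allowed transfer
        m[i], m[j] = m[i] - H, m[j] + H
        path.append(prod(m))            # entropy coordinate after this step
    return path

print(simulate([3, 1, 5, 3]))  # a strictly increasing sequence of entropy values
```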
Novelty of the Concept
After extensive searches of general search engines and academic databases, I have found that the analytical, energy-product entropy expression proposed here offers a fundamentally new alternative outside the traditional statistical framework. This approach transforms entropy from a statistical concept into an analytic one, and to my knowledge no similar formulation has been previously proposed.
Based on this idea, I introduce the concept of Analytic Quantum Thermodynamics, which allows us to better understand the essence of thermodynamic processes: essentially, they are processes of energy homogenization. The multiplicative entropy introduced here is a natural tool for describing such processes.
☑ Distinction from Existing “Multiplicative Entropy” Concepts
Although the term "multiplicative entropy" or "multiplicative form of entropy" does appear in the literature, these usages are still variations of traditional statistical entropy and do not depart from the following frameworks:
5.1. Statistical Approach Based on Number of States
For example, Boltzmann entropy (S = k_B ln W) or Shannon entropy (H = −∑ pᵢ log pᵢ).
These so-called “multiplicative forms” often involve multiplying entropies of subsystems or using multiplicative techniques in mathematical derivations.
But fundamentally, they remain logarithmic functions over probability distributions.
5.2. No Microscopic Path Recording Function
They describe only macroscopic final-state differences, without retaining intermediate processes.
They do not capture how each individual energy transfer affects entropy.
☑ Unique Features of This Multiplicative Entropy Theory
| Feature | This Multiplicative Entropy | Existing "Multiplicative Entropy" |
| --- | --- | --- |
| Tracks Evolutionary Paths | ✅ Yes, with clear changes at each step | No, only describes macroscopic end states |
| Preserves Microscopic Details | ✅ Yes, path-dependent | No, focuses only on probability distributions |
| Temporal Directionality | ✅ Yes, defined by local energy flow | No, requires an extra assumption for the direction of time |
| Suitable for Simulations | ✅ Yes, ideal for numerical modeling | No, mainly used for theoretical analysis |
| Analytical Functional Form | ✅ Yes, not based on statistical averaging | No, relies on ensemble averages |
☑ Why This Is a “New Discovery”
This model proposes a new way of constructing physical quantities, assigning them clear physical meaning and evolutionary rules:
It does not start from probability theory, but from the dynamical process of energy redistribution.
It naturally captures the trend toward homogenization through the product of energy values.
It provides a computable entropy coordinate, ideal for simulation, prediction, and visualization of system evolution.
It remains analytically valid at the Planck scale, suggesting natural extension into quantum thermodynamics.
This study presents an updated extension of a portion of the author's prior work [3]. For further details of the model, please refer to the previous paper, which presents the complete framework.