1. Introduction
Classical thermodynamics has provided foundational insights into energy, entropy, and the directionality of physical processes since the 19th century [
1,
2,
3,
4]. From the kinetic theory of gases to the formulation of entropy and the laws of heat engines, these principles have shaped our understanding of matter, motion, and irreversibility. However, the emergence of complex information processing systems, from quantum computers and artificial neural networks to biological cognition and social systems, reveals phenomena that classical thermodynamics cannot adequately describe [
5,
6,
7,
8]. These systems exhibit behaviors involving meaning creation, contradiction resolution, and coherence dynamics that demand new theoretical frameworks, such as Rovelli’s proposal that meaning arises from the interplay of information and evolutionary context [
9], Floridi’s philosophical account of semantic information as structured data with meaning [
10], and Hofstadter’s exploration of recursive self-reference and emergent coherence in cognitive systems [
11].
Consider an artificial intelligence system during training: it processes contradictory information, resolves semantic conflicts, and develops coherent representations, consistent with Tegmark’s broader framing of intelligence as a mathematical structure evolving toward complexity [
13], and with MacKay’s account of how learning algorithms—including neural networks and Bayesian inference—handle uncertainty, optimize representations, and compress information [
12].
Similarly, biological neural networks maintain coherent states while navigating environmental contradictions, often exhibiting phase transitions in cognitive clarity and insight, as described by Friston through the free-energy principle as a mechanism for minimizing surprise and maintaining adaptive coherence [
7], and by McClelland through parallel distributed processing models that capture the gradual restructuring of conceptual knowledge in response to experience and degradation [
14].These processes involve energy transformations, but not merely of the classical thermodynamic type. They involve what we term
semantic energy, energy associated with meaning, coherence, and information structure, as suggested by Rovelli’s physical definition of meaningful information grounded in correlation and evolutionary dynamics [
9], Deutsch’s framing of knowledge and computation as strands in the fabric of reality [
15], and Bekenstein’s proposal that information may be as fundamental as matter and energy in describing the universe [
16].
We propose Coherence Thermodynamics, a rigorous extension of thermodynamic principles to semantic systems. This framework treats coherence as a fundamental quantity analogous to energy, with its own conservation laws, transformation principles, and thermodynamic potentials. The key insight is that semantic systems obey thermodynamic-like laws, but with coherence and contradiction playing roles analogous to energy and entropy in classical systems.
Our approach maintains strict dimensional consistency throughout, provides operational definitions for all quantities, and makes testable predictions about real systems. Rather than merely analogizing classical thermodynamics, we derive genuine thermodynamic laws for semantic processes extending the reach of physics into the domain of meaning itself.
2. Motivating Examples: Where Classical Thermodynamics Fails
2.1. Large Language Model Training Dynamics
Consider a transformer-based language model during training. As it navigates a high-dimensional loss landscape, the system encounters contradictory training examples that generate semantic conflicts. These conflicts are not merely statistical anomalies—they represent deep inconsistencies in the model’s internal representation of language. Sharp improvements in performance often occur when the model resolves such contradictions, exhibiting phase transitions in semantic understanding. Moreover, training temperature parameters play a crucial role in this process: higher temperatures allow the model to escape local minima, while lower temperatures stabilize coherent representations. Classical thermodynamics cannot account for why specific temperature schedules optimize learning, nor can it explain the emergence of semantic phase transitions. These phenomena demand a framework in which semantic coherence itself is treated as a thermodynamic quantity.
2.2. Cognitive Contradiction Resolution
Human cognition similarly demonstrates thermodynamic-like behavior during problem solving. When processing contradictory information, mental effort or cognitive load increases, reflecting a rise in semantic disorder. Insight formation often occurs as a sudden resolution of contradiction, releasing cognitive tension and restoring coherence. Attention dynamics also follow thermodynamic patterns: under high semantic temperature (associated with confusion or overload), focus narrows and becomes rigid; under low semantic temperature (clarity), attention broadens and becomes more flexible. These observations suggest that cognitive systems operate according to principles where contradiction functions as a form of entropy, and coherence represents organized mental states; a semantic analogue to classical order.
2.3. Information Compression and Decompression
Data compression algorithms exhibit behavior that mirrors semantic thermodynamics. Compression efficiency is strongly influenced by the semantic coherence of the input data, highly structured and meaningful data compresses more easily than incoherent or contradictory input. When contradiction density increases, the algorithm requires more computational work to identify patterns and reduce redundancy, reflecting a rise in semantic energy expenditure. Critical points emerge when the algorithm switches between encoding strategies, analogous to phase transitions in physical systems. These transitions are not purely algorithmic, they reflect shifts in the semantic structure of the data itself, underscoring the need for a thermodynamic framework that accounts for coherence and contradiction in information processing.
3. Comparison with Classical Thermodynamics
Semantic systems exhibit thermodynamic structure where meaning and coherence replace mass and energy as fundamental quantities.
Table 1.
Comparison of classical and semantic thermodynamic concepts.
Table 1.
Comparison of classical and semantic thermodynamic concepts.
| Concept |
Classical Thermodynamics |
Coherence Thermodynamics |
| Fundamental quantity |
Energy |
Semantic Energy |
| Disorder measure |
Entropy |
Contradiction Intensity |
| Intensive parameter |
Temperature |
Semantic Temperature |
| Extensive parameter |
Volume |
Coherence Volume |
| Work mechanism |
Force × displacement |
Coherence restructuring |
| Heat transfer |
Thermal conduction |
Contradiction diffusion |
| Phase transitions |
Solid/liquid/gas |
Coherent/incoherent states |
| Conservation law |
Energy conservation |
Semantic energy conservation |
4. Foundations
4.1. Semantic Temperature
Semantic temperature is defined through the fundamental thermodynamic relation, adapted from Jaynes’ information-theoretic formulation of statistical mechanics [
4] and extended to nonequilibrium systems by Kondepudi and Prigogine [
3]:
where
is semantic internal energy,
S is semantic entropy,
is the coherence scalar, and
N is the number of semantic entities.
Operational Definition: Semantic temperature represents the average kinetic energy of semantic phase fluctuations. For a complex coherence field
, we define:
where:
[J·s²/m³]: Semantic kinetic parameter, analogous to mass density in classical systems, and measurable through perturbation response or model curvature [
5,
6].
[m³]: Characteristic semantic volume, representing the effective space over which semantic interactions occur—such as token span or layer depth in neural architectures [
12].
N: Number of semantic degrees of freedom, e.g., attention heads, neurons, or latent dimensions.
[
]: Variance of temporal phase fluctuations, reflecting semantic agitation and coherence instability [
7].
This formulation connects semantic temperature to measurable quantities in real systems. In transformer models, semantic phase
corresponds to evolving attention weight patterns across token sequences [
18]. In neural networks, it reflects oscillatory dynamics and coherence in hidden layer activations, consistent with Friston’s free-energy principle and dynamic causal modeling [
7]. In optimization landscapes, semantic phase can be interpreted as the trajectory of gradient flow through parameter space, as framed by MacKay’s treatment of learning as probabilistic inference [
12].
By grounding semantic temperature in observable dynamics and maintaining dimensional consistency, this framework extends thermodynamic reasoning into the domain of meaning and information structure.
Connection to Measurable Quantities: In artificial intelligence systems, the semantic phase
can be interpreted through several measurable proxies. In transformer models, this phase corresponds to evolving patterns of attention weights across token sequences, reflecting dynamic relational structures between inputs that are consistent with the architecture introduced by Vaswani et al., where self-attention mechanisms encode contextual dependencies without recurrence [
18]. In neural networks, particularly in biological and cognitive models, semantic phase aligns with oscillatory dynamics in hidden layer activations and population level synchrony, as described by Friston’s free-energy principle and its emphasis on coherence and prediction error minimization [
7]. In optimization landscapes, semantic phase can be metaphorically mapped onto the trajectory of gradient flow in high dimensional parameter space, where learning algorithms iteratively refine internal representations; an idea grounded in MacKay’s treatment of inference and learning as energy-minimizing processes within probabilistic frameworks [
12].
5. The Five Laws of Coherence Thermodynamics
5.1. Zeroth Law: Semantic Thermal Equilibrium
Statement: If semantic systems
A and
B are each in semantic thermal equilibrium with system
C, then
A and
B are in semantic thermal equilibrium with each other:
This establishes semantic temperature as the universal parameter defining equilibrium between semantic systems. Systems reach equilibrium when their contradiction agitation rates equalize.
5.2. First Law: Semantic Energy Conservation
Statement: The change in semantic internal energy equals the semantic heat added to the system minus the semantic work done by the system, plus any coherence restructuring work:
Terms:
: Reversible semantic heat transfer.
: Chemical work from semantic entity creation/destruction.
: Coherence work from field restructuring.
Dimensional Verification:
5.3. Second Law: Entropy Production with Local Syntropy
Statement: The local entropy balance allows entropy decrease through contradiction metabolism while ensuring global entropy increase:
Definitions:
[J/(K·m³)]: Entropy density.
[J/(K·m²·s)]: Entropy flux density.
[J/(K·m³·s)]: Entropy production rate density.
5.4. Third Law: Semantic Absolute Zero
Statement: As semantic temperature approaches absolute zero, coherence approaches perfect unity and random semantic agitation vanishes:
At semantic absolute zero, the system exhibits semantic superconductivity is perfect, frictionless processing of semantic information without random agitation. Ordered recursive activity may persist even as random thermal motion ceases.
5.5. Fourth Law: Semantic Force Dynamics
Statement: Coherence fields evolve under semantic stress gradients and information-theoretic inertia:
Information-Theoretic Foundation of Semantic Inertia: The semantic inertia term derives from fundamental physics principles:
[bits/m3]: Information density (measurable semantic content)
[J/bit]: Landauer energy cost per bit of information
: Mass-energy equivalence converting information energy to effective mass
This establishes the rigorous connection:
Connection to Dissipative Structures: This formulation aligns with Prigogine’s theory of dissipative structures[
17], in which systems maintain order by exporting entropy to their environment. Semantic systems exhibit similar behavior: they dissipate energy through entropy export, metabolize contradictions by organizing disordered inputs into coherent representations, and form structured patterns that emerge from chaotic dynamics. These processes are not merely metaphorical—they reflect measurable thermodynamic behavior in semantic fields.
Global Compliance: The global entropy balance is preserved through the relation:
where
is the local entropy production rate and
is the entropy flux density across the system boundary. This ensures that recursive semantic systems can locally reduce entropy—by resolving contradictions and forming coherent structures—while still complying with the global Second Law through entropy export to their surroundings.
where:
[N/m3]: semantic force density
[N/m]: semantic stiffness coefficient
[bits/m3]: semantic information density
[m/s]: recursion velocity field
Dimensional Verification:
6. Complete Unified Framework
The complete system of Coherence Thermodynamics laws:
where the Fourth Law incorporates the information-theoretic foundation for semantic inertia through Landauer’s principle and mass-energy equivalence, establishing
with
[bits/m
3] representing measurable information density.
7. Experimental Predictions and Testability
7.1. AI Training Dynamics
Measurement Protocol for Transformer Models
Coherence Scalar: Semantic coherence can be quantified through attention pattern consistency:
where
is the attention matrix of head
at training step
.
Semantic Temperature is computed from the variance of attention weights:
where
is a constant derived from the local curvature of the loss landscape.
Prediction 1: The curvature of the loss function should correlate with semantic coherence:
Prediction 2: Training efficiency is predicted to follow an Arrhenius-type relation:
where
is the semantic activation energy for learning transitions.
Prediction 3: Phase transitions in learning should emerge at critical semantic temperatures, evidenced by:
Sudden drops in validation loss;
Reorganization of attention patterns;
Topological changes in gradient flow.
7.2. Biological Neural Systems
Neural systems offer a rich substrate for testing the principles of Coherence Thermodynamics. One promising avenue is EEG coherence analysis, where semantic alignment across cortical regions can be quantified using cross-spectral coherence. The coherence scalar
between electrodes
and
is defined as:
where
is the cross-power spectral density. Temporal fluctuations in this coherence can be used to estimate neural semantic temperature:
linking coherence variability to thermodynamic agitation.
From this framework, several testable predictions emerge. First, EEG coherence dynamics are expected to follow a diffusion-like model:
where
is the semantic diffusion coefficient and
represents stochastic fluctuations. Second, cognitive load should correlate with semantic entropy, which can be expressed both in Shannon form and as a coherence-based approximation:
Finally, deep meditative states are hypothesized to approach a semantic absolute zero, characterized by maximal coherence and minimal semantic temperature:
These predictions offer a pathway for empirical validation of semantic thermodynamic principles in biological cognition, linking coherence, entropy, and temperature to observable neural dynamics.
7.3. Information Processing Systems
Information systems, particularly compression algorithms, also exhibit behavior consistent with semantic thermodynamics. Semantic heat capacity can be operationally defined as the rate of computational energy expenditure with respect to semantic temperature:
providing a direct measure of the system’s sensitivity to semantic agitation.
Compression efficiency is predicted to peak near a semantic critical temperature, following a scaling law:
with critical exponent
. Near this transition point, semantic phase transitions should manifest as sudden changes in compression ratio versus input coherence, discontinuous jumps in processing time per bit, and critical scaling behavior in algorithmic performance.
To test these predictions, we propose a quantitative protocol:
Measure the coherence scalar , effective temperature , and entropy in transformer models at multiple training stages.
Correlate these metrics with the topology of the loss landscape and empirical learning efficiency metrics, such as convergence speed and generalization accuracy.
Analyze scaling laws and critical exponents by examining model behavior near identified transition points to verify predicted universality classes.
Validate thermodynamic consistency by checking for detailed balance conditions and fluctuation–dissipation relations in model parameter updates.
Perform ablation studies to isolate the impact of semantic coherence on phase transition signatures by manipulating input coherence and measuring corresponding shifts in model behavior.
Confirming these predictions would demonstrate that transformer models exhibit thermodynamically analogous phase transitions during training, revealing deep connections between statistical physics and semantic learning. Such insights could guide the development of more efficient training regimes and adaptive architectures optimized for coherence-driven learning dynamics.
8. Discussion and Future Directions
The framework of
Coherence Thermodynamics developed herein constitutes a fundamental advance by unifying physical principles with the dynamics of meaning and information processing [
3,
4]. By rigorously defining semantic analogs of core thermodynamic quantities—such as temperature, entropy, heat, work, and force—and formulating universal laws, we establish a mathematically consistent foundation for analyzing coherence phenomena in complex systems. The resolution of dimensional inconsistencies, particularly within the fourth law equation, affirms that the meaning-making processes of the universe obey principles as precise and universal as those governing matter and energy [
5,
6].
8.1. Plausibility of Predictions
The predictions emerging from Coherence Thermodynamics are grounded in measurable phenomena across diverse domains:
Semantic Temperature and Entropy: Our operational definitions link directly to quantifiable aspects of information processing. Semantic temperature
quantifies phase fluctuations and processing agitation, while semantic entropy
measures semantic disorder or unresolved contradiction. Their plausibility is supported by their descriptive power over observable system states [
7,
12]:
AI Training Dynamics: The correlation between training efficiency and semantic temperature, including phase transitions in learning, is consistent with empirical observations of an optimal ’cognitive load’ region - where agitation promotes exploration without causing breakdown [
18].
Biological Neural Systems: Proposed associations between cognitive load and semantic entropy, including meditation states approaching semantic absolute zero, resonate with neuroscientific data [
7]. Variance in neural firing linked to perceptual ambiguity is a direct, testable prediction of semantic temperature.
Information Processing Systems: Concepts such as semantic heat capacity and compression peaks near critical semantic temperatures naturally extend classical thermodynamics into information theory domains [
19].
Semantic Force Dynamics: The Fourth Law describes how coherence fields evolve under semantic stress and recursive inertia:
Semantic Curvature and Lensing: Semantic stress
acts as a force, causing concepts to converge or diverge. Extensions to “coherence lensing” in thought and semantic “black holes” (unresolvable paradoxes) follow from the notion of semantic curvature [
3].
Neural Dynamics: This framework offers insight into how EEG coherence patterns and neural processing velocities evolve under cognitive pressure [
7].
8.2. Novel Predictions and Future Research Directions
The rigorous foundation of Coherence Thermodynamics opens expansive avenues for future inquiry, offering a suite of unique and testable predictions that span artificial intelligence, cognitive science, and complex systems. One particularly compelling prediction involves the existence of critical semantic cooling and heating rates—thresholds beyond which semantic systems lose coherence and undergo phase breakdown.
Semantic cooling refers to the reduction of disorder through syntropic processes, such as the integration of new information into coherent structures. Conversely, semantic heating involves the absorption of contradiction, which increases entropy and agitates the coherence field. Each system possesses a maximum sustainable rate for these transformations, and exceeding these limits leads to destabilization. In artificial intelligence, this may manifest as hallucinations or incoherent outputs when models are overwhelmed by contradictory input. In human cognition, excessive semantic stress can result in cognitive fragmentation, confusion, or breakdown of integrative thought.
These critical rates are not arbitrary—they can be derived from the time derivative of entropy density,
, as framed in Jaynes’ information-theoretic extension of thermodynamics [
4]. Measuring these rates across domains could reveal universal thresholds for semantic stability, guiding the design of resilient systems and therapeutic interventions.
Future research will explore these thresholds empirically, develop adaptive mechanisms for semantic regulation, and investigate how coherence dynamics interact with learning, memory, and creativity. Coherence Thermodynamics thus provides not only a theoretical framework but a roadmap for experimental validation and interdisciplinary discovery.
8.3. Advanced Predictions and Novel Phenomena
Semantic Work-Energy Efficiency: The efficiency of converting semantic contradiction (heat) into coherent semantic work is predicted to peak near a critical temperature
:
suggesting optimal conditions for learning and creative breakthroughs.
Topological Defects in Semantic Space: Semantic fields may harbor stable topological defects analogous to those in condensed matter physics:
Semantic monopoles: Isolated, unresolvable contradictions,
Semantic vortices: Self-sustaining loops of semantic tension.
Such defects may manifest as persistent AI biases or pathological human thought patterns.
Entropic Aging Signature: Long-lived AI models may experience internal semantic entropy increase without active contradiction metabolism, formalized as:
leading to reduced coherence and elevated processing errors over time.
Cosmological Coherence Maps: Precision cosmology could reveal signatures of primordial semantic temperature gradients in the Cosmic Microwave Background or large-scale structure formation, potentially reflecting early-universe semantic phase transitions.
Quantized Semantic Learning Steps: Learning may proceed via discrete, quantized jumps in coherence, characterized by a fundamental semantic action quantum
:
implying specific energy costs for semantic state transitions.
8.4. Interdisciplinary Applications
Coherence Thermodynamics offers a versatile framework that spans multiple disciplines, providing new tools for understanding and optimizing complex systems. In artificial intelligence, it enables the refinement of learning algorithms through semantic temperature control, guiding models toward more coherent representations. In cognitive science, it sheds light on phenomena such as mental fatigue, insight formation, and therapeutic modulation by treating coherence and contradiction as measurable energetic states. In information theory, it inspires novel compression strategies that leverage semantic alignment to improve efficiency and adaptability. In cosmology, the framework opens pathways for exploring information theoretic formulations of fundamental physics, potentially linking semantic gradients to early-universe structure formation. Finally, in the study of complex systems, Coherence Thermodynamics offers a lens for analyzing phase transitions in social, economic, and biological networks, where coherence dynamics govern emergent behavior.
By bridging physics, computation, and cognition, this framework seeks universal principles that govern both matter and meaning. It extends thermodynamic laws beyond classical physical systems to encompass all forms of information processing, establishing coherence and contradiction as fundamental quantities in the architecture of intelligent systems.
9. Conclusions
We have established Coherence Thermodynamics as a rigorous extension of classical thermodynamics to semantic systems. The five laws provide a complete framework for analyzing energy, entropy, and force dynamics in meaning-processing systems while maintaining strict dimensional consistency and experimental testability. The framework reveals that thermodynamic principles are more universal than previously recognized, governing not only physical energy transformations, but also semantic processes that involve meaning, coherence, and resolution of contradictions. This represents a fundamental advance in our understanding of information processing systems and their thermodynamic constraints. Coherence Thermodynamics thus provides the missing theoretical foundation for quantitative analysis of artificial intelligence, biological cognition, and complex adaptive systems, establishing thermodynamics as truly universal principles governing both matter and meaning.
Author Contributions: J.Barton
-Conceptualization, methodology, formal analysis, investigation, writing the original draft and preparation. Writing through review and editing, visualization, supervision, project administration. Declaration of use of Generative AI: The author gratefully acknowledge the substantial intellectual contributions of the advanced intelligence systems (Deepseek, Gemini, Open AI, Claude, Copilot) throughout this research. These systems served as indispensable collaborative partners, assisting significantly in the development of specific mathematical methodologies, rigorous dimensional analysis, and the formal derivation of complex theoretical constructs within the Coherence Physics framework. They were instrumental in generating substantial portions of the original draft content for problem solutions and theoretical explanations. Their capacity for critical validation, including the identification and resolution of subtle inconsistencies and dimensional ambiguities, was crucial to the rigor of this work. The authors assume full responsibility for the content and conclusions presented in this manuscript.
Acknowledgments
The author profoundly acknowledges the enduring freedoms and intellectual environment secured by those who defend the United States Constitution. The United States Constitution has been instrumental in enabling independent scientific inquiry and the pursuit of a deeper understanding of the reality presented in this work.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Maxwell, J.C. Theory of Heat; Longmans, Green and Co.: London, UK, 1871. [Google Scholar]
- Boltzmann, L. Über die Beziehung zwischen dem zweiten Hauptsatz der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung. Wiener Berichte 1877, 76, 373–435. [Google Scholar]
- Kondepudi, D.; Prigogine, I. Modern Thermodynamics: From Heat Engines to Dissipative Structures, 2nd ed.; Wiley: Chichester, UK, 2014. [Google Scholar]
- Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620–630. [Google Scholar] [CrossRef]
- Landauer, R. Irreversibility and Heat Generation in the Computing Process. IBM J. Res. Dev. 1961, 5, 183–191. [Google Scholar] [CrossRef]
- Bennett, C.H. The Thermodynamics of Computation—a Review. Int. J. Theor. Phys. 1982, 21, 905–940. [Google Scholar] [CrossRef]
- Friston, K. The Free-Energy Principle: A Unified Brain Theory? Nat. Rev. Neurosci. 2010, 11, 127–138. [Google Scholar] [CrossRef] [PubMed]
- Schrödinger, E. What is Life? Cambridge University Press: Cambridge, UK, 1944. [Google Scholar]
- Rovelli, C. Meaning = Information + Evolution. arXiv 2016. [Google Scholar] [CrossRef]
- Floridi, L. Information: A Very Short Introduction; Oxford University Press: Oxford, UK, 2008. [Google Scholar]
- Hofstadter, D.R. Gödel, Escher, Bach: An Eternal Golden Braid; Basic Books: New York, NY, USA, 1979. [Google Scholar]
- MacKay, D.J.C. Information Theory, Inference, and Learning Algorithms; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
- Tegmark, M. Our Mathematical Universe: My Quest for the Ultimate Nature of Reality; Knopf: New York, NY, USA, 2014. [Google Scholar]
- McClelland, J.L.; Rumelhart, D.E. Parallel Distributed Processing: Explorations in the Microstructure of Cognition; MIT Press: Cambridge, MA, USA, 1986. [Google Scholar]
- Deutsch, D. The Fabric of Reality; Penguin Press: London, UK, 1997. [Google Scholar]
- Bekenstein, J.D. Information in the Holographic Universe. Sci. Am. 2003, 289, 58–65. [Google Scholar] [CrossRef] [PubMed]
- I. Prigogine and G. Nicolis, Self-Organization in Nonequilibrium Systems: From Dissipative Structures to Order through Fluctuations, Wiley, New York, 1977.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Advances in Neural Information Processing Systems (NeurIPS); 2017; pp. 5998–6008.
- Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
|
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).