Preprint
Brief Report

This version is not peer-reviewed.

Circuit Theory as a Unifying Framework for Energy and Information

Submitted: 25 August 2025
Posted: 26 August 2025


Abstract
This paper explores the physical relationship between energy and information through the lens of circuits and systems theory. It integrates classical circuit equations with Landauer’s Principle to establish a framework for modelling computation as a dynamic, energy-aware process. By introducing new conceptual mappings, such as the equivalence between voltage and information energy and a proposed resistance-like quantity for bitrate, we aim to bridge digital logic and energy systems design.

1. Introduction

The increasing demand for digital computation has made energy consumption a central concern in system design [1,2,3]. Modern computational systems are now deeply embedded across all sectors of infrastructure, from scientific research and industrial automation to real-time data processing and everyday consumer technologies. Their energy demands have therefore become both substantial and systemic. Where digital processes were once peripheral to energy planning, they now account for a significant and rapidly growing share of global electricity consumption. This trend not only underscores the scale of the challenge but also signals the urgency of rethinking how computation interacts with energy systems.
Despite its growing impact, computation is still modelled as a passive load in most energy frameworks. While recent innovations in energy-aware computing, workload scheduling, and hardware efficiency have made important progress, a deeper scientific question remains unresolved: how can computational processes be formally and physically integrated into the operational logic of electrical systems? In other words, a unified theory that fully explains the relationship between energy and information remains largely undeveloped—both in conceptual clarity and practical application. Current models typically treat computation as a fixed or externally imposed load, abstracting away the internal dynamics of digital processes. As a result, system-level analysis often fails to capture key aspects of computational energy behaviour. Computational demand is shaped by interactions among software, hardware, and environmental conditions, making it far more dynamic than traditional energy models suggest.
This paper proposes a circuit-based approach to understanding the energetic cost of information processing, grounded in thermodynamic principles and electrical laws. It also introduces novel mappings between electrical and informational quantities, offering a foundation for energy-aware computing.

2. Basic Relations in Electrical Circuits

The foundational relationships among electrical quantities are essential for analysing energy behaviour in computational systems. These principles, rooted in classical circuit theory [4], provide the basis for linking physical energy flows to digital processes.
Electric current I quantifies the rate at which electric charge flows through a conductor:
$$ I = \frac{dQ}{dt} $$
where Q is the electric charge in Coulombs and t is time in seconds; current is given in Amperes (A). Voltage V measures the energy available per unit of charge:
$$ V = \frac{dW}{dQ} $$
where W is energy in Joules.
Electric power P represents the rate of energy transfer or conversion in a system:
$$ P = \frac{dW}{dt} $$
Energy is typically expressed in Joules (J) or Watt-hours (Wh), with the conversion $1\ \mathrm{Wh} = 3600\ \mathrm{J}$. Finally, resistance R characterizes a material’s opposition to current flow and is determined by its physical properties:
$$ R = \frac{\rho l}{A} $$
where ρ is the material’s resistivity, l is the length of the conductor, and A is its cross-sectional area. The unit of resistance is the Ohm (Ω).
These equations form the analytical backbone for modelling energy consumption in electronic and computational systems, enabling deeper integration of physical laws into digital design.
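To make these relations concrete, the following short Python sketch evaluates them numerically. The copper resistivity and trace geometry used here are illustrative assumptions, not values taken from the text.

```python
# Minimal numerical sketch of the basic circuit relations above.
# The material and geometry values are illustrative assumptions.

RHO_COPPER = 1.68e-8  # resistivity of copper at room temperature, Ohm*m (assumed)

def resistance(rho: float, length: float, area: float) -> float:
    """R = rho * l / A, in Ohms."""
    return rho * length / area

def power(voltage: float, current: float) -> float:
    """P = V * I = (dW/dQ) * (dQ/dt) = dW/dt, in Watts."""
    return voltage * current

def wh_to_joules(wh: float) -> float:
    """Convert Watt-hours to Joules: 1 Wh = 3600 J."""
    return wh * 3600.0

# A 10 cm copper trace with a 1 mm^2 cross-section:
print(resistance(RHO_COPPER, length=0.10, area=1e-6))  # ~0.00168 Ohm
print(power(5.0, 0.2))                                 # 1.0 W at 5 V and 200 mA
print(wh_to_joules(1.0))                               # 3600.0 J
```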

3. Landauer’s Principle and Information Energy

Landauer’s Principle establishes a fundamental thermodynamic limit on the energy cost of information processing. Specifically, it states that the erasure of a single bit of information in a computational system must dissipate a minimum amount of energy as heat [5]:
$$ E_{\min} = kT \ln 2 \approx 2.871 \times 10^{-21}\ \mathrm{J} $$
Here, $k = 1.380649 \times 10^{-23}\ \mathrm{J/K}$ is Boltzmann’s constant, and $T = 300\ \mathrm{K}$ represents room temperature. This principle highlights the physical nature of information: every logical operation that irreversibly alters data, such as resetting a memory cell, has an unavoidable energy cost. As computational systems scale and energy efficiency becomes critical, Landauer’s limit serves as a benchmark for evaluating the thermodynamic efficiency of digital architectures.
To quantify the computational capacity of energy, we define the parameter κ, which represents the maximum number of bits that can be irreversibly erased using 1 joule of energy:
$$ \kappa = \frac{1\ \mathrm{J}}{E_{\min}} = \frac{1}{2.871 \times 10^{-21}} \approx 3.48 \times 10^{20}\ \mathrm{bits} $$
This value provides a theoretical upper bound on information throughput per unit of energy. It is a critical metric for energy-aware computing, as it links thermodynamic constraints to digital performance. Understanding κ allows system designers to assess how close a given architecture operates to the fundamental limits of energy efficiency, guiding innovations in low-power hardware, reversible computing, and sustainable information processing.
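These figures can be checked directly from the constants given above with a few lines of Python:

```python
import math

# Landauer's limit at room temperature and the derived parameter kappa,
# the maximum number of bits erasable per joule (Section 3).
K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K

e_min = K_B * T * math.log(2)  # minimum energy to erase one bit
kappa = 1.0 / e_min            # bits erasable per joule

print(f"E_min = {e_min:.3e} J")       # ~2.871e-21 J
print(f"kappa = {kappa:.2e} bits/J")  # ~3.48e20 bits
```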

4. Energy and Information

As digital systems continue to scale in complexity and energy demand, understanding the relationship between energy and information becomes increasingly vital. Computation is not merely a logical abstraction—it is a physical process governed by thermodynamic constraints. By reinterpreting familiar electrical concepts such as voltage, current, and resistance through an information-theoretic lens, we can develop new metrics that capture the energetic and temporal costs of data processing. This section introduces symbolic analogies that bridge these domains, offering a framework for analysing the efficiency of information flow in energy-aware computing systems.
Bitrate, measured in bits per second (bps), represents the rate at which information is transmitted or processed. In energy-aware computing, bitrate becomes a key metric for evaluating the temporal dynamics and energy efficiency of digital systems.
Voltage, traditionally defined as energy per unit charge, can be reinterpreted in the context of information as energy per bit. Using Landauer’s minimum energy per bit, we define an informational voltage:
$$ L = \frac{\mathrm{J}}{\kappa} $$
This formulation aligns voltage with the energetic cost of processing one unit of information, where κ is the number of bits that can be processed per joule, as derived from Landauer’s limit.
To extend this analogy, we introduce a new quantity called Frith¹, denoted B, which parallels electrical resistance but applies to information flow. Frith is defined as the ratio of energy per bit to bitrate:
$$ B = \frac{L}{\kappa/\mathrm{s}} = \frac{\mathrm{J}/\kappa}{\kappa/\mathrm{s}} = \frac{\mathrm{J \cdot s}}{\kappa^2} $$
Here, L is the Landauer energy per bit, and κ/s is the bitrate. The unit of B reflects how energy and time scale with the square of the information volume, offering a novel metric for evaluating the resistance to efficient information processing.
The inverse of Frith is termed Eolas², denoted E, which captures the system’s capacity to process information efficiently under energy constraints. Table 1 presents a symbolic mapping between classical electrical quantities and their information-theoretic counterparts.
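A minimal Python sketch of these definitions follows; the bitrate of 10⁹ bit/s is an assumed value for illustration.

```python
import math

# Sketch of the information-theoretic quantities defined above:
# L (Landauer voltage, energy per bit), B (Frith) and E (Eolas).
K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K

L = K_B * T * math.log(2)  # Landauer energy per bit, J/kappa

def frith(energy_per_bit: float, bitrate: float) -> float:
    """B = L / (kappa/s): energy per bit divided by bitrate, in J*s/kappa^2."""
    return energy_per_bit / bitrate

def eolas(b: float) -> float:
    """E = 1/B: capacity to process information efficiently."""
    return 1.0 / b

bitrate = 1e9  # assumed channel bitrate, bit/s
b = frith(L, bitrate)
print(f"L = {L:.3e} J/bit, B = {b:.3e}, E = {eolas(b):.3e}")
```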

4.1. Example 1 – Communication with Variable Energy per Bit

Assume a system operating at room temperature T = 300 K. The Landauer energy per bit is:
$$ L = kT \ln 2 \approx 2.8 \times 10^{-21}\ \mathrm{J} $$
For a communication channel with a bitrate of:
$$ \text{bitrate} = 10^9\ \mathrm{bit/s} $$
We consider two scenarios for the energy required per bit:
(a) Ideal system: each bit requires exactly one Landauer energy:
$$ E_a = \text{bitrate} \times L = 10^9 \times 2.8 \times 10^{-21} = 2.8 \times 10^{-12}\ \mathrm{J/s} $$
(b) Non-ideal system: each bit requires 1000 times the Landauer energy:
$$ E_b = \text{bitrate} \times 1000 \times L = 10^9 \times 1000 \times 2.8 \times 10^{-21} = 2.8 \times 10^{-9}\ \mathrm{J/s} $$
These results illustrate the dramatic increase in energy consumption when a system operates far from the thermodynamic limit, emphasizing the importance of energy-efficient design in high-throughput communication systems.
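Both scenarios can be reproduced with a short calculation:

```python
import math

# Example 1: energy drawn per second at 10^9 bit/s for an ideal
# (1x Landauer) and a non-ideal (1000x Landauer) system.
L = 1.380649e-23 * 300.0 * math.log(2)  # ~2.8e-21 J per bit
bitrate = 1e9                           # bit/s

e_a = bitrate * L         # ideal:     ~2.8e-12 J/s
e_b = bitrate * 1000 * L  # non-ideal: ~2.8e-9  J/s

print(f"E_a = {e_a:.2e} J/s")
print(f"E_b = {e_b:.2e} J/s")
```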

4.2. Example 2 – Matrix Inversion and Informational Resistance

Consider the inversion of a 1000 × 1000 matrix performed on a desktop Intel Core i7 processor. This operation requires on the order of $n^3$ arithmetic operations. Assuming double-precision arithmetic (64 bits per operation), the total number of bit-level operations is:
$$ \text{bits} = 1000^3 \times 64 = 6.4 \times 10^{10} $$
which is equivalent to approximately 184 pκ, since $6.4 \times 10^{10} / (3.48 \times 10^{20}) \approx 1.84 \times 10^{-10}\ \kappa$. Assuming the computation consumes a total of 65 J (for instance, a processor drawing roughly 65 W for about one second), the Frith is calculated by dividing the total energy by the number of κ processed:
$$ B = \frac{65\ \mathrm{J}}{184\ \mathrm{p}\kappa} \approx 3.53 \times 10^{11} = 353\ \mathrm{G}B $$
Thus, the system exhibits an informational resistance of approximately $B \approx 3.53 \times 10^{11}$, meaning it used over 353 billion times more energy per bit than the theoretical minimum. This highlights the gap between current computing technologies and the thermodynamic limit, emphasizing the need for energy-efficient design.
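The steps of this example can be reproduced as follows; the 65 J total follows the text above, and its suggested origin (a roughly 65 W processor running for about one second) is only an illustrative assumption.

```python
import math

# Example 2: informational resistance (Frith) of a 1000x1000
# double-precision matrix inversion.
kappa = 1.0 / (1.380649e-23 * 300.0 * math.log(2))  # bits per joule

n = 1000
bits = n**3 * 64              # ~6.4e10 bit-level operations
work_in_kappa = bits / kappa  # ~1.84e-10 kappa, i.e. ~184 p-kappa

energy = 65.0                 # total energy of the computation, J (assumed)
B = energy / work_in_kappa    # ~3.53e11, i.e. 353 G in Frith units

print(f"bits = {bits:.2e}")
print(f"work = {work_in_kappa * 1e12:.0f} p-kappa")
print(f"B = {B:.3e}")
```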

5. Conclusion

This paper has introduced the basic concepts for a unifying theory between energy and information, grounded in thermodynamic principles and inspired by classical electrical analogies. By defining quantities such as L (informational voltage), B (Frith), and E (Eolas), it proposes new metrics for evaluating the energetic and temporal efficiency of digital systems. These constructs offer a novel lens through which computation can be analysed not as an abstract process, but as a physically constrained operation shaped by energy flow and information dynamics.
Future work may pursue several directions. Empirical validation of these metrics across diverse hardware platforms could help assess their practical relevance and limitations. Extending the framework to reversible and quantum computing may uncover deeper connections between energy efficiency and computational architecture. Integrating these symbolic models into simulation environments and workload schedulers could support the design of energy-optimized systems. Additionally, exploring the influence of environmental factors—such as temperature variability and energy source quality—may enhance the adaptability of energy-aware computing in real-world contexts.

References

1. Patel, Y. S., Townend, P., Singh, A., and Östberg, P.-O. Cluster Computing, 4095.
2. Ahvar, E., Orgerie, A.-C., and Lebre, A. IEEE Transactions on Sustainable Computing, 2022.
3. Nazaré, T., Gadelha, J., Nepomuceno, E., and Lozi, R. IEEE Latin America Transactions, 2023.
4. Alexander, C. K. and Sadiku, M. N. O. Fundamentals of Electric Circuits, 2012.
5. Landauer, R. Irreversibility and Heat Generation in the Computing Process. IBM Journal of Research and Development 5(3), 183–191, July 1961.
¹ Frith: Irish term meaning opposition or resistance.
² Eolas: Irish term for knowledge or information.
Table 1. Consistent mapping between electrical and information-theoretic quantities, with symbolic notation.
Electrical Quantity | Unit                | Information Quantity | Unit
Current             | Ampere (A) = 1 C/s  | Bitrate              | κ/s
Voltage             | Volt (V)            | Landauer Voltage (L) | J/κ
Resistance          | Ohm (Ω)             | Frith (B)            | J·s/κ²