Preprint · Review · This version is not peer-reviewed.

Classical and Quantum Views of Information from the Observer’s Point of View

Submitted: 23 May 2025 · Posted: 26 May 2025
Abstract
This brief paper explores classical and quantum entropy through the lens of relativistic observer-dependence. By modeling a qubit-populated spherical system, we analyze entropy as perceived by spatially separated observers, introducing time delay effects in information acquisition. Shannon and von Neumann entropies are formulated as observer-dependent functions, and a framework is proposed for the spatiotemporal propagation of informational entropy. The work emphasizes that entropy, typically viewed as a scalar or global property, acquires field-like behavior when relativistic considerations are included.

1. Introduction

Entropy is central to classical and quantum information theory. Shannon entropy quantifies uncertainty in a discrete probability distribution [1], while von Neumann entropy generalizes this concept to quantum systems via density matrices [2].
In relativistic contexts, simultaneity becomes relative, and information cannot travel faster than light. Consequently, observers at different space-time coordinates perceive different states of the same system at the same coordinate time. Building on ideas from relativistic quantum information theory [3,4,5], this work presents a formalism in which entropy becomes an observer-dependent, space-time-dependent quantity. We reformulate classical and quantum entropy to include signal delay and define an entropy field over space-time.

2. Entropy in Classical Information Theory

Let $A = \{0, 1\}$ be a binary alphabet, and let $X$ be a discrete random variable over $A$ with probability distribution $p(x)$. The Shannon entropy is:
$$H(X) = -\sum_{x \in A} p(x) \log p(x)$$
Assume the information source is located at position $x_s$ and the observer at position $x_o$, with spatial separation $r = \|x_s - x_o\|$. Given the speed of light $c$, the observer receives information with a time lag $\Delta t = r / c$. Thus, the observer-relative Shannon entropy becomes:
$$H_o(t) = H(X(t - \Delta t))$$
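As a minimal numerical sketch of the delayed entropy above (the drifting source bias p(t) and the separation r are illustrative assumptions, not values from the text):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a binary distribution with p(1) = p."""
    probs = np.array([p, 1.0 - p])
    probs = probs[probs > 0]  # convention: 0 log 0 = 0
    return float(-np.sum(probs * np.log2(probs)))

# Hypothetical source whose bias p(t) relaxes toward 0.5 over time.
def p_of_t(t):
    return 0.5 + 0.4 * np.exp(-t)  # illustrative choice only

c = 3.0e8          # speed of light, m/s
r = 3.0e8          # assumed source-observer separation, m
dt = r / c         # signal delay: 1 second here

t = 2.0
H_local = shannon_entropy(p_of_t(t))          # entropy at the source, now
H_observed = shannon_entropy(p_of_t(t - dt))  # delayed, observer-relative entropy
```

Because the observer sees an earlier, more biased state of the source, the observed entropy lags below the current source entropy.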

3. Quantum Information: von Neumann Entropy

For quantum systems, the state is described by a density matrix $\rho$ with $\mathrm{Tr}(\rho) = 1$. The von Neumann entropy is:
$$S(\rho) = -\mathrm{Tr}(\rho \log \rho)$$
Let $\rho(t)$ denote the time-evolving density matrix. The entropy perceived by an observer at position $x_o$ is delayed:
$$S_o(t) = S\left(\rho\left(t - \|x_s - x_o\| / c\right)\right)$$
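A short sketch of computing $S(\rho)$ from the eigenvalues of the density matrix (the two example states are standard illustrations, not taken from the text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

# Pure state |+><+|: zero entropy.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
# Maximally mixed qubit: one bit of entropy.
mixed = np.eye(2) / 2
```

Diagonalizing first is the standard route, since $\rho \log \rho$ is defined through the spectral decomposition.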

4. Time Evolution and Dynamics

4.1. Classical Case: Markov Dynamics

Consider a binary Markov process in which the probability $p$ of the system being in state $1$ relaxes according to:
$$\frac{dp}{dt} = -\gamma\,(p - 0.5)$$
The time-dependent Shannon entropy is:
H ( t )   =   H ( X ( t ) )
Incorporating the information delay, the observer-relative entropy becomes:
$$H_o(t) = H(X(t - \Delta t))$$
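This relaxation has the closed-form solution $p(t) = 0.5 + (p_0 - 0.5)e^{-\gamma t}$, so the delayed entropy can be sketched directly (initial bias $p_0$, rate $\gamma$, and delay $\Delta t$ are illustrative assumptions):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a binary distribution with p(1) = p."""
    probs = np.array([p, 1.0 - p])
    probs = probs[probs > 0]
    return float(-np.sum(probs * np.log2(probs)))

def p_markov(t, p0=0.9, gamma=1.0):
    """Analytic solution of dp/dt = -gamma (p - 0.5)."""
    return 0.5 + (p0 - 0.5) * np.exp(-gamma * t)

dt_delay = 0.5   # Delta t = r / c, in the same time units as 1/gamma
t = 2.0
H_source   = shannon_entropy(p_markov(t))
H_observer = shannon_entropy(p_markov(t - dt_delay))
# The observer sees an earlier, less-relaxed state, hence lower entropy.
```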

4.2. Quantum Case: Lindblad Evolution

For unitary evolution of a closed quantum system, $\rho(t) = U(t)\,\rho(0)\,U^\dagger(t)$, the von Neumann entropy remains constant, since unitary conjugation preserves the eigenvalues of $\rho$. For open systems, the evolution is governed by the Lindblad master equation [6]:
$$\frac{d\rho}{dt} = -i[H, \rho] + \mathcal{D}(\rho)$$
The perceived entropy evolves as:
$$S_o(t) = S(\rho(t - \Delta t))$$
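A minimal sketch of an open-system evolution: a qubit under pure dephasing, $\mathcal{D}(\rho) = \gamma(\sigma_z \rho \sigma_z - \rho)$, integrated with an explicit Euler step. The Hamiltonian, rate, and delay are illustrative assumptions, not parameters from the text:

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
H  = 0.5 * sz          # assumed Hamiltonian H = omega * sigma_z / 2, omega = 1
gamma = 0.2            # assumed dephasing rate

def lindblad_rhs(rho):
    """drho/dt = -i[H, rho] + gamma (sz rho sz - rho): pure-dephasing dissipator."""
    return -1j * (H @ rho - rho @ H) + gamma * (sz @ rho @ sz - rho)

def evolve(rho0, t, dt=1e-3):
    """Crude explicit-Euler integration of the master equation."""
    rho = rho0.copy()
    for _ in range(int(t / dt)):
        rho = rho + dt * lindblad_rhs(rho)
    return rho

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

rho0 = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # pure |+> state
delay = 1.0   # Delta t = ||x_s - x_o|| / c, illustrative
t = 3.0
S_source   = von_neumann_entropy(evolve(rho0, t))
S_observer = von_neumann_entropy(evolve(rho0, t - delay))  # lags behind
```

Dephasing destroys the off-diagonal coherence, so entropy grows monotonically toward one bit; the delayed observer always reads an earlier, lower value.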

5. Entropy as a Spacetime Field

We define an entropy field $E(x, t)$ over space-time, accounting for relativistic delay:
$$E(x, t) = \begin{cases} H\left(X\left(t - \dfrac{\|x - x_s\|}{c}\right)\right) & \text{(classical)} \\[4pt] S\left(\rho\left(t - \dfrac{\|x - x_s\|}{c}\right)\right) & \text{(quantum)} \end{cases}$$
This field exhibits discontinuities along null surfaces, i.e., the light cones, referred to as entropy horizons.
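The classical branch of the field can be sketched on a 1-D grid using retarded-time evaluation. The source history below (definite state before $t = 0$, then relaxing) is an illustrative assumption chosen so the entropy horizon is visible:

```python
import numpy as np

def shannon_entropy(p):
    probs = np.array([p, 1.0 - p])
    probs = probs[probs > 0]
    return float(-np.sum(probs * np.log2(probs)))

def p_source(t):
    """Source bias; before t = 0 the source is in a definite state (H = 0)."""
    return 1.0 if t < 0 else 0.5 + 0.5 * np.exp(-t)

c = 1.0        # natural units
x_s = 0.0      # source position

def entropy_field(x, t):
    """E(x, t) = H(X(t - |x - x_s| / c)): retarded-time evaluation."""
    return shannon_entropy(p_source(t - abs(x - x_s) / c))

t = 2.0
xs = np.linspace(-4, 4, 9)
E = np.array([entropy_field(x, t) for x in xs])
# Outside the light cone (|x - x_s| > c t) the field is identically zero:
# the "entropy horizon" sits at |x - x_s| = c t.
```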

6. Entropy Gradient and Information Flux

We define the information flux as the spatial gradient of the entropy field:
$$\mathbf{j}_E(x, t) = \nabla E(x, t)$$
This vector field encodes the spatial rate of change of perceived entropy, while the temporal derivative $\partial E / \partial t$ represents the local rate of informational change.
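Numerically, the flux is just a finite-difference gradient of a sampled field. The Gaussian profile below is a hypothetical snapshot of $E(x, t)$ at fixed $t$, used only to illustrate the computation:

```python
import numpy as np

# Entropy field sampled on a 1-D spatial grid at fixed t
# (a smooth pulse standing in for E(x, t)).
x = np.linspace(-5, 5, 201)
E = np.exp(-x**2)          # hypothetical snapshot of E(x, t)

j_E = np.gradient(E, x)    # information flux j_E = dE/dx (spatial gradient)

# The flux is positive left of the peak, negative to the right,
# and vanishes at the maximum of the field.
```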

7. Conclusion

By incorporating the finite speed of information transfer, this work demonstrates that both classical and quantum entropy become observer-relative quantities in space-time. The resulting framework leads naturally to the concept of entropy as a dynamical field, with gradients and fluxes constrained by causal structure. These insights may prove valuable in areas such as:
- Cosmological entropy flows
- Black hole thermodynamics
- Relativistic quantum communication
Future work could include coupling the entropy field with curved spacetime metrics and exploring its behavior in general relativistic settings.

Figure representations

[Figure: two visualizations]
  • Entropy Delay Surface: shows how the information delay (Δt) increases with spatial separation from the observer (at the origin), forming concentric contours similar to light cones.
  • Observer-Relative Entropy Field: models perceived entropy as decreasing with distance due to the delay, following an exponential decay for illustrative purposes.

References

  1. Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3), 379–423.
  2. Nielsen, M. A., & Chuang, I. L. (2010). Quantum Computation and Quantum Information. Cambridge University Press.
  3. Peres, A., & Terno, D. R. (2004). Quantum Information and Relativity Theory. Rev. Mod. Phys., 76(1), 93–123.
  4. Rovelli, C. (1996). Relational Quantum Mechanics. Int. J. Theor. Phys., 35(8), 1637–1678.
  5. Sorkin, R. D. (1997). Forks in the Road, on the Way to Quantum Gravity. Int. J. Theor. Phys., 36(12), 2759–2781.
  6. Gorini, V., Kossakowski, A., & Sudarshan, E. C. G. (1976). Completely Positive Dynamical Semigroups of N-Level Systems. J. Math. Phys., 17(5), 821–825.
  7. Bekenstein, J. D. (1973). Black Holes and Entropy. Phys. Rev. D, 7(8), 2333–2346.
  8. Wootters, W. K. (1981). Statistical Distance and Hilbert Space. Phys. Rev. D, 23(2), 357–362.
  9. Terno, D. R. (2006). Introduction to Relativistic Quantum Information. In Quantum Information Processing: From Theory to Experiment (pp. 61–86). Springer.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.