
Deterministic and Stochastic Autocatalytic Growth: Entropy, Bifurcations, and Quantum Extensions for Network Self-Organization

Submitted: 30 July 2025
Posted: 31 July 2025


Abstract
We investigate the dynamics of self-organizing networks through two autocatalytic models: a deterministic growth model and a stochastic extension with bounded noise. Both are governed by the inheritance rule x_{l+1} = k x_l, where each node replicates according to a fixed parameter k. The deterministic model yields exponential growth characterized by a Lyapunov exponent λ = log k, ensuring structural predictability. In contrast, the stochastic model introduces uniform noise and leads to entropy amplification, with an effective Lyapunov exponent λ_eff ≈ log k + (1/2) log(1 + ϵ²k²). Using time series analysis, we study the evolution of trajectories over discrete inheritance steps and quantify their sensitivity through bifurcation diagrams and entropy metrics. We demonstrate a transition from linear to chaotic growth as k varies, supported by Lyapunov and entropy computations. Additionally, we extend the model into the quantum regime by associating system states with density matrices and computing von Neumann entropy. This unified framework reveals deep connections between network propagation, complexity growth, and quantum information dynamics, with implications for quantum computing, secure communication, and crisis-resilient decentralized systems.

Dedication

This work is dedicated to the memory of Felix Shmidel, Ph.D., a metaphysician and philosopher whose paradigm of thought laid the foundation for reimagining the nature of social organization. His intellectual legacy, expressed in works such as The Metaphysics of Meaning and Will to Joy, profoundly shaped the conceptual development of this research.

Terminology and Notation Clarifications

To enhance clarity and consistency, we summarize the main terminology and notation used throughout this paper:
  • Inheritance Parameter (k)
    • k ∈ ℕ, k ≥ 1 denotes the deterministic amplification or replication factor at each discrete iteration.
    • k = 1 corresponds to the critical regime (neutral growth), k > 1 corresponds to autocatalytic exponential growth, and (for completeness) 0 < k < 1 leads to decay.
  • Iteration Index (l) and Node Count N(l)
    • l ∈ ℕ₀ is the discrete iteration depth.
    • N(l) is the total number of nodes at iteration l, evolving deterministically as N(l) = k^l or stochastically as:
      N(l+1) = k N(l) + η_l.
  • Stochastic Perturbation (η_l)
    • η_l = ϵ ξ_l is the additive noise term, where ϵ > 0 is the noise amplitude.
    • ξ_l ∼ U(−1, 1) unless otherwise stated; for Gaussian perturbations, ξ_l ∼ N(0, 1).
  • Normalized State (x_l)
    • x_l = N(l)/N_0 is the normalized trajectory, used for bifurcation analysis and Lyapunov computations.
  • Lyapunov Exponents
    • Deterministic: λ = log k.
    • Stochastic:
      λ_eff ≈ log k + (1/2) log(1 + ϵ²k²).
  • Entropy Measures
    • Shannon Entropy (H): Used for classical uncertainty of trajectories {N(l)}.
    • von Neumann Entropy (S(ρ) = −Tr(ρ log ρ)): Used in the quantum generalization, where ρ is the density matrix.
  • Bifurcation Terminology
    • Subcritical regime: 0 < k < 1, exponential decay.
    • Critical regime: k = 1, marginal stability.
    • Supercritical regime: k > 1, exponential divergence.
  • Architectural Constraint
    • H(a, v) = 0: denotes the absence of additional coordination or trust assumptions in network propagation.

Main Results

We highlight two core results from our study of deterministic and stochastic autocatalytic growth in network dynamics:
  • Deterministic Growth and Predictability:
    The inheritance rule
    x_{l+1} = k x_l,   x_0 = 1,
    yields the closed-form solution x_l = k^l. For k > 1, the system enters an autocatalytic regime of exponential divergence. The Lyapunov exponent is exactly:
    λ = log k > 0,
    indicating highly predictable yet exponentially expanding dynamics suitable for modeling global knowledge propagation and recursive system design.
  • Stochastic Extension and Entropy Amplification:
    With bounded noise,
    x_{l+1} = k x_l + ϵ ξ_l,   ξ_l ∼ U(−1, 1),
    the system retains autocatalytic growth while exhibiting increased entropy. The effective Lyapunov exponent becomes:
    λ_eff ≈ log k + (1/2) log(1 + ϵ²k²),
    showing that even small noise amplifies the complexity and chaotic potential of the system.
Quantum Computing Development:
We extend the stochastic inheritance model into the quantum domain by representing the evolving system as a quantum mixed state ρ l , with informational complexity measured by the von Neumann entropy:
S(ρ_l) = −Tr(ρ_l log ρ_l).
Simulations using Qiskit confirm oscillatory and non-monotonic entropy patterns, reflecting transitions between coherence and decoherence. These findings bridge classical autocatalysis and quantum chaos, enabling applications in quantum circuit complexity, secure quantum communication, and entropy-regulated information processing.

Introduction

The dynamics of self-organization, entropy-driven pattern formation, and chaotic behavior in complex systems have gained significant attention across disciplines including physics, biology, and network science. Foundational contributions by Shannon [25] and later extensions by Jaynes [26] and Tsallis [27] have emphasized the centrality of entropy in understanding information and complexity. Yet, most existing models for network growth and collective behavior rely heavily on probabilistic processes, agent-based simulations, or behavioral heuristics. While these models effectively capture emergent properties, they inherently lack deterministic guarantees and often suffer from unpredictability in growth patterns and information diffusion.
Recent advancements in nonlinear time-series analysis—such as recurrence networks, visibility graphs, and symbolic transition mappings—have demonstrated how chaotic dynamics can be embedded within complex topologies [17,22]. However, these approaches are largely retrospective and data-driven, offering tools to analyze existing time series but providing little in the way of predictive or generative principles for network formation.
Parallel research on autocatalytic networks and replicator dynamics has highlighted the critical role of topology in sustaining long-term system evolution [4,8]. Studies such as those by Eigen [4] and Kauffman [8] have shown how self-replication and catalytic closure contribute to organized complexity. Nonetheless, many of these frameworks remain rooted in stochastic reaction kinetics or continuous-state approximations, and lack explicit, deterministic growth laws capable of operating from first principles.
Self-organized criticality (SOC) theory has served as a cornerstone for understanding systems poised at the edge of chaos [17], offering insights into scale-free avalanches and power-law behavior. However, SOC typically models emergent properties without prescribing discrete mechanisms for network development or entropy-driven transitions.
In this work, we propose a novel model for global network self-organization based on deterministic chaos and entropy amplification. Our framework is governed by a mathematically defined inheritance rule where each node replicates its structure according to a fixed parameter k, resulting in exponential network expansion via N(l) = k^l, where l is the iteration depth. This rule operates without probabilistic rewiring, coordination, or external control—making it particularly suited for applications in autonomous knowledge dissemination, secure data exchange, and infrastructure-free communication systems.
We rigorously formulate this process as a discrete-time dynamical map and derive analytical properties such as structural inevitability, Lyapunov stability under parameter perturbation, and sensitivity quantified by the exact Lyapunov exponent λ = log(k). Bifurcation analysis reveals a critical threshold at k = 1, delineating regimes of decay and autocatalytic growth.
To evaluate robustness, we extend the model by introducing bounded stochastic noise and compute the effective Lyapunov exponent:
λ_eff ≈ log k + (1/2) log(1 + ϵ²k²),
revealing how entropy increases due to noise can induce complex yet controllable growth. We further explore quantum generalizations of the model by tracking entropy through von Neumann measures [30], demonstrating potential for modeling decoherence and entanglement evolution in quantum systems.
Empirical simulations conducted using DSI Exodus 2.0 [6] confirm the model’s scalability and chaotic robustness. Even modest inheritance parameters (e.g., k = 5 ) enable rapid global coverage, validating the model for real-world communication and learning networks.
In summary, this study introduces a hybrid framework integrating deterministic chaos, entropy theory, and discrete network dynamics. Our findings provide both theoretical guarantees and practical applicability, offering a foundational tool for future developments in stochastic modeling, entropy-controlled network growth, and quantum-aware system design.

1. Deterministic Network Growth Model

The majority of existing models of network growth, such as preferential attachment [1] and random graph processes [5], rely on stochastic dynamics, probabilistic rewiring, or agent-based interactions to explain the establishment of large-scale connectivity. These models have contributed significantly to our understanding of scale-free behavior [1,20], small-world effects [20], and emergent robustness in complex networks [12]. However, determinism and predictability are typically compromised in such frameworks due to inherent randomness.
In contrast, we propose a fundamentally new approach: a deterministic structural inheritance rule that specifies the evolution of world-wide network structure without recourse to randomness, heuristics, or behavioral assumptions. Our model offers an exact, rule-based foundation for network growth, supporting full analytical control over its topological and dynamical evolution.

1.1. The Inheritance Rule

We define a discrete-time deterministic growth law in which each node at time l generates exactly k new nodes at time l + 1 , each of which inherits the structure of its parent node. Let N ( l ) denote the total number of nodes at iteration depth l. The evolution of the network is governed by the exponential law:
N(l) = k^l,   where k ∈ ℕ, k ≥ 1, l ∈ ℕ₀.
Here, k is the deterministic inheritance parameter—the constant number of new nodes each existing node gives rise to—and l is the discrete generation or iteration step. This differs markedly from the randomness inherent in classical growth models [1,5], and draws conceptually from deterministic chaos theory [17].

1.2. Topological Interpretation

From the graph-theoretical viewpoint, the construction establishes a rooted level-based tree topology in which nodes at level l link only to their unique parent nodes at level l − 1. Unlike stochastic tree-growth models, our construction is built upon structural replication rather than probabilistic edge addition. Parent node subgraphs are thus recursively duplicated and inherited, yielding highly regular, scalable architectures.
The deterministic rule ensures:
  • Full structural predictability: The topology at any future iteration is entirely predetermined by the initial configuration and the constant parameter k.
  • Scalability without coordination: There is no interaction or communication among nodes, so the model is well-suited to distributed, trust-free environments such as secure knowledge propagation or autonomous data transmission networks [6,11].
  • Parameter-driven complexity: The single control parameter k governs both the topological scale and dynamical divergence, and this relationship can be quantified analytically via Lyapunov exponents and entropy metrics [22,30].

2. Numerical Simulation

We simulate the trajectory x_l = k^l for various values of k over the interval l = 0 to 5, illustrating the exponential growth behavior of the deterministic inheritance process.
Table 1. Network Size Evolution for k = 50.
Level l | N(l) | log10 N(l) | Growth Rate
0 | 1 | 0.00 | –
1 | 50 | 1.70 | 50.0
2 | 2,500 | 3.40 | 50.0
3 | 125,000 | 5.10 | 50.0
4 | 6.25 × 10^6 | 6.80 | 50.0
5 | 3.13 × 10^8 | 8.50 | 50.0
Table 2. Comparison with Classical Network Models.
Model | Growth Function | Predictability | Architectural Constraints
Erdős–Rényi | P(k) ∼ e^{−λ} λ^k / k! | Probabilistic | None
Barabási–Albert | P(k) ∼ k^{−γ} | Probabilistic | Preferential
Small-World | P(k) ≈ δ(k − k_0) | Semi-deterministic | Rewiring
Our Model | N(l) = k^l | Deterministic | H(a, v) = 0
Table 3. Parameter Sensitivity Analysis.
k | N(4) | N(5) | Global Reach | Empirical Support
40 | 2.56M | 102.4M | Level 5 | Consistent
50 | 6.25M | 312.5M | Level 5 | Facebook: 4.74
60 | 13.0M | 777.6M | Level 5 | Consistent
Analysis: For k ≥ 50, global connectivity is achieved within 5 iterations, consistent with empirical findings from Facebook’s large-scale network analysis showing an average of 4.74 degrees of separation [2]. This demonstrates the critical role of the inheritance parameter in determining convergence rates and validates our theoretical predictions.
Results confirm:
  • Perfect monotonicity
  • Constant growth rate k
  • Deterministic predictability
  • Structural inevitability
  • Empirical consistency with real-world networks
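As a quick cross-check, the deterministic values in Table 1 follow directly from N(l) = k^l. The short Python sketch below (illustrative only, not part of the DSI Exodus 2.0 implementation) tabulates the node counts, their base-10 logarithms, and the constant growth rate for k = 50.

```python
import numpy as np

k = 50                     # inheritance parameter from Table 1
prev = None
print(" l   N(l)          log10 N(l)   growth rate")
for l in range(6):
    N = k ** l             # deterministic law N(l) = k^l
    rate = f"{N / prev:.1f}" if prev else "-"
    print(f"{l:2d}   {N:<12.4g}  {np.log10(N):10.2f}   {rate}")
    prev = N
```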

3. Dynamical Analysis and Chaotic Transitions

Although the deterministic formula N(l) = k^l produces fully predictable exponential network development, it omits key features commonly observed in self-organizing systems such as uncertainty, heterogeneity, and adaptation. In nature and engineered networks, structural evolution is often regulated by environmental noise, topological disruption, or local adaptation [17,19,22]. These dynamics are integral to understanding complexity in neural systems, social communication, and even molecular biology [4,8].
To incorporate such effects while maintaining analytical tractability, we introduce a bounded stochastic perturbation into the inheritance rule, resulting in a hybrid deterministic-stochastic model.

3.1. Noisy Stochastic Model

We enhance the deterministic rule with an additive noise component η l , leading to the recursive relation:
N(l+1) = k · N(l) + η_l,
where:
η_l = ϵ · ξ_l,   ξ_l ∼ U(−1, 1),
and ϵ > 0 is the noise amplitude controlling the strength of the perturbation. The random variable ξ_l introduces uniform fluctuations at each iteration step.
This formulation results in a noisy exponential map [23], preserving the multiplicative backbone of deterministic inheritance while embedding bounded stochasticity. Such perturbations are particularly relevant in modeling:
  • External disturbances: e.g., signal degradation or environmental uncertainty in decentralized systems.
  • Spontaneous variation: arising from contextual or spatial heterogeneity in large-scale systems.
  • Adaptive mutations: deviations from strict inheritance due to feedback, learning, or evolutionary pressures [27,30].
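A minimal simulation sketch of this noisy inheritance map, assuming the uniform-noise form defined above (the values of k, ϵ, and the number of steps are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_inheritance(k, eps, steps, n0=1.0):
    """Iterate N(l+1) = k*N(l) + eps*xi_l with xi_l ~ U(-1, 1)."""
    traj = np.empty(steps + 1)
    traj[0] = n0
    for l in range(steps):
        traj[l + 1] = k * traj[l] + eps * rng.uniform(-1.0, 1.0)
    return traj

# Example: mildly supercritical growth with bounded noise.
print(noisy_inheritance(k=1.5, eps=0.1, steps=10))
```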

Bifurcation Analysis of the Inheritance Parameter k Without Noise

Our model exhibits globally monotonic exponential growth of a network under the deterministic law N(l) = k^l. However, such behavior is highly sensitive to the inheritance parameter k. The qualitative dynamics of the system undergo a critical shift at the bifurcation point k = 1, which marks the boundary between exponential expansion and structural stasis. This motivates a closer examination of the bifurcation structure of the system, extending insights from our previous work on discrete chaotic transitions [21].
Bifurcation theory, originally developed in the context of continuous dynamical systems [17,22], reveals how small changes in control parameters can lead to sudden, qualitative shifts in system behavior. Even within our discrete-time deterministic framework, bifurcation analysis reveals distinct evolutionary regimes for the network:
  • Critical Regime (k = 1): The system neither grows nor decays, remaining on the cusp between stagnation and autocatalysis. This threshold serves as a tipping point.
  • Supercritical Regime (k > 1): The network experiences autocatalytic exponential expansion. Each layer significantly increases the total node count, ensuring rapid scaling.
Note: The subcritical case (k < 1) is irrelevant and unnecessary in our classical discrete model, as k ∈ ℕ and values below unity would imply fractional or decaying inheritance not applicable in this context.
Understanding these regimes is vital for both theoretical formulation and practical network deployment. Structural or cognitive constraints in real-world networks may reduce the effective inheritance factor k, leading to an unintended collapse of growth dynamics. In such settings, the same architectural rule may yield entirely different outcomes depending on whether k remains above the critical threshold. This highlights the importance of maintaining sufficient connectivity at each level of recursive network expansion to ensure sustainable global structure.

Bifurcation Setup

We begin with the base recurrence relation that defines the network size at iteration l:
x_{l+1} = k · x_l,   x_0 = 1.
Its closed-form solution is:
x_l = k^l.
For the purpose of bifurcation analysis, we treat the inheritance factor k ∈ ℝ as the bifurcation parameter. This yields three qualitative regimes based on the value of k:
  • Subcritical: 0 < k < 1 — The system decays exponentially toward zero, signifying network collapse. (This regime is irrelevant and unnecessary for the classical model since k ∈ ℕ, k ≥ 1.)
  • Critical: k = 1 — The system exhibits neutral behavior with constant size x_l = 1 for all l. No amplification or decay occurs.
  • Supercritical: k > 1 — The system displays exponential divergence consistent with autocatalytic network growth and widespread connectivity.
To visualize the system’s sensitivity near the bifurcation threshold, we plot x l as a function of k for fixed iteration depth l. These bifurcation diagrams help reveal sharp behavioral transitions and scaling dynamics critical for understanding the qualitative evolution of the system [17,22].
Specifically, we simulate network size after ten discrete inheritance steps (l = 10) as a function of the inheritance parameter k. We compute x_10 = k^10 over a continuous range k ∈ [0.5, 2.0], discretized into 500 values. This offers a clear picture of growth sensitivity around the critical point and supports practical decision-making in choosing stable values for k in implementations such as trust-free communication or infrastructureless information systems [6,11].
As illustrated in Figure 1, three distinct dynamical regimes emerge:
  • Subcritical regime (0.5 ≤ k < 1): The system decays exponentially with decreasing network size. For instance, at k = 0.9, x_10 ≈ 0.35, indicating substantial contraction over 10 steps.
  • Critical regime (k = 1): The network size remains invariant across levels, i.e., x_l = 1 for all l. This marks a bifurcation threshold separating contraction from growth.
  • Supercritical regime (1 < k ≤ 2.0): The system enters exponential growth. At k = 1.5, the network expands to x_10 ≈ 57.7; at k = 2.0, it reaches x_10 = 1024. This confirms the structural inevitability of explosive expansion under deterministic inheritance when k > 1.
This bifurcation structure emphasizes the sensitivity of the system’s global behavior to the inheritance parameter k. Even slight reductions toward the critical threshold can drastically suppress network propagation within just a few iterations. Such sensitivity has significant implications for design robustness in real-world decentralized systems and for maintaining global reach in resource-constrained or stochastic environments.
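The bifurcation scan just described reduces to evaluating the closed form x_10 = k^10 on a grid of k values. A sketch, with the plotting details left as illustrative choices:

```python
import numpy as np
import matplotlib.pyplot as plt

l = 10                                   # fixed inheritance depth
k_values = np.linspace(0.5, 2.0, 500)    # bifurcation parameter grid
x_l = k_values ** l                      # closed-form solution x_l = k^l

plt.semilogy(k_values, x_l)
plt.axvline(1.0, linestyle="--", label="critical point k = 1")
plt.xlabel("inheritance parameter k")
plt.ylabel("x_10 = k^10")
plt.legend()
plt.show()
```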

Lyapunov Exponents and Sensitivity to Inheritance Dynamics

Motivation

In dynamical systems theory, Lyapunov exponents are central tools used to quantify a system’s sensitivity to initial conditions and to characterize its long-term predictability or chaotic divergence [17,22]. While our model is structurally deterministic, the inheritance parameter k governs not only the rate of network growth but also the system’s stability in the presence of perturbations.
Analyzing the Lyapunov exponent allows us to demarcate sharp transitions between contraction, stasis, and exponential expansion—serving as an analytical proxy for the system’s bifurcation structure. This measure becomes especially important when generalizing the model to include stochastic effects, adaptive mutations, or quantum decoherence [30].

Definition and Mathematical Formulation

Let f : R R be a differentiable function defining a discrete-time dynamical system of the form:
x_{l+1} = f(x_l).
The Lyapunov exponent λ quantifies the average exponential rate of divergence (or convergence) of nearby trajectories and is defined as:
λ = lim_{l→∞} (1/l) Σ_{j=0}^{l−1} log |f′(x_j)|.
For our inheritance model, where f(x) = k x and hence f′(x) = k, this simplifies to the constant:
λ = log |k|.
This closed-form solution offers an exact measure of sensitivity with respect to the inheritance parameter k, capturing the transition from decay ( λ < 0 ) to neutral ( λ = 0 ) to divergent ( λ > 0 ) behavior. The clarity of this result underscores the value of deterministic modeling for understanding and predicting global network dynamics [22,23].

Stability Interpretation

The sign of the Lyapunov exponent determines the system’s qualitative behavior:
  • When λ < 0 (i.e., 0 < k < 1 ), the system converges to zero; the network contracts over time.
  • When λ = 0 (i.e., k = 1 ), the system is neutrally stable; network size remains constant.
  • When λ > 0 (i.e., k > 1 ), the system exhibits exponential divergence; the network expands rapidly.
This analysis reveals that despite the system’s deterministic simplicity, even small variations in the inheritance parameter k can lead to dramatic changes in long-term behavior. The bifurcation threshold at k = 1 thus serves as a critical boundary between structural collapse and self-sustained exponential growth. The ability to analytically compute the Lyapunov exponent provides a clear criterion for network stability and robustness [23].

Toward Chaotic Extensions

Although the current inheritance system is linear and fully non-chaotic, the Lyapunov framework provides a solid foundation for future extensions involving nonlinearity, stochasticity, or feedback. One natural direction is to introduce nonlinear dynamics through a logistic-type update rule:
x_{l+1} = k x_l (1 − x_l),
which is known to generate a rich spectrum of behaviors including fixed points, periodic oscillations, and deterministic chaos depending on the value of k [22,23]. In such settings, a positive Lyapunov exponent λ > 0 becomes a definitive marker of chaos—signaling sensitive dependence on initial conditions, phase instability, and unpredictability over long time horizons.
These nonlinear extensions open exciting opportunities for modeling emergent complexity, investigating robustness against noise, or designing secure decentralized protocols based on controlled chaotic behavior. They also point toward deeper applications in quantum chaos and entropy-driven computing [30].
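As a concrete illustration of this logistic-type extension, the sketch below estimates the Lyapunov exponent from the finite-time average of log |f′(x_l)| = log |k(1 − 2x_l)| along a simulated orbit; the parameter values and iteration counts are illustrative.

```python
import numpy as np

def logistic_lyapunov(k, x0=0.5, n_iter=5000, n_transient=500):
    """Estimate lambda as the orbit average of log|k(1 - 2x)| for x_{l+1} = k x_l (1 - x_l)."""
    x = x0
    acc = 0.0
    for i in range(n_iter):
        x = k * x * (1.0 - x)
        if i >= n_transient:              # discard the transient before averaging
            acc += np.log(abs(k * (1.0 - 2.0 * x)))
    return acc / (n_iter - n_transient)

for k in (2.8, 3.5, 3.9):                 # periodic versus chaotic regimes
    print(f"k = {k}: lambda ≈ {logistic_lyapunov(k):+.3f}")
```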

Numerical Visualization

To illustrate the role of the inheritance parameter k, we computed the Lyapunov exponent λ = log(k) over 500 evenly spaced values in the interval k ∈ [0.5, 2.0]. The result is shown in Figure 2.
As expected, the exponent becomes positive as soon as k > 1 , confirming that deterministic growth is not only inevitable but also dynamically expansive under the model’s assumptions. This sharp transition at k = 1 aligns with the theoretical threshold for self-organization and global convergence.

3.2. Bifurcation and Chaos Transitions

In order to study how the system crosses from stable to chaotic regimes as a function of k and ϵ , we perform a bifurcation analysis of the stochastic map. We define the normalized form:
x_{l+1} = k x_l + ϵ ξ_l,
where x_l = N(l)/N_0 is the normalized population size.
Depending on k and ϵ, the system exhibits several distinct regimes:
  • For k < 1 , the system goes to zero even when it is noisy.
  • For k = 1 , the system is nearly a noisy random walk and fluctuation-dominated.
  • For k > 1 , the deterministic part dominates, but chaotic divergence and random bursts are induced by additive noise.
We determine the effective Lyapunov exponent λ_eff in order to estimate the mean rate of divergence:
λ_eff ≈ log |k| + (1/2) log(1 + ϵ²k²).
This result demonstrates the way in which noise increases the entropy and instability of the system and induces transitions from organized to random expansion.
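The snippet below simply evaluates this stated expression for λ_eff over a small grid of (k, ϵ) and compares it with the noise-free value log k; it implements the formula as given rather than deriving or verifying it.

```python
import numpy as np

def lambda_deterministic(k):
    return np.log(k)

def lambda_effective(k, eps):
    # Effective exponent as stated in the text: log|k| + 0.5*log(1 + eps^2 k^2)
    return np.log(abs(k)) + 0.5 * np.log(1.0 + (eps * k) ** 2)

for k in (1.1, 1.5, 2.0):
    for eps in (0.0, 0.1, 0.3):
        print(f"k={k}, eps={eps}: lambda={lambda_deterministic(k):.4f}, "
              f"lambda_eff={lambda_effective(k, eps):.4f}")
```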

4. Fixed Point Analysis of the Noisy Inheritance Model

We consider the noisy discrete-time network growth model defined by the recursive relation:
N(l+1) = k · N(l) + η_l,
where η_l = ϵ · ξ_l, and ξ_l ∼ U(−1, 1) represents a uniformly distributed stochastic fluctuation, while ϵ > 0 controls the amplitude of noise. This formulation models both deterministic inheritance and bounded environmental or contextual randomness.

4.1. Existence and Stability of Fixed Points

In the absence of noise ( ϵ = 0 ), the fixed points of the system are determined by solving:
N* = k · N*   ⟹   (1 − k) N* = 0.
Hence, the system has a unique fixed point at N * = 0 , which is:
  • Stable for 0 < k < 1 ,
  • Marginally stable for k = 1 ,
  • Unstable for k > 1 .
When noise is introduced ( ϵ > 0 ), the concept of a fixed point becomes statistical. We instead examine the long-term average behavior across many realizations to determine whether the system stabilizes, fluctuates, or diverges.

4.2. Numerical Experiment and Results

We perform numerical simulations of the model for ϵ = 0.1 , initial condition N ( 0 ) = 1 , and three representative values of the inheritance parameter:
k ∈ {0.5, 1.0, 1.5}.
Each simulation was run for 100 iterations and averaged over 100 independent realizations.
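A sketch of this ensemble experiment, assuming the uniform-noise model defined above (seed and implementation details are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_trajectory(k, eps=0.1, steps=100, n_runs=100, n0=1.0):
    """Average N(l) over independent realizations of N(l+1) = k*N(l) + eps*xi_l."""
    traj = np.full((n_runs, steps + 1), n0)
    for l in range(steps):
        xi = rng.uniform(-1.0, 1.0, size=n_runs)
        traj[:, l + 1] = k * traj[:, l] + eps * xi
    return traj.mean(axis=0)

for k in (0.5, 1.0, 1.5):
    m = mean_trajectory(k)
    print(f"k={k}: mean N(10) = {m[10]:.3f}, mean N(100) = {m[100]:.3e}")
```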

4.3. Interpretation

Figure 3 confirms the theoretical expectations of the noisy inheritance model under different discrete inheritance intensities:
  • For k = 1 , the system exhibits a bounded stochastic regime. The inheritance and noise balance out, producing a fluctuating but non-divergent trajectory. This reflects a marginally stable phase, similar to a noisy random walk.
  • For k = 5 , the deterministic term k N ( l ) begins to dominate over the noise, resulting in clear upward drift. The average trajectory starts to diverge, though at a slower exponential rate.
  • For higher values such as k = 20 , 50 , 100 , 145 , the system rapidly escapes any bounded region. The exponential growth completely overshadows the noise term, and the average trajectory reflects deterministic divergence. This illustrates the system’s transition to an autocatalytic regime where inherited structure becomes explosively self-amplifying.
These findings highlight the critical role of k as a threshold parameter: when k > 1 , the system enters a regime of irreversible exponential growth. This suggests that in empathic, self-organizing, or trust-independent networks, even modest deterministic amplification can lead to rapid and uncontrolled expansion—unless bounded by structural, cognitive, or environmental constraints.

5. Phase Portraits of the Noisy Inheritance Model

To analyze the temporal structure of system trajectories under different inheritance intensities, we generate phase portraits for the model:
N(l+1) = k · N(l) + η_l,   where η_l = ϵ · ξ_l,   ξ_l ∼ U(−1, 1),
with noise amplitude fixed at ϵ = 0.1 and initial condition N(0) = 1. We vary the inheritance parameter across a wide range k ∈ {1.5, 5, 20, 50, 100, 145} and simulate each trajectory over 100 iterations.
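A sketch of the phase-portrait construction, i.e., plotting N(l+1) against N(l) for a single realization; the chosen k and the log-log axes are illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
k, eps, steps = 5, 0.1, 100

N = np.empty(steps + 1)
N[0] = 1.0
for l in range(steps):
    N[l + 1] = k * N[l] + eps * rng.uniform(-1.0, 1.0)

# Phase portrait: successive pairs (N(l), N(l+1)).
plt.loglog(N[:-1], N[1:], ".-")
plt.xlabel("N(l)")
plt.ylabel("N(l+1)")
plt.title(f"Phase portrait, k = {k}, eps = {eps}")
plt.show()
```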

Interpretation

The phase portraits in Figure 4 illustrate the evolution of the noisy inheritance model across discrete empathic steps l ∈ ℕ for several whole-number inheritance rates k ∈ {1, 5, 20, 50, 100, 145}. The behavior diverges dramatically depending on whether the system is near, moderately above, or far above the critical threshold k = 1.
  • For k = 1: The system behaves as a noisy random walk. Since the deterministic component is neutral (no growth or decay), the evolution is dominated by fluctuations due to η_l ∼ U(−ϵ, ϵ). This reflects marginal stability and high sensitivity to initial conditions and noise.
  • For k = 5 : We observe the beginning of exponential divergence. After a brief initial plateau, the deterministic amplification outweighs noise, leading to rapid nonlinear growth in N ( l ) .
  • For k ≥ 20: The model enters a strongly supercritical regime. Growth becomes explosive within a few empathic steps. The additive noise becomes negligible compared to the exponential term, and the system saturates numeric space extremely quickly.
  • For large values like k = 50 , 100 , 145 : The trajectory reaches astronomical scales by iteration l = 100 , revealing the system’s autocatalytic inevitability. Even minimal empathic inheritance quickly escalates to global saturation, modeling the diffusion potential of recursive trust-irrelevant systems.
These results reinforce the mathematical claim that any k > 1 generates unavoidable divergence under deterministic inheritance. The higher the value of k, the earlier this divergence occurs — which is critical when designing systems for information spread, epidemic containment, or crisis-resilient communication. The portraits also demonstrate that despite the stochastic term η l , the qualitative dynamics are dominated by the inheritance law once k exceeds unity.
This supports the claim that deterministic inheritance—when not counterbalanced by saturation or regulatory dynamics—leads to autocatalytic growth with critical implications for both network design and cryptographic applications [4,11].

5.1. Implications for Self-Organizing Networks

Adding noise introduces critical behavior and complex dynamics with phase transition similarities into self-organizing networks [4,17]. This type of transition is extremely significant in:
  • Knowledge diffusion with uncertainty: where message copying is occasionally corrupted [12].
  • Crisis networks: where speed is important even when there is signal noise and corrupt links [19].
  • Biological or neural systems: where development is active but not necessarily deterministic [5,22].
Such a stochastic extension renders the model more realistic without changing its mathematical form. It also provides the foundation for entropy-based statistical analysis, and we talk about it in the following section.

6. Noisy Inheritance Model Bifurcation Analysis

To study the onset of chaos in the self-organizing networks context, we examine a noisy version of our proposed deterministic model. The system is defined by the evolution rule:
N(l) = k^l + η_l,
where k ∈ [0.5, 4.0] is the inheritance parameter and η_l ∼ U(−ϵ, ϵ) is a uniform noise component with amplitude ϵ = 0.1. This formulation combines the deterministic core of inheritance-based growth with random perturbations arising in the dynamics of real networks, such as spontaneous contacts or environmental fluctuations.
Figure 5 illustrates a clear qualitative trend across the regimes of the inheritance parameter k. For subcritical cases (k < 1), the term k^l becomes exponentially small, so stochastic noise dominates the dynamics. This results in a horizontal band of values clustered around zero, i.e., the network size fluctuates at small scales solely due to randomness.
At the crossover point k = 1, there is a dynamical threshold. Here the deterministic component is neutral (k^l = 1 for all l), so deterministic structure and noise are of comparable magnitude, producing a system sensitive to perturbations and initial conditions—a hallmark of self-organized criticality.
In the supercritical regime (k > 1), the deterministic growth k^l increasingly dominates and N(l) diverges. Because the growth is so rapid, the values lie outside the plotted range, creating the impression that structure disappears from the bifurcation plot. This limitation calls for log-scaling or normalization in order to reveal the full dynamic complexity.
More generally, this bifurcation analysis reveals a phase transition in network growth dynamics governed by the inheritance factor k, with regions of randomness, criticality, and deterministic growth. This behavior departs markedly from traditional probabilistic or equilibrium paradigms, emphasizing the emergent chaotic features introduced by combining stochastic perturbation with deterministic inheritance.

6.1. Bifurcation Behavior in the Low-k Regime and Implications for Image Encryption

In this experiment, we investigate the dynamics of the noisy exponential growth model
N(l) = k^l + η_l,   η_l ∼ U(−ϵ, ϵ),
in the low-inheritance regime where k ∈ [0.05, 0.9]. Here, ϵ = 0.1 controls the uniform noise amplitude, and the simulation is executed over 30,000 iterations, of which the final 100 values of N(l) are plotted for each value of k, sampled over 500 evenly spaced steps. The resulting bifurcation diagram is shown in Figure 6.
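A sketch of this low-k bifurcation experiment; because k < 1, the term k^l underflows toward zero long before iteration 30,000, so the retained values are essentially noise samples around zero (grid size and seed are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
eps, n_iter, n_keep = 0.1, 30_000, 100

k_values = np.linspace(0.05, 0.9, 500)
ks, Ns = [], []
for k in k_values:
    l = np.arange(n_iter - n_keep, n_iter, dtype=float)   # final 100 indices
    N = k ** l + rng.uniform(-eps, eps, size=n_keep)       # N(l) = k^l + eta_l
    ks.append(np.full(n_keep, k))
    Ns.append(N)

plt.plot(np.concatenate(ks), np.concatenate(Ns), ",")
plt.xlabel("inheritance parameter k")
plt.ylabel("N(l), final 100 values")
plt.show()
```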
The diagram shows a dense, irregular distribution of points across the entire range of k, with no observable periodic windows—hallmarks of high entropy and sensitive dependence on the inheritance parameter [21]. Such chaotic behavior, in particular the randomness and unpredictability of N(l), is highly suggestive of the patterns employed in chaos-based image encryption methods [23]. Within such schemes, pixel values are typically permuted or mapped based on trajectories generated by chaotic maps, in which small variations in parameters lead to extensive state divergence—offering cryptographic diffusion and confusion [22].
By controlling k and the noise level ϵ, the system’s entropy features can be varied and hence tailored to encryption requirements such as key sensitivity, avalanche effect, and resistance to statistical analysis [11]. The analogy between the present system’s behavior and classical encryption maps such as the logistic or tent map suggests this exponential inheritance model as a promising candidate for secure communication schemes [7].

7. Entropy Analysis of the Noisy Inheritance Model

Entropy serves as a central concept in characterizing disorder and unpredictability in complex systems [5,17]. In the context of stochastic discrete-time dynamical systems, such as our noisy inheritance model,
N(l+1) = k · N(l) + η_l,   η_l = ϵ · ξ_l,   ξ_l ∼ U(−1, 1),
Shannon entropy offers a quantitative measure of how dispersed or uncertain the system’s evolution becomes due to the combined effects of deterministic amplification and random perturbations [12].
In particular, Shannon entropy:
  • Measures the distribution of outcomes over time in terms of informational uncertainty;
  • Helps assess how quickly and unpredictably a system evolves under variation in parameters;
  • Provides insight into the cryptographic or communication-theoretic potential of a given process [11,23].
We compute the entropy of trajectories {N(l)} for various integer values of the inheritance parameter k over 100 iterations using a logarithmic binning of the transformed values log(1 + N(l)). The probability distribution of occurrences across bins is used to calculate the Shannon entropy H = −Σ_i p_i log₂ p_i [21].
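A sketch of this entropy computation: simulate a trajectory, transform it as log(1 + N(l)), histogram the transformed values, and evaluate H = −Σ p_i log₂ p_i. The bin count, seed, and the exact entropy values obtained are illustrative and depend on the binning choice.

```python
import numpy as np

rng = np.random.default_rng(4)

def trajectory(k, eps=0.1, steps=100, n0=1.0):
    N = np.empty(steps + 1)
    N[0] = n0
    for l in range(steps):
        N[l + 1] = k * N[l] + eps * rng.uniform(-1.0, 1.0)
    return N

def shannon_entropy(values, bins=32):
    """Shannon entropy (bits) of log-binned trajectory values."""
    logged = np.log1p(np.abs(values))            # log(1 + N(l)) transform
    counts, _ = np.histogram(logged, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

for k in (1, 5, 20, 50):
    print(f"k={k}: H ≈ {shannon_entropy(trajectory(k)):.2f} bits")
```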

Interpretation

Figure 7 provides a compelling visualization of how unpredictability—quantified via Shannon entropy—varies with inheritance intensity:
  • Low k regime (e.g., k = 1): Entropy remains low (≈ 4.95 bits), consistent with marginally stable dynamics where the additive noise has only moderate impact and the system remains in a weakly fluctuating state.
  • High k regime (k ≥ 20): Entropy increases sharply and plateaus around 5.62 bits. This reflects a transition into stochastic divergence, where deterministic growth is fast enough to amplify the effect of even small noise, producing a highly disordered trajectory.
These findings validate the bifurcation-based and Lyapunov analysis [17,22,23]: once k > 1 , the deterministic structure expands exponentially, and the presence of even bounded randomness leads to high entropic complexity. Importantly, this plateau also implies a potential maximum entropy under fixed noise amplitude ϵ , supporting the idea that k can act as an entropy-control knob.
This behavior aligns with known principles in secure communication and chaotic encryption [25,29]. Systems with high entropy trajectories exhibit superior diffusion and confusion properties, making the noisy inheritance model applicable to domains requiring controlled unpredictability.

8. Quantum Entropy Extension of the Noisy Inheritance Model

The noisy inheritance model’s sensitivity to initial conditions and its amplification of perturbations, quantified classically by Shannon entropy, suggest a natural extension to quantum systems. These same properties make the model well suited to describing entropy increase in quantum computation and quantum thermodynamics [25,26,30].
To this end, we consider how the model can be interpreted in a quantum setting with the aid of quantum entropy, namely von Neumann entropy:
S(ρ) = −Tr(ρ log ρ),
where ρ is the density matrix for a quantum system encoding the evolving state N ( l ) . This entropy is the quantum analogue of Shannon entropy and carries with it both classical uncertainty and quantum entanglement [29].

Conceptual Mapping

We propose representing the state N(l) as the occupation number or amplitude of a quantum register:
  • The deterministic update N(l+1) = k N(l) can be expressed in terms of a unitary scaling operator (e.g., a quantum walk or controlled-phase shift).
  • The random component η l may be introduced via controlled quantum randomness—either via measurement feedback or noisy channel.
  • The density matrix ρ l produced at each step corresponds to the statistical mixture or entangled evolution of the system [30].

Entropy Measurement and Interpretation

Computing S(ρ_l) at each empathic step l, we can:
  • Track the rise in quantum informational complexity in comparison to classical entropy plots;
  • Identify phase transitions where the quantum dynamics switch from coherence to decoherence;
  • Determine whether entropic saturation occurs under bounded quantum noise, analogous to the Shannon entropy plateau in Figure 7 [25,29].
This correspondence allows the inheritance model to be studied as a quantum dynamical system—providing applications in:
  • Quantum circuit complexity,
  • Modeling decoherence in open quantum systems [30],
  • Quantum chaos and information scrambling [28],
  • Quantum cryptography where entropic unpredictability is vital [27].
This quantum generalization of the classical model opens the door to simulating trust-insensitive or empathic dynamics in quantum communication and computation systems. Future work will apply the model to quantum simulators (e.g., Qiskit, Cirq) and measure entropy via partial trace operations in order to track S(ρ) over time.
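A minimal sketch of such a simulation with Qiskit’s quantum_info tools, assuming a two-qubit circuit in which each empathic step applies an ry rotation and a cry inheritance gate, and a global depolarizing map stands in for the noise channel 𝒩_ϵ; the gate choices and parameters echo Figure 8 but are illustrative rather than the authors’ exact circuit.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import DensityMatrix, partial_trace, entropy

def inheritance_layer(theta):
    """One empathic step: local rotation plus controlled 'inheritance' rotation."""
    qc = QuantumCircuit(2)
    qc.ry(theta, 0)
    qc.cry(theta, 0, 1)
    return qc

depth, eps, theta = 10, 0.1, np.pi / 4
rho = DensityMatrix.from_label("00")
entropies = []
for _ in range(depth):
    rho = rho.evolve(inheritance_layer(theta))
    # Global depolarizing noise: rho -> (1 - eps) rho + eps * I/4.
    rho = DensityMatrix((1 - eps) * rho.data + eps * np.eye(4) / 4)
    # Von Neumann entropy (bits) of qubit 0 after tracing out qubit 1.
    entropies.append(entropy(partial_trace(rho, [1]), base=2))

print(np.round(entropies, 3))
```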

9. Theoretical Computation of von Neumann Entropy via Quantum Chaos Approach

To extend the entropy calculation of our noisy inheritance model to the quantum domain, we reformulate the recursive dynamics
N(l+1) = k N(l) + η_l,   η_l = ϵ · ξ_l,   ξ_l ∼ U(−1, 1)
as a quantum dynamical system driven by deterministic amplification and stochastic perturbation. In this theoretical extension, we use notions of quantum chaos and quantum information theory to derive an entropic approach suited to studying the complexity evolution of the system in Hilbert space [26,28,30].

Quantum-State Representation and Mixed Dynamics

We begin by casting the classical trajectory N ( l ) into a quantum state vector in a discrete Hilbert space:
|ψ_l⟩ = Σ_{i=1}^{n} α_i(l) |i⟩,
where the amplitudes α_i(l) are normalized complex coefficients representing occupation levels over quantized states |i⟩, which correspond to bins or energy-like levels of the system. The deterministic component k N(l) is expressed in terms of a unitary operator Û_k, and the stochastic perturbation η_l through a decohering noise channel 𝒩_ϵ, e.g., a depolarizing or amplitude-damping map [30].
Each time step results in a mixed state in the form of a density matrix:
ρ_l = Σ_j p_j(l) |ψ_j(l)⟩⟨ψ_j(l)|,
corresponding to an ensemble of possible system evolutions with respective probabilities p_j(l) [25,29].

Entropy via Spectral Decomposition

The von Neumann entropy of the system at time step l is:
S(ρ_l) = −Tr(ρ_l log ρ_l),
which can be computed by diagonalizing ρ_l and summing over its eigenvalues {λ_i(l)}:
S(ρ_l) = −Σ_{i=1}^{n} λ_i(l) log λ_i(l).
In the classical limit when ρ l is diagonal and purely stochastic, this reduces to the Shannon entropy [25]. However, in the quantum situation, coherence and entanglement create off-diagonal structure and lead to an increase in entropy due to superposition and interference [27,30].
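A direct numerical rendering of this spectral formula, assuming the density matrix is supplied as a NumPy array:

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """S(rho) = -sum_i lambda_i log lambda_i over the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)            # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]           # drop numerical zeros
    return float(-np.sum(eigvals * np.log(eigvals)) / np.log(base))

# Example: a maximally mixed qubit gives log2(2) = 1 bit.
print(von_neumann_entropy(np.eye(2) / 2))
```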

Connection to Quantum Chaos and Complexity Growth

The entropy dynamics under such combined dynamics is closely related to results on quantum chaos. As shown by Zurek and Paz [30], and more formally proved in recent works on quantum ergodicity and operator spreading, quantum chaotic systems exhibit linear increase of entropy at short times, saturating to a value determined by the effective dimension of the Hilbert space:
S(ρ_l) ≲ log d,
where d is the dimension of the Hilbert space, here a function of the number of occupation bins used in discretizing N ( l ) .
In this context, the entropy plateau of the classical model is the analogue of the quantum entropy saturation limit, where the system is maximally mixed within its accessible subspace.

Theoretical Implications

This theoretical advance implies that:
  • The von Neumann entropy in the quantum formulation of the inheritance model captures both phase coherence loss and statistical disorder;
  • The point of saturation and the entropic rate of increase reflect the trade-off between the quantum noise channel 𝒩_ϵ and the deterministic inheritance parameter k;
  • The model belongs to a family of quantum stochastic processes where dynamical entropy characteristics are evident, with applications in quantum thermodynamics and scrambling [27,30].
Thus, our model connects discrete-time classical chaos and quantum information complexity. It opens up the possibility of future quantum simulation using open-system quantum maps or Lindblad dynamics, where S ( ρ l ) can be directly measured from quantum circuits via partial trace operations or tomography [25,29].

Analysis of the Plot

The provided plot, titled Von Neumann Entropy in Noisy Inheritance Circuit, offers a compelling view into the behavior of quantum information under deterministic operations and depolarizing noise. It illustrates how the von Neumann entropy of a single qubit (after tracing out the second) evolves over a series of “empathic steps” simulating a quantum inheritance process.

Figure Caption

The entropy plot reveals a non-monotonic and oscillatory trajectory, highlighting an intricate interplay between unitary quantum operations and decoherence. Initially increasing, entropy peaks, then dips significantly, and subsequently rises again—demonstrating that quantum state mixedness is shaped by more than cumulative noise [30].
Figure 8. Von Neumann entropy of a single qubit (after partial trace) across 10 empathic steps in a noisy quantum inheritance circuit. Simulation parameters: num_qubits = 2, depth = 10, epsilon = 0.1 (depolarizing noise), and inheritance_angle = π / 4 . Entropy calculated using Statevector, DensityMatrix, and partial_trace in Qiskit.

Interpretation of Phases

Initial Rise (Layers 1–3)

Entropy increases from approximately 0.13 to 0.71 bits due to two effects:
  • Entanglement: The cry (controlled-RY) gate entangles the two qubits; tracing one out results in a mixed state.
  • Decoherence: Depolarizing noise with ϵ = 0.1 contributes additional mixedness [30].
Entropy then drops sharply, reaching near-zero values at layers 6 and 7. Despite ongoing noise, the gate sequence temporarily purifies the subsystem—likely due to specific interference patterns or periodic disentanglement induced by ry and cry gates [30].

Subsequent Rise (Layers 8–10)

Entropy rises again toward 0.70 bits. This suggests that the temporary purifying effect is overtaken by the accumulating impact of noise and quantum gate dynamics.

Implications for Quantum Computing Development

Understanding Quantum Noise and Decoherence

This analysis demonstrates how quantum entropy can be used to:
  • Characterize quantum noise through entropy patterns [30];
  • Inspire new error correction techniques leveraging gate-induced coherence.

Quantum Information Processing and Entanglement

By quantifying entanglement and mixedness via entropy:
  • We gain insights into entanglement as a computational resource [29,30];
  • The results inform protocols for secure quantum communication [25,30].

Quantum Simulation and Modeling Complex Systems

The noisy inheritance circuit mirrors the classical stochastic model developed above [30]:
  • It offers a quantum lens on network self-organization and complexity;
  • Demonstrates how entropy oscillations reflect emergent behavior akin to classical chaos.

Algorithm Design and Optimization

Entropy patterns help optimize algorithm design:
  • Indicate depth levels where pure or mixed states are prevalent;
  • Help design circuits tuned for specific entanglement or coherence properties.
In conclusion, this plot offers a practical bridge between theoretical quantum entropy and real-world quantum circuit behavior—grounding the stochastic inheritance model in experimentally meaningful entropy dynamics.

13. Comparison Between Deterministic and Stochastic Models: Implications for Predictability and Quantum Computing

In this final section, we summarize the key differences between the deterministic inheritance model N(l) = k^l and its stochastic extension N(l+1) = k N(l) + η_l, where η_l = ϵ · ξ_l introduces bounded noise. This comparison not only highlights theoretical properties such as predictability and entropy but also reveals practical implications for emerging fields like quantum information theory and chaos-based cryptography.
Table 4. Comparison of Deterministic and Stochastic Autocatalytic Models.
Aspect | Deterministic Model (N(l) = k^l) | Stochastic Model (N(l+1) = k N(l) + η_l)
Predictability | Fully deterministic with closed-form expression; exact trajectory known at all steps | Predictability is statistical; randomness introduces uncertainty in trajectory
Lyapunov Exponent | λ = log k: exact rate of divergence | λ_eff ≈ log k + (1/2) log(1 + ϵ²k²): includes entropy amplification by noise
Bifurcation Behavior | Sharp transition at k = 1 between decay, stasis, and exponential growth | Complex transitions emerge; noise shifts and softens the bifurcation boundary
Entropy (Shannon) | Low entropy due to predictable evolution | High entropy; saturates near 5.62 bits for large k
Quantum Generalization | Not easily mappable to quantum systems | Quantum analog defined via von Neumann entropy S(ρ); suitable for quantum simulation
Implications for Quantum Computing | Limited relevance to quantum applications | Supports modeling of quantum decoherence, circuit complexity, and entropy saturation in Qiskit
Cryptographic Applications | Poor candidate for encryption due to low unpredictability | Suitable for chaos-based encryption; parameter sensitivity yields strong diffusion/confusion properties
Empirical Robustness | Matches theoretical predictions precisely; implemented in DSI Exodus 2.0 | Robust under noisy conditions; supports entropy-controlled and scalable network propagation
The deterministic model provides analytical clarity and scalability under idealized conditions, serving as a foundational tool for understanding structural propagation. However, the stochastic model captures a richer spectrum of real-world behavior—including entropy growth, bifurcation smoothing, and chaotic transitions. These features make it particularly applicable in simulating quantum systems, analyzing noise resilience, and designing cryptographic schemes [22,25,30].
As demonstrated through Lyapunov and entropy analysis, the stochastic inheritance rule serves not only as a generalization but as a bridge toward understanding quantum information processes and complexity growth. Future research may extend these models into fully quantum-mechanical frameworks, leveraging their sensitivity and unpredictability for practical applications in quantum technologies and decentralized systems [23,24,30].
Our findings provide a deterministic alternative to probabilistic network models, establishing a novel framework for self-organizing systems. Unlike agent-based and stochastic models that rely on probabilistic interactions and emergent coordination [1,5,12], our model introduces a closed-form deterministic propagation law N(l) = k^l. This bridges discrete-time network dynamics with classical theories of autocatalytic growth [4,16], and offers analytical clarity, predictability, and topological simplicity.
Importantly, our deterministic model allows for direct control of growth through the inheritance parameter k, as shown through bifurcation and Lyapunov analysis. In contrast, stochastic models typically require statistical averaging and exhibit noisy, less predictable trajectories. Through our work, we show that this deterministic structure not only matches empirical observations [2,10], but also provides the groundwork for a quantum-compatible framework—via entropy, von Neumann analysis, and quantum noise modeling [22,23,30]. This positions our model as a viable foundation for simulating self-organizing dynamics in quantum information systems and quantum computing [24,30].
Furthermore, the architectural constraint H ( a , v ) = 0 introduces a novel security paradigm. By embedding behavior directly into the network’s structure, we eliminate the need for trust verification or cryptographic keys. This marks a paradigm shift from trust-based to trust-irrelevant networks, aligning with recent advances in distributed, infrastructure-free communication [9,11].

10. Time Series Analysis of the Stochastic Inheritance Model

We now perform a time series analysis of the stochastic inheritance model governed by the recursive relation:
x_{l+1} = k x_l + ϵ ξ_l,   ξ_l ∼ U(−1, 1),
where k > 1 controls deterministic amplification and ϵ > 0 sets the noise amplitude. This model exhibits both exponential growth and randomness, making it suitable for time series characterization techniques. We focus on key statistical features such as stationarity, autocorrelation, and spectral density under various values of k and ϵ .

10.1. Simulated Time Series

We simulate the time series { x l } for 100 iterations using different values of the inheritance parameter k and fixed noise amplitude ϵ = 0.1 . The initial condition is set to x 0 = 1 . For each configuration, we generate 100 realizations and compute the mean trajectory.
Figure 9. Simulated trajectories of the stochastic inheritance model for k = 5 , ϵ = 0.1 . The gray lines show individual realizations; the red line indicates the mean over 100 trials.

11. Autocorrelation Analysis for the Stochastic Inheritance Model

To quantify memory and persistence in the stochastic inheritance model, we analyze the autocorrelation function (ACF) for a single realization of the process:
x_{l+1} = k x_l + ϵ ξ_l,   ξ_l ∼ U(−1, 1),
with parameters k = 5, ϵ = 0.1, and initial condition x_0 = 1. We simulate the process for L = 100 iterations and compute the ACF up to lag 20 using the FFT-based method.
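A sketch of this computation using a Wiener–Khinchin (FFT) estimator of the autocorrelation; the seed is illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
k, eps, L = 5, 0.1, 100

x = np.empty(L)
x[0] = 1.0
for l in range(L - 1):
    x[l + 1] = k * x[l] + eps * rng.uniform(-1.0, 1.0)

def acf_fft(series, nlags):
    """Autocorrelation via FFT of the zero-padded, demeaned series."""
    s = series - series.mean()
    n = len(s)
    f = np.fft.fft(s, n=2 * n)                   # zero-pad to avoid wrap-around
    acov = np.fft.ifft(f * np.conj(f)).real[: nlags + 1] / n
    return acov / acov[0]

print(np.round(acf_fft(x, 20), 3))
```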

Discussion and Analysis

As shown in Figure 10, the autocorrelation decays sharply after just a few lags, indicating a lack of long-term memory in the process. This behavior is consistent with the exponential amplification driven by the deterministic term k x_l, which rapidly overshadows past values. The noise term ϵ ξ_l, although bounded, contributes only marginally to persistence due to the dominance of deterministic growth.
This confirms that for values of k > 1, the system transitions quickly to a regime dominated by growth rather than recurrence or feedback, and the time series becomes essentially nonstationary. These results align with the entropy and Lyapunov-based interpretations developed in earlier sections [31,34].

12. Dynamical Properties: Time Series Analysis with Gaussian Noise

To further investigate the behavior of our stochastic autocatalytic model, x_{l+1} = k x_l + ϵ ξ_l, we performed numerical simulations incorporating Gaussian noise. In this setup, ξ_l is sampled from a standard normal (Gaussian) distribution with a mean of zero and a standard deviation of one, and is scaled by the noise amplitude ϵ [36].

12.1. Simulation Setup

We simulated 100 independent trajectories, each for 50 time steps. The inheritance parameter was set to k = 1.1 , and the noise amplitude was ϵ = 0.1 . All trajectories commenced from an initial value of x 0 = 1.0 .
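A sketch of this Gaussian-noise ensemble (k = 1.1, ϵ = 0.1, x_0 = 1, 50 steps, 100 trajectories); plotting of the individual trajectories is omitted:

```python
import numpy as np

rng = np.random.default_rng(6)
k, eps, steps, n_traj = 1.1, 0.1, 50, 100

x = np.full((n_traj, steps + 1), 1.0)
for l in range(steps):
    xi = rng.standard_normal(n_traj)             # Gaussian noise, mean 0, std 1
    x[:, l + 1] = k * x[:, l] + eps * xi

print("final-step mean:", round(x[:, -1].mean(), 3))
print("final-step std: ", round(x[:, -1].std(), 3))
```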

12.2. Time Series Plot (Figure 11: Stochastic Inheritance Model: Gaussian Noise Trajectories)

The generated plot (Figure 11) displays a selection of 10 representative trajectories.

12.3. Commentary and Analysis

The time series plot reveals several key characteristics of the model’s dynamics under the influence of Gaussian noise:
  • Underlying Exponential Growth: Consistent with the deterministic component of the model (k = 1.1 > 1), all trajectories exhibit a clear trend of exponential growth. This reinforces the autocatalytic nature of the system, where the current state contributes multiplicatively to the subsequent state [33,35].
  • Significant Divergence Due to Gaussian Stochasticity: While the growth trend is shared, the individual trajectories diverge substantially from one another as time progresses. The use of Gaussian noise, which allows for theoretically unbounded deviations (though with decreasing probability), contributes to a potentially wider spread of outcomes compared to strictly bounded uniform noise, especially at later time steps when the k x l term amplifies even small initial noise perturbations. This highlights the inherent unpredictability in the exact future state of any given realization.
  • Path Dependence and Amplification of Noise: The divergence demonstrates a strong path dependence: minor differences introduced by the random noise in early steps are significantly amplified by the exponential growth mechanism, leading to widely disparate values of x ( l ) in later steps. This property is crucial for understanding how small, random fluctuations can lead to large-scale differences in the long-term evolution of self-organizing systems.
  • Implications for Chaos and Complexity: For these parameters, the system does not converge to a bounded attractor; rather, it continuously grows with increasing variability. The ’chaos’ in this context arises not from boundedness and recurrence but from the extreme sensitivity to initial conditions and the stochastic element, leading to a complex and unpredictable ensemble of possible trajectories. This behavior is fundamental to understanding emergent complexity in systems where deterministic growth interacts with random influences.
This analysis visually supports the role of stochasticity in shaping the dynamics of autocatalytic growth, providing empirical context for the theoretical discussions of entropy amplification and network self-organization within our framework.

Conclusion

In this work, we developed and analyzed two autocatalytic models for global network self-organization: a classical deterministic model and its stochastic extension incorporating bounded noise. Both are governed by the inheritance law x l + 1 = k x l , which yields exponential growth when k > 1 . The deterministic model offers a closed-form solution and full predictability, characterized by the Lyapunov exponent λ = log k , making it well-suited for analytical modeling of structural propagation [4,16,24].
However, real-world networks are inherently noisy, and pure determinism fails to account for uncertainty, adaptability, and quantum effects. To bridge this gap, we introduced a stochastic inheritance model with additive bounded noise. The system exhibits a modified Lyapunov exponent:
λ_eff ≈ log k + (1/2) log(1 + ϵ²k²),
capturing how randomness influences divergence, complexity, and entropy [22,23,25]. This formulation demonstrates that even small noise contributions can dramatically increase system disorder, reflecting phenomena analogous to classical chaos and information scrambling [30].
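To make the size of this noise correction concrete, the short helper below simply tabulates the λ_eff expression above for a few (k, ϵ) pairs alongside the deterministic value log k. It evaluates the stated formula only and performs no simulation.

```python
import numpy as np

def lambda_eff(k, eps):
    """Effective Lyapunov exponent from the text: log k + 0.5*log(1 + (eps*k)^2)."""
    return np.log(k) + 0.5 * np.log(1.0 + (eps * k) ** 2)

for k in (1.0, 1.1, 5.0, 20.0):
    for eps in (0.0, 0.1, 0.5):
        print(f"k={k:5.1f}  eps={eps:3.1f}  "
              f"lambda_eff={lambda_eff(k, eps):6.3f}  (log k = {np.log(k):6.3f})")
```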
Using bifurcation diagrams, entropy profiles, and Lyapunov analysis, we showed that the stochastic model maintains scalability while exhibiting critical features of self-organization, such as sensitivity, unpredictability, and entropy saturation. We further proposed a quantum extension of the model based on von Neumann entropy S ( ρ ) , mapping the inheritance dynamics into density matrices and quantum channels. This framework enables simulation of entanglement growth, decoherence, and entropy oscillations in noisy quantum circuits [24,30].
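As a minimal illustration of the quantum-entropy bookkeeping, the snippet below computes S(ρ) = −Tr(ρ log ρ) from the eigenvalues of a density matrix. The encoding of a short classical trajectory as a diagonal density matrix is a hypothetical choice made here for demonstration only; it is not necessarily the channel construction used in the quantum extension.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray, base: float = 2.0) -> float:
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)) / np.log(base))

# Hypothetical encoding: a trajectory segment normalized into a probability
# vector and placed on the diagonal of a density matrix (trace one).
x = np.array([1.0, 1.1, 1.25, 1.4, 1.52])   # e.g. a few steps with k = 1.1
p = x / x.sum()
rho = np.diag(p)
print(von_neumann_entropy(rho))             # entropy in bits
```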
Thus, our contributions go beyond a classical analysis—we provide a unified model that spans deterministic, stochastic, and quantum regimes. By incorporating architectural constraints such as H ( a , v ) = 0 and validating the model in the real-world context of DSI Exodus 2.0 [6,9], we demonstrate both theoretical rigor and practical applicability.
We conclude that the stochastic inheritance model—augmented by quantum formalism—is a powerful candidate for future developments in self-organizing systems, quantum computing, and chaos-based secure communication. Its unique ability to balance structural determinism with entropic unpredictability positions it as a foundational framework for scalable, decentralized, and trust-irrelevant architectures.

Future Work

This study lays a foundational framework for analyzing deterministic and stochastic inheritance models in the context of network self-organization, entropy dynamics, and quantum information theory. Several promising directions for future research emerge from our findings:
  • Full Quantum Simulation: Building upon the von Neumann entropy framework introduced here, future work should develop complete quantum circuit implementations of the noisy inheritance model. Using platforms like Qiskit or Cirq, one can empirically simulate decoherence, entanglement growth, and entropy oscillations under varying values of k and noise amplitude ϵ, validating theoretical predictions via quantum tomography (a minimal starting-point sketch appears at the end of this section).
  • Quantum Entropy Control: We plan to investigate the use of inheritance parameters as entropy control knobs in quantum cryptographic systems. The goal is to design circuits where the parameter k dynamically tunes information scrambling and entropic unpredictability, supporting applications in secure quantum communication and random number generation.
  • Lindblad and Open-System Modeling: A natural extension involves expressing the stochastic inheritance dynamics within the Lindblad master equation framework. This would allow us to formally analyze entropy production, decoherence rates, and steady-state behavior in open quantum systems subject to continuous environmental noise.
  • Algorithmic Implications: The observed entropy patterns suggest new strategies for quantum algorithm design, including layer optimization based on entropy thresholds and identification of decoherence-resilient gate sequences. These could contribute to developing more robust quantum machine learning and distributed quantum computing protocols.
  • Hybrid Classical-Quantum Systems: Our model can serve as a template for hybrid architectures where classical exponential growth drives quantum state evolution. This dual-domain approach could be useful for simulating complex systems such as biological morphogenesis, swarm robotics, or socio-economic dynamics under uncertainty.
  • Application to Crisis Communication and Social Infrastructures: Future research should expand empirical testing of DSI Exodus 2.0 in real-world scenarios. Specifically, the impact of architectural constraints and inheritance laws on network resilience, speed of knowledge propagation, and failure recovery in emergency contexts warrants comprehensive modeling and deployment.
  • Entropy-Based Model Validation: We intend to use entropy measures to benchmark and validate other models of network growth and self-organization, comparing them to our proposed inheritance model under deterministic, stochastic, and quantum regimes. This will help identify universal patterns across disciplines.
These directions aim to deepen the theoretical and practical impact of our work, bridging mathematical modeling, quantum information science, and decentralized system design.
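As a concrete, hedged starting point for the first item above, the sketch below builds a small entangling circuit with Qiskit and tracks the von Neumann entropy of a one-qubit subsystem layer by layer. The encoding of k as a rotation angle is a hypothetical choice, no noise channel is included yet, and this is not the implementation referenced in the Data Availability Statement.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, partial_trace, entropy

def subsystem_entropy(k: float, layers: int = 5) -> list[float]:
    """Entropy (in bits) of qubit 0 after each entangling layer."""
    theta = np.arctan(k)      # hypothetical encoding of k as a rotation angle
    qc = QuantumCircuit(2)
    entropies = []
    for _ in range(layers):
        qc.ry(theta, 0)
        qc.cx(0, 1)
        state = Statevector.from_instruction(qc)   # re-simulate the circuit so far
        rho_0 = partial_trace(state, [1])          # reduced state of qubit 0
        entropies.append(entropy(rho_0, base=2))
    return entropies

print(subsystem_entropy(k=1.1))
```

Adding a depolarizing or amplitude-damping channel between layers would turn this into the decoherence study described above; here the noiseless circuit only illustrates how subsystem entropy can be tracked per layer.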

Data Availability Statement

The theoretical formulations, simulation code, and numerical data used to support the findings of this study—including bifurcation diagrams, entropy computations, and quantum circuit simulations—are available from the corresponding author upon reasonable request. In particular, the development of quantum computing applications based on the stochastic inheritance model requires further evaluation of entropy dynamics and sensitivity to noise parameters. To facilitate future research in this direction, we encourage replication and extension of our entropy-based analyses, including both classical Shannon entropy and quantum von Neumann entropy calculations. The Qiskit-based implementation of the quantum version of the model is also available for researchers interested in modeling decoherence, entanglement, and entropy growth in quantum circuits. These resources aim to promote transparency and foster continued investigation into stochastic models as foundational components of quantum-aware network self-organization.

Acknowledgments

The author thanks Andrey Tyurikov for technical insight, Sergey Shchegolkov for interface contributions, and Olga Cherkashina and Alexandra Glukhova for their support throughout this research.

Conflicts of Interest

The author declares that there is no conflict of interest regarding the publication of this work.

References

  1. Barabási, A.-L. and Albert, R. (1999). Emergence of scaling in random networks. Science, 286(5439), 509–512.
  2. Boldi, P. and Rosa, M. (2012). Four degrees of separation. In Proceedings of the 4th Annual ACM Web Science Conference, 33–42.
  3. Dunbar, R. I. (1992). Neocortex size as a constraint on group size in primates. Journal of Human Evolution.
  4. Eigen, M. (1971). Self-organization of matter and the evolution of biological macromolecules. Naturwissenschaften.
  5. Erdős, P. and Rényi, A. (1960). On the evolution of random graphs. Publications of the Mathematical Institute of the Hungarian Academy of Sciences, 5, 17–61.
  6. DSI Exodus 2.0 Technical Documentation (2024). Available at: https://github.com/exodus-social/organizer.
  7. Hirsch, M. W., Smale, S., and Devaney, R. L. (2013). Differential Equations, Dynamical Systems, and an Introduction to Chaos.
  8. Kauffman, S. A. (1993). The Origins of Order: Self-Organization and Selection in Evolution.
  9. Lubalin, A. (2024). Discovery of the law of autocatalytic inevitability: A new natural law governing network self-organization. Zenodo.
  10. Milgram, S. (1967). The small-world problem. Psychology Today.
  11. Nakamoto, S. (2008). Bitcoin: A peer-to-peer electronic cash system. Available at: https://bitcoin.org/bitcoin.pdf.
  12. Newman, M. E. (2003). The structure and function of complex networks. SIAM Review.
  13. Ostrom, E. (1990). Governing the Commons: The Evolution of Institutions for Collective Action.
  14. Shmidel, F. (1999). The Metaphysics of Meaning.
  15. Shmidel, F. (2012). Will to Joy.
  16. Simon, H. A. (1962). The architecture of complexity. Proceedings of the American Philosophical Society.
  17. Strogatz, S. H. (2001). Nonlinear Dynamics and Chaos.
  18. Tomasello, M. (2009). Why We Cooperate.
  19. Vicsek, T., Czirók, A., Ben-Jacob, E., Cohen, I., and Shochet, O. (1995). Novel type of phase transition in a system of self-driven particles. Physical Review Letters, 75(6), 1226–1229.
  20. Watts, D. J. and Strogatz, S. H. (1998). Collective dynamics of ‘small-world’ networks. Nature.
  21. Rafik, Z. and Salas, H. A. (2024). Chaotic dynamics and zero distribution: implications and applications in control theory for Yitang Zhang's Landau Siegel zero theorem. European Physical Journal Plus.
  22. Strogatz, S. H. (2018). Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering.
  23. Yu, S., Cheng, Z., Yu, Y., and Wu, Z. (2021). Adaptive chaos synchronization of uncertain unified chaotic systems with input saturation. 104(2), 1555–1570.
  24. Rafik, Z., Salas, A. H., and Souad, A. (2025). Chaotic dynamics derived from the Montgomery conjecture: Application to electrical systems. In Dynamical Systems - Latest Developments and Applications [Working Title]. IntechOpen.
  25. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal.
  26. Jaynes, E. T. (1957). Information theory and statistical mechanics. Physical Review.
  27. Tsallis, C. (1988). Possible generalization of Boltzmann–Gibbs statistics. Journal of Statistical Physics.
  28. Gell-Mann, M. and Lloyd, S. (1996). Information measures, effective complexity, and total information. Complexity, 2(1), 44–52.
  29. Cover, T. M. and Thomas, J. A. (2006). Elements of Information Theory.
  30. Zurek, W. H. (2003). Decoherence, einselection, and the quantum origins of the classical. Reviews of Modern Physics.
  31. Box, G. E. P. and Jenkins, G. M. (1970). Time Series Analysis, Forecasting, and Control.
  32. Box, G. E. P., Jenkins, G. M., and Reinsel, G. D. (1993). Time Series Analysis: Forecasting and Control. Prentice Hall, Englewood Cliffs, NJ.
  33. Chatfield, C. (1996). The Analysis of Time Series: An Introduction.
  34. Fuller, W. A. (1996). Introduction to Statistical Time Series.
  35. Gaynor, P. E. and Kirkpatrick, R. C. (1994). Introduction to Time-Series Modeling and Forecasting in Business and Economics.
  36. Shumway, R. H. and Stoffer, D. S. (2000). Time Series Analysis and Its Applications.
Figure 1. Bifurcation behavior of the deterministic inheritance system. We plot network size after 10 iterations, x_10 = k^10, for 500 evenly spaced values of k ∈ [0.5, 2.0]. The critical bifurcation point at k = 1 (red dashed line) separates exponential decay (k < 1) from exponential growth (k > 1).
Figure 2. Lyapunov exponent λ = log(k) for the discrete inheritance system x_{l+1} = k x_l, evaluated for 500 values of k ∈ [0.5, 2.0]. The red dashed line at λ = 0 corresponds to the critical bifurcation point k = 1, which separates contraction from expansion in network dynamics.
Figure 3. Average trajectory of the noisy inheritance model N(l+1) = k N(l) + η_l over 100 iterations and 50 realizations for integer values k = 1, 5, 20, 50, 100, 145, with noise amplitude ϵ = 0.1 and initial value N(0) = 1. For k = 1, the system remains bounded and fluctuates due to stochastic perturbations. As k increases, the average trajectory rapidly diverges, indicating that deterministic inheritance dominates noise and drives unbounded growth.
Figure 4. Phase portraits for the noisy inheritance model N(l+1) = k N(l) + η_l for increasing values of k, with fixed noise amplitude ϵ = 0.1 and initial condition N(0) = 1. Each subplot shows the trajectory N(l) over time for 100 iterations. As k increases, trajectories rapidly diverge due to exponential amplification of inherited structure, overshadowing the additive noise.
Figure 5. Bifurcation Diagram: Noisy Exponential Growth Model N(l) = k^l + η_l, where η_l ∼ U(−ϵ, ϵ) and ϵ = 0.1. Simulated with 30,000 iterations, displaying the last 100 values of N(l) for each k ∈ [0.5, 4.0] sampled over 500 inheritance parameter steps.
Figure 6. Bifurcation Diagram: Noisy Exponential Growth Model N(l) = k^l + η_l, where η_l ∼ U(−ϵ, ϵ) and ϵ = 0.1. Simulated with 30,000 iterations and plotting the last 100 values of N(l) for all k ∈ [0.05, 0.9] using 500 inheritance parameter steps.
Figure 7. Shannon entropy of the normalized noisy inheritance trajectory as a function of inheritance factor k ∈ {1, 5, 20, 50, 100, 145}, computed over 100 steps with noise amplitude ϵ = 0.1. Entropy rises rapidly with increasing k, stabilizing around 5.62 bits once the system escapes the low-variability regime. The high-entropy plateau indicates maximized unpredictability driven by additive noise during rapid deterministic growth.
Figure 10. Autocorrelation function (ACF) of the stochastic inheritance process with parameters k = 5, ϵ = 0.1, and initial value x_0 = 1, simulated over L = 100 iterations. The ACF decays rapidly, reflecting weak temporal dependence due to exponential growth.
Figure 11. Time series showing 10 sample trajectories of the stochastic inheritance model with Gaussian noise (k = 1.1, ϵ = 0.1).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.