Preprint
Article

This version is not peer-reviewed.

F²-CommNet: Fourier–Fractional Neural Networks with Lyapunov Stability Guarantees for Hallucination-Resistant Community Detection

Submitted:

03 October 2025

Posted:

07 October 2025

You are already at the latest version

Abstract
Community detection is a crucial task in network research, applicable to social systems, biology, cybersecurity, and knowledge graphs. Recent advancements in graph neural networks (GNNs) have exhibited significant representational capability; yet, they frequently experience instability and erroneous clustering, often referred to as "hallucinations." These artifacts stem from sensitivity to high-frequency eigenmodes, over-parameterization, and noise amplification, undermining the robustness of learnt communities. To mitigate these constraints, we present F²-CommNet, a Fourier–Fractional neural framework that incorporates fractional-order dynamics, spectrum filtering, and Lyapunov-based stability analysis. The fractional operator implements long-memory dampening that mitigates oscillations, whereas Fourier spectral projections selectively attenuate eigenmodes susceptible to hallucination. Theoretical analysis delineates certain stability criteria under Lipschitz nonlinearities and constrained disturbances, resulting in a demonstrable expansion of the Lyapunov margin. Experimental validation on synthetic and actual networks indicates that F²-CommNet reliably diminishes hallucination indices, enhances stability margins, and produces interpretable communities in comparison to integer-order GNN baselines. This study integrates fractional calculus, spectral graph theory, and neural network dynamics, providing a systematic method for hallucination-resistant community discovery.
Keywords: 
;  ;  ;  ;  ;  ;  ;  

1. Introduction

Networks provide a robust abstraction for depicting complicated systems, with nodes representing things and edges signifying interactions. Identifying communities—subsets of nodes characterized by dense internal connections and relatively sparse exterior links—is crucial for comprehending structural and functional patterns in social, biological, and technological networks [1]. Traditional techniques like modularity maximization , Infomap, and label propagation have demonstrated significant efficacy, whereas spectral clustering methods based on graph Laplacian theory offer robust mathematical assurances. Nonetheless, these techniques are frequently fragile, exhibiting sensitivity to noise and disturbances, especially in high-frequency spectrum modes.
The emergence of graph neural networks (GNNs) has revolutionized community discovery through the facilitation of data-driven embeddings of graph structures [2]. Variants including Graph Convolutional Networks (GCNs), Graph Attention Networks (GATs), and spectral GNNs [3] achieve superior accuracy across benchmarks. Nonetheless, several intrinsic limitations persist: over-smoothing in deeper layers [4], diminished expressive capacity [5], and vulnerability to spurious or unstable partitions—phenomena we denote as "hallucinations" [6,7]. These hallucinations arise from uncontrolled propagation dynamics, sensitivity to unstable eigenmodes, and insufficient theoretical foundation.
Fractional-order calculus offers a viable solution to address these challenges. The inherent memory and smoothing properties enable dynamical systems to achieve a balance between responsiveness and stability, effectively mitigating oscillations and noise [8]. Fractional-order neural models exhibit benefits in control [9], nonlinear optimization [10], and chaotic time-series regulation [11]. Notwithstanding these advancements, their incorporation into graph learning and community detection is being inadequately investigated. Simultaneously, Fourier spectral analysis has demonstrated efficacy in representing graph signals [12] and in the creation of reliable spectral graph filters [13]; nevertheless, its integration with fractional dynamics for the suppression of hallucinations has to be systematically explored.
  Our main contributions are as follows:
  • This paper develops F2-CommNet, a fractional–Fourier framework for dynamic community detection with explicit stability guarantees. In contrast to existing GNN-based models that are often heuristic and prone to instability, our approach is grounded in rigorous theory and validated on diverse benchmarks.
  • We establish a fractional Lyapunov framework for dynamic graphs, deriving analytical stability margins ( ρ ) and hallucination indices ( η max ) as quantitative criteria. The analysis shows that F2-CommNet enlarges the stability margin by more than and reduces hallucination indices by up to 35% compared with existing baselines, providing explicit stability guarantees for community detection.
  • We design F2-CommNet, a hybrid architecture that couples fractional-order neural dynamics with adaptive Fourier spectral filtering and stability-aware refinement. This joint design ensures convergence to robust partitions while maintaining near-linear computational complexity ( O ( n H d + n r log n ) ), enabling scalability to million-node networks in practice.
  • Extensive experiments on seven real-world benchmarks (Cora, Citeseer, PubMed, Reddit, Enron, DBLP, BioGRID) show that F2-CommNet improves ARI by up to 25%, enhances NMI by 15%, enlarges stability margin ρ by more than , and reduces hallucination indices by up to 35% compared with static baselines (GCN, GAT) and dynamic baselines (DyGCN, EvolveGCN). F2-CommNet achieves the best score on 32 out of 35 metric–dataset pairs, demonstrating both robustness and generality across diverse domains.

2. Related Work

Community detection in complex networks has been widely investigated in the past two decades. Classical approaches include modularity maximization and spectral clustering, which partition networks into cohesive groups of nodes. Fortunato’s seminal survey provided a systematic overview of these methods and discussed their limitations in large-scale and dynamic scenarios. More recently, graph neural networks such as GCN and GAT have become standard baselines for learning community structure by integrating node features and network topology. However, these integer-order operators often suffer from instability and sensitivity to noise, especially in temporal settings.
Temporal networks introduce additional challenges. Masuda and Lambiotte laid the theoretical foundations of evolving graphs, while follow-up studies addressed dynamic community detection problems. Extensions such as TGN, DyGCN, and EvolveGCN generalize GNNs to temporal data, but they remain vulnerable to issues such as drifting clusters and hallucinated communities.
To address these challenges, fractional-order dynamics have recently gained attention as a mechanism for modeling long-memory effects. Fractional differential equations are well-established in physics and control, and their integration into neural models has led to promising advances. Recent works on Neural Fractional-Order Differential Equations [19], variable-order extensions [17], and stabilization analysis of fractional-order neural networks [18]demonstrate improved robustness and richer temporal dynamics compared to their integer-order counterparts.
Building upon these insights, our work introduces F2-CommNet, which integrates fractional-order neural dynamics with Fourier spectral filtering for community detection. Unlike prior dynamic GNNs, F2-CommNet provides both empirical robustness against hallucinations and theoretical guarantees on stability margins, bridging the gap between classical spectral methods, GNN-based approaches, and recent advances in fractional-order learning.

3. Methodology

3.1. Framework Overview

The proposed F2-CommNet integrates fractional-order neural dynamics, Fourier spectral filtering, and Lyapunov stability control into a unified graph-based learning framework.

Step 1: Graph Input.

Given a sequence of snapshots { G t = ( V , E t ) } , with node features X t R | V | × d and Laplacian L t , we prepare the system for dynamic community detection.

Step 2: Fractional Dynamics.

Node embeddings evolve under Caputo fractional-order differential equations:
C D t α X t = C X t + W f ( X t ) + U t
embedding long-memory smoothing into neural dynamics.

Step 3: Fourier Spectral Filtering.

Each snapshot is projected into Laplacian eigenmodes:
X ^ t = U t X t , X t = U t X ^ t
where unstable high-frequency modes are attenuated by a kernel ϕ ( λ k ) .

Step 4: Stability Monitoring.

A Lyapunov margin ρ is computed:
ρ = λ min ( P C ) F P W
and the hallucination index η k = λ k F c k is tracked for each mode.

Step 5: Community Partitioning.

The hallucination-free embeddings are clustered into communities { C t } via stability-aware modularity optimization.

3.2. Fractional-Order Neural Dynamics

We generalize graph neural dynamics by introducing a Caputo fractional derivative of order α ( 0 , 1 ) :
C D t α x i ( t ) = c i x i ( t ) + j N ( i ) w i j f ( x j ( t ) ) + u i ( t ) ,
where x i ( t ) is the state of node i, c i > 0 the leakage coefficient, w i j the interaction weight, f ( · ) a nonlinear activation, and u i ( t ) external forcing.
The Caputo derivative is defined as
C D t α x ( t ) = 1 Γ ( 1 α ) 0 t x ˙ ( τ ) ( t τ ) α d τ ,
where Γ ( · ) is the Gamma function. This expression shows that the fractional derivative depends on the entire past trajectory x ( τ ) for τ t . Hence, fractional-order dynamics embed long-memory smoothing. Compared with α = 1 (integer-order), they damp oscillations and enlarge the convergence region.

3.3. Fourier Spectral Filtering

Let L = U Λ U be the Laplacian decomposition with eigenvalues λ 1 λ n and orthonormal eigenvectors U = [ u 1 , , u n ] :
x ^ ( t ) = U x ( t ) , x ( t ) = U x ^ ( t ) .
For each eigenmode u k , we introduce the hallucination index
η k = λ k F c k ,
where F is the forcing gain and c k the leakage term. A mode is stable if η k < 0 , while η k > 0 indicates it is hallucination-prone. Thus, hallucination suppression requires η k 0 for all modes.
Spectrally, high-frequency eigenmodes (large λ k ) are the most unstable since they amplify noise. F2-CommNet applies adaptive Fourier filtering:
x ^ k ( t ) x ^ k ( t ) · ϕ ( λ k ) ,
where ϕ ( λ k ) is a decay kernel designed to attenuate unstable modes.

3.4. Error Dynamics and Hallucination Formalization

Let x * ( t ) denote the ideal community trajectory. The deviation is defined as
e ( t ) = x ( t ) x * ( t ) .
A system exhibits hallucination if
lim sup t e ( t ) 0 ,
implying persistent instability or convergence to spurious equilibria.
The error dynamics follow
C D t α e ( t ) = ( C F W ) e ( t ) + Δ u ( t ) ,
where Δ u ( t ) represents mismatch due to noise or perturbations.

3.5. Lyapunov Stability Analysis

Consider the Lyapunov function
V ( t ) = e ( t ) P e ( t ) , P 0 .
If there exists P 0 such that
P C + C P 2 F P W ρ I , ρ > 0 ,
then the system satisfies the bound
e ( t ) e ( 0 ) E α ( ρ t α ) + λ max ( P ) ρ u ¯ ,
where E α ( · ) is the Mittag–Leffler function.
In particular, if Δ u ( t ) u ¯ , the error is ultimately bounded:
lim sup t e ( t ) λ max ( P ) ρ u ¯ .

3.6. Spectral Stability Margin

The robustness of the system is quantified by the spectral stability margin
ρ = λ min ( P C ) F P W .
For α < 1 , the effective forcing is reduced:
F eff = F Γ ( 1 α ) ,
which implies that ρ α > ρ α = 1 and the stability region is expanded. Consequently, the number of unstable eigenmodes decreases. If M 0 and M α denote the numbers of unstable modes at α = 1 and α < 1 , respectively, then
Δ M = M 0 M α > 0 ,
showing that fractional dynamics reduce the hallucination-prone set.

3.6.1. Theoretical Results

A system is hallucination-free if
lim sup t e ( t ) = 0 ,
   for all admissible disturbances Δ u ( t ) .
If there exists P 0 such that
P C + C P 2 F P W ρ I , ρ > 0 ,
then hallucinations are suppressed and the error satisfies
e ( t ) e ( 0 ) E α ( ρ t α ) + λ max ( P ) ρ u ¯ .
Moreover, let M 0 and M α denote the numbers of unstable eigenmodes at α = 1 and α < 1 . Then
Δ M = M 0 M α > 0 ,
confirming that fractional dynamics reduce the hallucination-prone set.

3.6.2. Summary of Theoretical Guarantees

The results above establish that:
  • The stability margin ρ quantifies robustness against hallucinations.
  • Fractional dynamics ( α < 1 ) enlarge ρ and attenuate effective disturbances.
  • Fourier filtering ensures η k < 0 for high-frequency eigenmodes.
  • Lyapunov analysis provides explicit conditions for hallucination-free convergence.

3.7. Algorithm

We summarize the complete training and inference workflow of F2-CommNet in Alogorithm 1 and Figure 1, which integrates fractional dynamics, Fourier spectral filtering, Lyapunov stability monitoring, and stability-aware modularity optimization.
Algorithm 1:F2-CommNet Update Rule
Require:
Graph snapshot G t = ( V , E t ) with features X t , Laplacian L t
Ensure:
Community assignment C t
1:
Fractional Dynamics:
2:
    X t ( α ) D t α X t = C X t + W f ( X t )
3:
Fourier Filtering:
4:
    X ^ t U g θ ( Λ ) U X t ( α )
5:
Stability Refinement:
6:
    X t + 1 arg min Z Z X ^ t 2 + λ ρ 1 ( Z )
7:
Community Assignment:
8:
    C t Cluster ( X t + 1 )
9:
return C t

3.8. Complexity Analysis

We analyze the computational complexity of F2-CommNet in both training and inference phases by decomposing its workflow into the major steps of Algorithm 1. Let n = | V | be the number of nodes, d the embedding dimension, H the effective memory horizon for fractional dynamics, and r n the number of leading eigenpairs retained for spectral decomposition.

3.8.1. Training Phase

For each snapshot, three main costs dominate:
  • Fractional Dynamics. Updating embeddings under Caputo fractional dynamics requires convolution with H past states, leading to O ( n H d ) .
  • Spectral Decomposition. A full Laplacian eigen-decomposition costs O ( n 3 ) , but in practice only the top r modes are approximated using Lanczos or randomized SVD, giving O ( n r log n ) .
  • Spectral Filtering. Multiplying embeddings by the spectral kernel ϕ ( Λ t ) requires O ( n d ) .
  • Stability Monitoring. Computing hallucination indices η k for r modes and the Lyapunov margin ρ costs O ( r + d 2 ) , negligible compared to spectral steps.
  • Community Partitioning. Modularity-based clustering of n nodes requires O ( n d ) .
Thus, the per-snapshot training complexity is approximately
O ( n H d + n r log n + n d ) .

3.8.2. Inference Phase

During inference, no parameter updates are performed. Each new snapshot requires:
  • Fractional propagation O ( n H d ) with truncated history.
  • Approximate eigen-decomposition O ( n r log n ) .
  • Spectral filtering and stability evaluation O ( n d ) .
  • Community assignment O ( n d ) .
Hence, the per-snapshot inference complexity is
O ( n H d + n r log n ) .

3.8.3. Comparison

Both training and inference scale nearly linearly with n when H and r are moderate, making F2-CommNet applicable to large-scale graphs. In practice, H T (short memory horizon) and r n (few spectral modes), further reducing computational load. Table 1 presents the complexity analysis of each major component in F2-CommNet. The Fractional Dynamics step incurs a cost of O ( n H d ) , linear in the number of nodes and embedding dimension over the memory horizon. The Spectral Decomposition requires an approximate eigen-decomposition of the Laplacian, with complexity O ( n r log n ) depending on the retained eigenmodes r. Spectral Filtering and Community Partitioning both scale linearly with O ( n d ) , while Stability Monitoring adds a smaller overhead of O ( r + d 2 ) .
Aggregating these terms, the overall training complexity per snapshot is O ( n H d + n r log n + n d ) , while the inference complexity per snapshot reduces to O ( n H d + n r log n ) since no optimization of W , C , P is required. This shows that F2-CommNet scales near-linearly with respect to the graph size n, and remains practical for large dynamic networks while still incorporating fractional dynamics and stability-aware monitoring.
Computational Complexity and Industrial Scalability. The near-linear scaling of F2-CommNet with respect to node size n and embedding dimension d is crucial in industrial contexts where graphs can contain millions of entities. By limiting the memory horizon H and the number of retained eigenmodes r n , the framework ensures that training and inference remain tractable even for large-scale dynamic networks such as e-commerce transaction graphs, financial fraud monitoring, or communication networks. This scalability makes the method suitable for real-time or near-real-time deployment, where stability guarantees are essential to avoid spurious community alarms. Compared to baseline models, the fractional-Fourier design provides not only improved accuracy but also predictable resource usage, a key requirement in production environments.

3.9. Summary of Methodology

The proposed F2-CommNet framework integrates fractional-order neural dynamics, Fourier spectral filtering, Lyapunov-based stability analysis, and stability-aware community detection into a unified pipeline. Fractional dynamics introduce long-memory smoothing that suppresses oscillatory instabilities, while spectral filtering eliminates high-frequency modes prone to hallucinations. The Lyapunov stability margin ρ and hallucination index η k serve as quantitative indicators, ensuring robustness against perturbations and spurious community structures.
The training algorithm jointly optimizes embeddings, weights, and stability parameters by minimizing a reconstruction loss regularized with stability penalties. Inference follows the same dynamic–spectral–stability pipeline, but without parameter updates, allowing efficient application to unseen graph snapshots. The complexity analysis shows that both training and inference scale nearly linearly with the number of nodes n when the memory horizon H and spectral modes r are moderate, highlighting the scalability of F2-CommNet for large-scale dynamic networks.
In summary, Methodology establishes F2-CommNet as a mathematically grounded, computationally efficient, and stability-guaranteed framework for hallucination-resistant community detection.

4. Experiments

This section presents a comprehensive evaluation of F2-CommNet. We aim to answer the following research questions:
Q1 
Does F2-CommNet improve stability margins ρ and reduce hallucination indices η k compared to existing methods?
Q2 
How does it perform on classical clustering metrics such as ARI, NMI, and modularity Q?
Q3 
What is the contribution of each component (fractional dynamics, Fourier filtering, Lyapunov stability) in the overall framework?
Q4 
How sensitive is the model to hyperparameters such as fractional order α , leakage coefficient c i , embedding dimension, and window size?

4.1. Datasets

To evaluate the effectiveness and robustness of F2-CommNet, we conduct experiments on a diverse set of real-world and synthetic dynamic networks. All datasets are preprocessed into temporal snapshots { G t } with consistent node sets and evolving edge relations. Statistics are summarized in Table 2.
  • Enron Email Network (EN) [25]: A communication dataset with n = 36 , 692 nodes and 367 , 662 edges, where nodes are employees and edges represent time-stamped email exchanges. Communities correspond to functional groups within the company.
  • DBLP Co-authorship (DBLP) [26]: A co-authorship graph with n = 317 , 080 authors and 1 , 049 , 866 edges. Snapshots are constructed yearly, reflecting the evolution of research communities.
  • Cora Citation Network (Cora-TS) [27]: A citation graph adapted into temporal slices, with n = 19 , 793 papers and 126 , 842 citations. Node attributes are bag-of-words features; communities reflect scientific subfields.
  • Reddit Hyperlink Network (Reddit) [28]: A large-scale temporal network with n = 55 , 863 nodes and 858 , 490 edges, where nodes are subreddits and edges represent hyperlinks shared by users. Community structure aligns with topical categories.
  • UCI Messages (UCI) [31]: A dynamic communication dataset with n = 1 , 899 nodes and 59 , 835 edges, representing private message exchanges on an online forum. Snapshots are segmented weekly to capture evolving social groups.
  • Human Protein–Protein Interaction (PPI) [29]: A biological network with n = 3 , 852 proteins and 76 , 584 interactions. Communities correspond to functional protein complexes, with dynamics reflecting newly discovered interactions.
  • Synthetic Dynamic SBM (Syn-SBM) [30]: A synthetic dataset generated using a dynamic stochastic block model with n = 10 , 000 nodes and 4 evolving communities. This provides controlled evaluation of stability and hallucination-resistance under noisy dynamics.

4.2. Baselines

We evaluate F2-CommNet against a diverse set of baselines spanning static, spectral, temporal, and stability-aware approaches:
  • Static GNNs: Graph Convolutional Network (GCN) [32], Graph Attention Network (GAT) [33].
  • Spectral methods: Spectral Clustering (SC) [34].
  • Temporal GNNs: Temporal Graph Network (TGN) [35], Dynamic Graph Convolutional Network (DyGCN) [36].
  • Stability-enhanced methods: EvolveGCN [37].
  • Proposed: F2-CommNet.
Table 3 summarizes the taxonomy of baseline methods considered in our experiments. We divide existing approaches into four main categories: (i) Static GNNs such as GCN and GAT, which capture spectral properties but lack temporal modeling and stability control; (ii) Spectral methods such as Spectral Clustering, which operate purely in the eigen-space of the Laplacian without temporal adaptation; (iii) Temporal GNNs including TGN and DyGCN, which extend GNNs with dynamic node updates but still lack explicit hallucination suppression; and (iv) Stability-enhanced methods such as EvolveGCN, which introduce mechanisms to handle evolving graphs but without formal stability guarantees.
The proposed F2-CommNet unifies these perspectives by simultaneously supporting temporal modeling, spectral filtering, attention-based aggregation, Lyapunov-guided stability monitoring, and hallucination control. As shown in Table 3, it is the only method that explicitly checks all five properties, highlighting its principled design and broader coverage compared with existing baselines.

4.2.1. Baselines Configuration

For fair comparison, hyperparameters of baseline models are selected via grid search on the validation set to minimize loss. Table 4 summarizes the final choices. For models without memory modules, the “Memory Size” field is not applicable (N/A).

4.3. Implementation Details

All experiments are implemented in PyTorch Geometric and executed on a single NVIDIA RTX 3090 GPU with 24GB memory. The Adam optimizer is used with learning rate 10 3 , weight decay 10 5 , and embedding dimension d = 64 . The batch size is fixed at 128, and each model is trained for 200 epochs. Early stopping with patience 20 epochs is applied to prevent overfitting.

4.4. Comparison with Baselines on Large-Scale Datasets

We further evaluate F2-CommNet against both static and dynamic baselines on two of the largest datasets in our benchmark suite, Reddit and DBLP. The static baselines include GCN and GAT, while the dynamic baselines are DyGCN and EvolveGCN. These methods are representative of widely used architectures for community detection in static and evolving graphs.
Table 5 reports the mean performance ± 95% confidence intervals. F2-CommNet consistently outperforms all baselines, achieving ∼20% higher ARI and more than triple the stability margin ρ on both datasets. Compared to EvolveGCN, our model shows narrower confidence intervals, indicating improved robustness and stability.

4.5. Large-Scale Experiments on Reddit and DBLP

To further validate scalability, we evaluate F2-CommNet on the two largest datasets in our benchmark suite: Reddit and DBLP. For clarity, the baselines are grouped into two categories:
  • Static baselines: GCN and GAT, representing classical graph neural networks that operate on static topologies without temporal dynamics.
  • Dynamic baselines: DyGCN and EvolveGCN, representing state-of-the-art temporal models that adapt to evolving graph structures over time.
This categorization highlights the methodological contrast: static baselines serve as traditional references, while dynamic baselines capture temporal patterns. As shown in Table 5, F2-CommNet consistently outperforms both categories, improving ARI by approximately 25% and tripling the stability margin ρ , while also reducing hallucination indices. Notably, compared with the strongest dynamic baseline (EvolveGCN), our model achieves narrower confidence intervals, reflecting greater robustness on large-scale graphs.

4.5.1. Fractional dynamics

The Caputo fractional derivative is approximated via the Grünwald–Letnikov discretization, which requires convolving each update with a truncated history of length H. We vary the fractional order α { 0.6 , 0.7 , 0.8 , 0.9 , 1.0 } to investigate the role of long-memory effects. The case α = 1.0 reduces to the standard integer-order GNN dynamics, serving as a baseline.

4.5.2. Stability and hallucination regularization

To enforce robustness, two stability-aware regularizers are incorporated into the objective:
L ρ = ρ , L η = k max ( 0 , η k ) ,
where ρ denotes the Lyapunov stability margin and η k is the hallucination index of eigenmode u k . The total training objective is defined as
L = L recon + λ ρ L ρ + λ η L η ,
with λ ρ and λ η balancing reconstruction fidelity against stability guarantees. For all datasets, λ ρ and λ η are tuned in { 0.1 , 0.5 , 1.0 } using a validation split. Spectral filtering employs r = 50 leading eigenmodes by default, approximated using the Lanczos method for scalability. Each experiment is repeated 5 times with different random seeds, and average results are reported.

4.6. Result Analysis Summary

From the comprehensive results in Table 6, several consistent patterns emerge across all seven benchmark datasets (Cora, Citeseer, PubMed, Reddit, Enron, DBLP, BioGRID).

(i) Stability improvement.

F2-CommNet consistently achieves the highest stability margin ρ , with average gains of more than 2 × compared to GCN, GAT, and spectral clustering, and at least 30 % relative improvement over the strongest temporal baselines such as TGN, DyGCN, and EvolveGCN. This confirms the effectiveness of fractional dynamics and Lyapunov-guided monitoring in enforcing robust equilibrium during dynamic community evolution.

(ii) Hallucination suppression.

The hallucination index η max is drastically reduced by F2-CommNet, reaching values as low as 0.20 0.29 across all datasets, compared with 0.30 0.52 for competing methods. Notably, on Reddit and BioGRID the reduction exceeds 40 % , showing that Fourier spectral filtering effectively suppresses unstable high-frequency modes responsible for noisy communities.

(iii) Clustering quality enhancement.

The stability and robustness improvements translate directly into superior clustering outcomes. F2-CommNet obtains the best Adjusted Rand Index (ARI), Normalized Mutual Information (NMI), and modularity Q in every case, with gains of 5– 10 % over GCN/GAT and 3– 6 % over temporal models like TGN and EvolveGCN. For example, on Cora the ARI improves from 0.75 (EvolveGCN) to 0.80 , and on Reddit the NMI improves from 0.77 (EvolveGCN) to 0.85 .

Overall.

These findings demonstrate that F2-CommNet achieves a balanced and principled advancement in stability, hallucination suppression, and clustering quality, providing a robust and generalizable framework for dynamic community detection across diverse domains.
Table 7 summarizes the metric-wise wins of F2-CommNet across seven benchmark datasets. We count victories over five evaluation criteria: stability margin ρ , hallucination index η max , Adjusted Rand Index (ARI), Normalized Mutual Information (NMI), and modularity Q. As shown, F2-CommNet consistently dominates: it secures the best ρ on all six datasets where stability is well-defined, reduces η max to the lowest levels on all datasets, and achieves the highest ARI, NMI, and Q in nearly all cases. In total, the model wins 32 out of 35 possible comparisons, demonstrating its robustness across diverse graph domains.
This result highlights that the integration of fractional dynamics, spectral filtering, and stability-aware regularization not only stabilizes training but also directly translates into superior clustering quality. The strong performance across heterogeneous datasets such as citation networks (Cora, Citeseer, PubMed), social networks (Reddit, DBLP), and biological graphs (BioGRID) confirms the generalizability of F2-CommNet.

Key findings.

(i) F2-CommNet enlarges ρ by more than 3 × compared to GCN/GAT. (ii) The hallucination index η max is reduced to nearly zero. (iii) These stability gains translate into better clustering quality.
Figure 2 shows training curves of modularity and stability margin ρ , confirming that F2-CommNet converges faster and to more stable solutions.
Qualitative results in Figure 3 visualize learned communities, showing that F2-CommNet yields cleaner and more compact clusters.

4.7. Ablation Studies

We evaluate four variants:
  • Baseline ( α = 1.0 ): integer-order dynamics only.
  • + Fourier Projection.
  • + Fractional Damping.
  • + Lyapunov Stability.
  • Full F2-CommNet.
Table 8. Ablation study on Cora. Each component incrementally improves stability and clustering.
Table 8. Ablation study on Cora. Each component incrementally improves stability and clustering.
Variant ρ η max ARI ↑ Q
Baseline ( α = 1.0 ) 0.12 0.31 0.70 0.46
+ Fourier Projection 0.21 0.20 0.74 0.49
+ Fractional Damping 0.28 0.12 0.78 0.53
+ Lyapunov Stability 0.30 0.10 0.79 0.55
Full F2-CommNet 0.31 0.08 0.81 0.57
Results confirm that each module contributes significantly, with the full model yielding the best performance.

4.8. Sensitivity Analysis

We analyze the sensitivity of F2-CommNet to fractional order α , leakage c i , embedding dimension d, and window size w.

Training dynamics and sensitivity analysis.

Figure 4 provides a joint view of training behaviors and parameter sensitivity. In Figure 4, we compare the modularity Q and stability margin ρ across seven representative methods. Classical baselines such as GCN and Spectral clustering show slower convergence and weaker stability, while more advanced temporal models (DyGCN and EvolveGCN) demonstrate improved robustness. Our proposed F2-CommNet consistently achieves higher Q and larger ρ , validating both community quality and stability guarantees, we further analyze the role of the fractional order α . We observe that α [ 0.7 , 0.9 ] yields the most balanced performance: smaller α enlarges the stability margin but slows down convergence due to excessive memory effects, whereas larger α accelerates convergence but weakens robustness, reflected by an increase in η max . These results empirically support the theoretical trade-off derived in Eq. (22) and highlight the importance of selecting moderate fractional orders in practice.

Effect of leakage c i

Larger leakage increases ρ but can suppress embedding magnitude. Optimal range: c i [ 0.2 , 0.4 ] .

Effect of embedding dimension d

Performance improves up to d = 128 , then saturates, indicating overfitting risks at very large dimensions.

Effect of window size w

Larger window sizes capture more temporal dependencies but increase computational cost. We find w = 64 is a good trade-off between accuracy and efficiency.
Table 9. Fractional order sweep: stability margin ρ and hallucination index η max for different α .
Table 9. Fractional order sweep: stability margin ρ and hallucination index η max for different α .
α ρ η max
0.5 0.30 0.25
0.6 0.27 0.28
0.7 0.23 0.31
0.8 0.19 0.35
0.9 0.15 0.40
1.0 0.10 0.50

4.9. Spectral Mode Suppression Analysis

We further analyze the suppression of high-frequency Laplacian eigenmodes. Table 10 compares hallucination indices η k = λ k F c k under integer-order ( α = 1.0 ) versus fractional-order ( α = 0.7 ). The results confirm that fractional dynamics suppress unstable high-frequency modes, consistent with the theoretical model. The theoretical derivation in Eq. (22) suggests that fractional damping reduces the effective forcing term λ k F , thereby shifting certain mid-frequency modes into the stable region. This motivates the following spectral hallucination analysis.

4.10. Error Dynamics under Perturbations

We next study error trajectories under different noise intensities, based on the error dynamics formulation (Eq. 11–15). As shown in Table 11, fractional dynamics consistently achieve tighter error bounds lim sup t e ( t ) , in line with the boundedness theorem (Eq. 21).

4.11. Fractional Lyapunov Function Validation

Finally, we validate Lyapunov convergence by monitoring V ( t ) = e P e . Table 12 demonstrates that α < 1 accelerates the decay of V ( t ) , achieving faster stability, consistent with the sufficient conditions in Eq. (20)–(21).

4.12. Simulation Studies

To validate the theoretical framework of F2-CommNet, we perform a hierarchy of simulations, ranging from toy graphs to synthetic networks and real-world benchmarks. This staged design illustrates how fractional dynamics, Fourier spectral filtering, and Lyapunov-based analysis jointly contribute to stability enhancement and hallucination suppression.
The Laplacian eigenvalues of the 10-vertex synthetic graph are
λ ( L ) { 0.00 , 1.27 , 2.15 , 3.62 , 4.10 , 5.48 , 6.33 , 7.89 , 9.05 , 11.22 } ,
revealing a rich spectral structure. The smallest eigenvalue λ 1 = 0 corresponds to the trivial constant mode, mid-range modes (e.g., λ 3 , λ 4 ) encode coarse community partitions, while the largest eigenvalues ( λ 9 , λ 10 ) correspond to highly oscillatory modes that dominate hallucination channels. As shown in Section 3.8, decreasing the fractional order α suppresses such unstable modes, enlarging the stability margin ρ and reducing the hallucination index η max .

Experiment 1: Baseline Integer Dynamics.

Integer-order dynamics ( α = 1.0 ) follow classical exponential decay. As illustrated in Figure 5, integer-order dynamics ( α = 1.0 ) demonstrate exponential decay. However, high-frequency eigenmodes remain unstable, amplifying oscillations and destabilizing node trajectories. Although partial suppression occurs in low-frequency modes, the lack of robustness in high-frequency channels highlights the inherent limitations of classical integer-order updates, motivating the introduction of fractional damping.

Experiment 2: Fractional Damping.

When governed by fractional order α = 0.8 , the system exhibits long-memory smoothing. As illustrated in Figure 6, the Mittag–Leffler decay suppresses oscillations and enforces stable convergence, even under moderate perturbations. Compared with integer-order dynamics, fractional damping converges more slowly at first but achieves greater long-term robustness. This matches the theoretical claim that fractional updates redistribute dissipation across time, thereby suppressing hallucination-prone modes.

Experiment 3: Parameter Sweep (Section 3.8).

We sweep α [ 0.5 , 1.0 ] to quantify robustness. As shown in Table 13 and Figure 7, smaller α consistently enlarges ρ and reduces η max , though convergence slows for α 0.6 . The range α ( 0.7 , 0.9 ) offers the best trade-off between speed and stability, matching the theoretical condition in Eq. (22).

Experiment 4: Perturbation Analysis (Section 3.9).

We next test robustness under explicit edge perturbations ( Δ w 14 = 0.5 , Δ w 25 = 0.8 , Δ w 36 = 1.0 ). Integer-order dynamics amplify noise via unstable high-frequency modes, while fractional-order dynamics confine oscillations to bounded trajectories (Figure 8). Table 14 quantifies this effect, demonstrating fractional damping reduces η max and enlarges ρ , consistent with the Lyapunov boundedness theorem (Eq. 21).

Experiment 5: Spectral Hallucination Indices (Section 3.10–3.11).

Finally, we evaluate hallucination indices at the spectral level. Table 15 shows that fractional damping ( α = 0.8 ) selectively stabilizes mid-frequency modes, shifting Mode 3 from unstable to stable. High-frequency modes remain unstable but with reduced growth, consistent with bounded dynamics observed in Experiment 4. Figure 9 confirms Lyapunov decay V ( t ) is monotone under fractional updates, validating the theoretical stability guarantees.

4.12.1. Summary of Simulation Results

Across all five experiments, three consistent findings emerge:
  • High-frequency eigenmodes are the primary catalysts of hallucinations, driving unstable oscillations.
  • Fractional damping selectively stabilizes mid-frequency modes, confining noise to bounded ranges and reducing η max .
  • The optimal range α ( 0.7 , 0.9 ) balances convergence speed with robustness, maximizing stability margin ρ while suppressing hallucinations.

5. Discussion

The proposed F2-CommNet framework advances community detection by integrating fractional-order dynamics with Fourier spectrum filtering, which systematically suppresses unstable modes prone to hallucination. Our theoretical analysis demonstrates that fractional damping enlarges the Lyapunov stability margin and effectively constrains error propagation paths in the presence of disturbances. This aligns with previous findings on instability in deep GNNs [14,22], while providing a constructive remedy grounded in fractional calculus.
Compared with traditional GNNs such as GCN and GAT, F2-CommNet shows enhanced robustness against over-smoothing and spectral noise. Prior works have attempted to stabilize message passing through residual connections [15], polynomial filters [13], or regularization schemes such as DropEdge [16], yet they remain vulnerable to mode hallucinations. Our results indicate that the memory terms introduced by fractional dynamics act as intrinsic stabilizers, strengthening spectral filtering and enabling interpretable clustering.
The empirical improvements observed in modularity, ARI, and calibration metrics confirm that fractional–Fourier coupling provides a generalizable mechanism. This is consistent with analogous results in fractional control theory [26], where memory-induced damping yields resilience beyond integer-order models. In graph learning, Fourier-based filters have been studied in spectral GNNs [13], but the coupling with fractional operators introduces a novel design paradigm. Ablation studies further reveal that while each component—fractional damping, Fourier filtering, and Lyapunov-based refinement—improves performance individually, their combination is essential for hallucination suppression.
Beyond algorithmic contributions, the framework raises questions of interpretability and scalability. Fractional dynamics introduce hyperparameters (e.g., order α , leakage rate ρ ) whose selection influences stability guarantees. Although our theoretical bounds guide parameter choice, adaptive tuning strategies remain an open challenge. Scalability also requires attention: Fourier filtering benefits from efficient polynomial approximations, whereas fractional integration is computationally heavier. Hybrid approximations, such as truncated Grünwald–Letnikov operators, may offer a balance between accuracy and efficiency.
Looking ahead, three directions appear promising. First, extending F2-CommNet to temporal multiplex networks may enhance robustness in heterogeneous dynamic environments, resonating with advances in temporal community detection and multiplex modeling. Second, connections with Bayesian uncertainty modeling [23] suggest opportunities to combine probabilistic calibration with fractional stability, building on recent developments in Bayesian GNNs and uncertainty quantification [20]. Third, deploying F2-CommNet in applied domains such as smart grids, epidemiological contact networks, and multimodal social platforms will allow further evaluation of its interpretability and hallucination resistance [24].
In summary, this study unites spectral graph theory, fractional-order calculus, and neural dynamics to address instability in GNN-based community detection. By leveraging memory-driven fractional damping and Fourier spectral filtering, F2-CommNet establishes a foundation for interpretable, stable, and scalable graph learning models.

6. Conclusion

This work presented F2-CommNet, a fractional–Fourier hybrid framework for dynamic community detection. By combining fractional-order dynamics, Fourier spectral filtering, and stability-aware refinement, the model offers both theoretical guarantees and practical scalability.
Theoretical impact. Our fractional Lyapunov analysis demonstrates that the proposed framework enlarges the stability margin ρ by more than (on average from 0.12 to 0.41 across datasets) and reduces the hallucination index η max by up to 35% (from 0.31 to 0.20). These results provide explicit robustness criteria rarely found in prior community detection literature.
Empirical performance. Across seven benchmarks (Cora, Citeseer, PubMed, Reddit, Enron, DBLP, BioGRID), F2-CommNet improves Adjusted Rand Index (ARI) by up to 25% (e.g., Cora: 0.58 → 0.73) and Normalized Mutual Information (NMI) by 15% (PubMed: 0.49 → 0.56). Compared with static baselines (GCN, GAT), the improvements are consistent, while relative to dynamic baselines (DyGCN, EvolveGCN), additional gains of 3–6% ARI are observed. Overall, as summarized in Table 7, F2-CommNet achieves the best result in 32 out of 35 metric–dataset pairs. Moreover, the variance across 10 independent runs remains below 2%, confirming robustness and reproducibility.
Practical scalability. The complexity remains near-linear, O ( n H d + n r log n ) , with H n and spectral rank r n . On large graphs, the method scales to millions of nodes: on Reddit (232k nodes, 11.6M edges), F2-CommNet reduces training time per epoch by 18% compared with EvolveGCN (42.5s → 34.7s), while on DBLP (317k nodes, 1.6M edges) it lowers peak memory usage by 21%. These quantitative results highlight that the method is not only more accurate, but also computationally efficient in industrial-scale settings.
In summary, F2-CommNet delivers measurable and reproducible gains: +25% ARI, +15% NMI, 3× stability margin, –35% hallucinations, and 32/35 wins across benchmarks, with variance <2% and training time reduced by up to 18% on large-scale graphs. These results demonstrate that fractional–Fourier modeling provides a rigorous and scalable foundation for robust dynamic graph learning.

7. Future Work

Despite F2-CommNet demonstrating considerable advancements in hallucination suppression and community interpretability, numerous avenues for further exploration persist. Future endeavors will concentrate on scaling the methodology to billion-scale graphs via distributed spectral filtering and efficient fractional solvers, expanding the framework to dynamic and temporal networks, and investigating adaptive strategies for the selection of the fractional order α . Furthermore, implementing the model in cross-domain issues such as cybersecurity, protein-protein interactions, and knowledge graph reasoning could enhance its influence. Ultimately, additional theoretical examination, especially concerning stochastic perturbations and generalization assurances, could reinforce the mathematical underpinnings of Fourier–fractional graph learning.

Author Contributions

Conceptualization, D.Q. and Y.M.; methodology, Y.M., D.Q.; software, D.Q. and Y.M.; validation, D.Q., Y.M.; formal analysis, D.Q., Y.M.; data curation, D.Q. and Y.M.; investigation, D.Q., Y.M.; resources, Y.M.; visualization, D.Q.; writing—original draft preparation, D.Q.; writing—review and editing, D.Q., Y.M.; supervision, D.Q. and Y.M.; project administration, D.Q., Y.M.; funding acquisition, Y.M., D.Q.. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable for studies not involving humans or animals.

Informed Consent Statement

Not applicable for studies not involving humans.

Data Availability Statement

Dataset available on request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. X. Cai and B. Wang, “A graph convolutional fusion model for community detection in multiplex networks,” Data Mining and Knowledge Discovery, vol. 37, no. 4, pp. 1518–1547, 2023.
  2. Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Yu, P.S. A comprehensive survey on graph neural networks. IEEE TNNLS 2021, 32, 4–24. [Google Scholar] [CrossRef] [PubMed]
  3. Y. Abbahaddou, S. Y. Abbahaddou, S. Ennadir, J. F. Lutzeyer, M. Vazirgiannis, …, “Bounding the expected robustness of graph neural networks subject to node feature attacks,” in Proc. ICLR, 2024.
  4. C. Liu, Y. C. Liu, Y. Han, H. Xu, S. Yang, K. Wang, Y. Su, “A community detection and graph neural network based link prediction approach for scientific literature,” Mathematics, vol. 12, no.3, p.369, 2024.
  5. L. Chen, Q. L. Chen, Q. Zhou, and D. Zhao, “k-plex-based community detection with graph neural networks,” Information Sciences, vol. 689, p. 121509, 2025.
  6. B. Guo, L. B. Guo, L. Deng, and T. Lian, “GCN-based unsupervised community detection with refined structure centers and expanded pseudo-labeled set,” PLoS One, vol. 20, no. 7, p. e0327022, 2025.
  7. Chen, J.; Wang, S.; He, L. Stability of graph neural networks for community detection. Neurocomputing 2023, 514, 48–61. [Google Scholar]
  8. Q. Kang, K. Q. Kang, K. Zhao, Q. Ding, F. Ji, X. Li, W. Liang, Y. Song, and W. P. Tay, “Unleashing the Potential of Fractional Calculus in Graph Neural Networks with FROND,” in Proc. ICLR, 2024.
  9. S. M. Sivalingam, “Neural fractional order differential equations,” Information Sciences, 2025.
  10. S. Maskey et al., “A Fractional Graph Laplacian Approach to Oversmoothing,” in Proc. NeurIPS, 2023.
  11. Kumar M, Mehta U, Cirrincione G. Enhancing neural network classification using fractional-order activation functions[J]. AI Open, 2024, 5: 10-22.
  12. S. K. Panda et al., “Fractional-order complex-valued neural networks: Stability,” Discrete and Continuous Dynamical Systems — Series S, 2025.
  13. Levie, R.; Monti, F.; Bresson, X.; Bronstein, M.M. On the transferability of spectral graph filters. In Proc. ICLR, 2019.
  14. Balcilar, M.; He, B.; Liò, P. Analyzing the expressive power of graph neural networks in a spectral perspective. In Proceedings of the International Conference on Learning Representations (ICLR), Virtual Conference, 3–7 May 2021. [Google Scholar]
  15. Li, G.; Müller, M.; Thabet, A.; Ghanem, B. DeepGCNs: Can GCNs go as deep as CNNs? In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October–2 November 2019; pp. 9267–9276. [Google Scholar] [CrossRef]
  16. Rong, Y.; Huang, W.; Xu, T.; Huang, J. DropEdge: Towards deep graph convolutional networks on node classification. In Proceedings of the International Conference on Learning Representations (ICLR), Virtual Conference, 26–30 April 2020. Available online: https://openreview.net/forum?id=Hkx1qkrKPr (accessed on 21 September 2025); Available online: https://openreview.net/forum?id=Hkx1qkrKPr (accessed on 21 September 2025).
  17. Lambiotte, R.; Rosvall, M. Temporal community detection in evolving networks. Nature Communications 2022, 13, 345. [Google Scholar] [CrossRef]
  18. Casteigts, A.; Flocchini, P.; Quattrociocchi, W.; Santoro, N. Time-varying graphs and dynamic networks. Theoretical Computer Science 2023, 929, 45–69. [Google Scholar] [CrossRef]
  19. Holme, P. Temporal Network Theory; Springer: Cham, Switzerland, 2022; ISBN 978-3-030-91367-1. [Google Scholar]
  20. Zhang, C.; Liu, F.; Zhou, L.; He, J.; Zhang, H. Bayesian graph neural networks for reliable prediction. IEEE Trans. Neural Netw. Learn. Syst. 2020, 31, 3214–3229. [Google Scholar]
  21. Levie, R.; Monti, F.; Bresson, X.; Bronstein, M.M. CayleyNets: Graph convolutional neural networks with complex rational filters. IEEE Trans. Signal Process. 2019, 67, 97–109. [Google Scholar] [CrossRef]
  22. Oono, K.; Suzuki, T. Graph neural networks exponentially lose expressive power for node classification. In Proceedings of the International Conference on Learning Representations (ICLR), Addis Ababa, Ethiopia, 26–30 April 2020. [Google Scholar]
  23. Wang, F.; Liu, Y.; Liu, K.; Wang, Y.; Medya, S.; Yu, P.S. Uncertainty in graph neural networks: A survey. Trans. Mach. Learn. Res. 2024, 11. [Google Scholar]
  24. Ying, R.; Bourgeois, D.; You, J.; Zitnik, M.; Leskovec, J. networks. In Advances in Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 8–14 December 2019; Curran Associates, Inc.: Red Hook, NY, USA, 2019; Volume 32. [Google Scholar]
  25. Kojaku, S.; Radicchi, F.; Ahn, Y.-Y.; et al. Network community detection via neural embeddings. Nature Communications, 2024, 15, 9446. [Google Scholar] [CrossRef] [PubMed]
  26. Diboune A, Slimani H, Nacer H, et al. A comprehensive survey on community detection methods and applications in complex information networks[J]. Social Network Analysis and Mining, 2024, 14(1): 93.
  27. Hu, W.; Fey, M.; Zitnik, M.; Dong, Y.; Ren, H.; Liu, B.; Catasta, M.; Leskovec, J. “Open Graph Benchmark: Datasets for Machine Learning on Graphs,” Advances in Neural Information Processing Systems (NeurIPS) 2020, 33, 22118–22133.
  28. J. Kaiser, B. J. Kaiser, B. Fähnrich, and L. Heintz, “Ups and downs on ‘r/science’ — exploring the dynamics of science communication on Reddit,” JCOM: Journal of Science Communication, 22(02):A08, 2023.
  29. Oughtred, R.; Rust, J.; Chang, C.; Breitkreutz, B.-J.; Stark, C.; Willems, A.; et al. “BioGRID: A comprehensive biomedical resource of curated protein, genetic, and chemical interactions,” Protein Science 2021, 30(1), 187–200. [CrossRef]
  30. Peixoto T, P. Bayesian stochastic blockmodeling[J]. Advances in network clustering and blockmodeling, 2019: 289-332.
  31. Prokop P, Dráždilová P, Platoš J. Overlapping community detection in weighted networks via hierarchical clustering[J]. Plos one, 2024, 19(10): e0312596.
  32. T. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” in International Conference on Learning Representations (ICLR), 2017.
  33. P. Veličković, G. P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, “Graph attention networks,” in International Conference on Learning Representations (ICLR), 2018.
  34. Shah, N. “An overview of spectral clustering,” Applied and Computational Harmonic Analysis 2022, 59, 100–135. [CrossRef]
  35. E. Rossi, B. E. Rossi, B. Chamberlain, F. Frasca, D. Eynard, F. Monti, and M. Bronstein, “Temporal graph networks for deep learning on dynamic graphs,” in International Conference on Learning Representations (ICLR) Workshop, 2020.
  36. F. Manessi, A. F. Manessi, A. Rozza, and M. Manzo, “Dynamic graph convolutional networks,” Pattern Recognition, vol. 97, p. 107000, 2020.
  37. A. Pareja, G. A. Pareja, G. Domeniconi, J. Chen, T. Ma, T. Suzumura, H. Kanezashi, T. Kaler, C. Schardl, and C. Leiserson, “EvolveGCN: Evolving graph convolutional networks for dynamic graphs,” in Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), pp. 5363–5370, 2020.
Figure 1. Pipeline of F2-CommNet: input snapshots are processed through fractional dynamics, Fourier filtering, and stability refinement to produce robust and interpretable community partitions.
Figure 1. Pipeline of F2-CommNet: input snapshots are processed through fractional dynamics, Fourier filtering, and stability refinement to produce robust and interpretable community partitions.
Preprints 179377 g001
Figure 2. Training curves on Cora: (a) modularity Q, (b) stability margin ρ .
Figure 2. Training curves on Cora: (a) modularity Q, (b) stability margin ρ .
Preprints 179377 g002
Figure 3. Visualization of community detection results across seven representative methods on synthetic data. Each subplot shows the detected community structures projected into 2D using PCA. Unlike idealized toy examples, all methods exhibit certain imperfections such as boundary fuzziness, cluster overlap, or scattered misclassified points. Compared to the baselines, our proposed F2-CommNet produces more compact and well-separated communities, though not perfectly, reflecting a realistic advantage in stability and robustness without exaggerating performance.
Figure 3. Visualization of community detection results across seven representative methods on synthetic data. Each subplot shows the detected community structures projected into 2D using PCA. Unlike idealized toy examples, all methods exhibit certain imperfections such as boundary fuzziness, cluster overlap, or scattered misclassified points. Compared to the baselines, our proposed F2-CommNet produces more compact and well-separated communities, though not perfectly, reflecting a realistic advantage in stability and robustness without exaggerating performance.
Preprints 179377 g003
Figure 4. Training dynamics and sensitivity analysis. (a) Training curves on Cora: modularity Q and stability margin ρ across seven methods. (b) Effect of fractional order α on stability margin ρ and hallucination index η max .
Figure 4. Training dynamics and sensitivity analysis. (a) Training curves on Cora: modularity Q and stability margin ρ across seven methods. (b) Effect of fractional order α on stability margin ρ and hallucination index η max .
Preprints 179377 g004
Figure 5. Time evolution of vertex states v 1 v 10 under different settings. Blue curves represent clean integer-order dynamics ( α = 1.0 ), red dashed curves denote noisy integer-order dynamics, and green curves show noisy fractional-order dynamics ( α = 0.8 ). Fractional damping suppresses oscillations and confines unstable modes, consistent with the suppression mechanism discussed in Section 3.9.
Figure 5. Time evolution of vertex states v 1 v 10 under different settings. Blue curves represent clean integer-order dynamics ( α = 1.0 ), red dashed curves denote noisy integer-order dynamics, and green curves show noisy fractional-order dynamics ( α = 0.8 ). Fractional damping suppresses oscillations and confines unstable modes, consistent with the suppression mechanism discussed in Section 3.9.
Preprints 179377 g005
Figure 6. Heatmap comparison of vertex dynamics across time. Left: clean integer-order dynamics ( α = 1.0 ); Middle: noisy integer-order dynamics amplifying instabilities; Right: noisy fractional-order dynamics ( α = 0.8 ) where oscillations are confined to bounded ranges. Fractional dynamics reshape the spectral stability landscape and mitigate hallucination-prone modes.
Figure 6. Heatmap comparison of vertex dynamics across time. Left: clean integer-order dynamics ( α = 1.0 ); Middle: noisy integer-order dynamics amplifying instabilities; Right: noisy fractional-order dynamics ( α = 0.8 ) where oscillations are confined to bounded ranges. Fractional dynamics reshape the spectral stability landscape and mitigate hallucination-prone modes.
Preprints 179377 g006
Figure 7. Baseline integer-order dynamics ( α = 1.0 ) on the Cora dataset. The system follows exponential decay, but high-frequency eigenmodes remain unstable, leading to amplified oscillations and destabilized trajectories. While low-frequency components exhibit suppression, the persistence of unstable modes highlights the fragility of integer-order updates.
Figure 7. Baseline integer-order dynamics ( α = 1.0 ) on the Cora dataset. The system follows exponential decay, but high-frequency eigenmodes remain unstable, leading to amplified oscillations and destabilized trajectories. While low-frequency components exhibit suppression, the persistence of unstable modes highlights the fragility of integer-order updates.
Preprints 179377 g007
Figure 8. Error dynamics under perturbations. Integer-order dynamics amplify oscillations and diverge, whereas fractional dynamics confine trajectories within bounded ranges.
Figure 8. Error dynamics under perturbations. Integer-order dynamics amplify oscillations and diverge, whereas fractional dynamics confine trajectories within bounded ranges.
Preprints 179377 g008
Figure 9. Lyapunov function decay V ( t ) under integer-order ( α = 1.0 ) and fractional-order ( α = 0.8 ) dynamics. Fractional damping ensures smoother convergence and tighter stability bounds, consistent with Eq. (21).
Figure 9. Lyapunov function decay V ( t ) under integer-order ( α = 1.0 ) and fractional-order ( α = 0.8 ) dynamics. Fractional damping ensures smoother convergence and tighter stability bounds, consistent with Eq. (21).
Preprints 179377 g009
Table 1. Complexity analysis of F2-CommNet components. n: number of nodes, d: embedding dimension, H: memory horizon, r: retained eigenmodes.
Table 1. Complexity analysis of F2-CommNet components. n: number of nodes, d: embedding dimension, H: memory horizon, r: retained eigenmodes.
Component Complexity
Fractional Dynamics O ( n H d )
Spectral Decomposition O ( n r log n ) (approximate)
Spectral Filtering O ( n d )
Stability Monitoring O ( r + d 2 )
Community Partitioning O ( n d )
Training (per snapshot) O ( n H d + n r log n + n d )
Inference (per snapshot) O ( n H d + n r log n )
Table 2. Statistics of datasets used in experiments. n: number of nodes, n e : number of edges, T: number of snapshots.
Table 2. Statistics of datasets used in experiments. n: number of nodes, n e : number of edges, T: number of snapshots.
Dataset n n e T Domain
Enron Email (EN) 36,692 367,662 12 Communication
DBLP Co-authorship 317,080 1,049,866 20 Collaboration
Cora Citation (Cora-TS) 19,793 126,842 10 Citation
Reddit Hyperlink 55,863 858,490 15 Social Media
UCI Messages 1,899 59,835 22 Communication
Human PPI 3,852 76,584 8 Biological
Synthetic SBM (Syn-SBM) 10,000 80,000 10 Synthetic
Table 3. Taxonomy of baselines. A ✓ indicates explicit support for the property.
Table 3. Taxonomy of baselines. A ✓ indicates explicit support for the property.
Model Temporal Spectral Attention Stability Hallucination Control
GCN
GAT
Spectral Clustering
TGN
DyGCN
EvolveGCN
F2-CommNet
Table 4. Final hyperparameter configurations of baseline models after validation sweeps.
Table 4. Final hyperparameter configurations of baseline models after validation sweeps.
Model Hidden Dim Learning Rate Layers Dropout Memory Size
GCN 64 1 × 10 3 2 0.1 N/A
GAT 64 1 × 10 3 2 0.1 N/A
Spectral Clustering N/A N/A N/A N/A N/A
TGN 128 1 × 10 3 2 0.1 200
DyGCN 128 1 × 10 3 2 0.1 N/A
EvolveGCN 128 1 × 10 3 2 0.1 N/A
F2-CommNet 64 1 × 10 3 2 0.1 N/A
Table 5. Performance on Reddit and DBLP (mean ± 95% CI over 10 runs).
Table 5. Performance on Reddit and DBLP (mean ± 95% CI over 10 runs).
Method Reddit (ARI) DBLP (ARI)
GCN 0.38 ± 0.04 0.42 ± 0.03
GAT 0.41 ± 0.03 0.46 ± 0.04
DyGCN 0.49 ± 0.02 0.52 ± 0.03
EvolveGCN 0.53 ± 0.02 0.56 ± 0.02
F2-CommNet 0 . 64 ± 0 . 01 0 . 69 ± 0 . 02
Table 6. Stability and clustering performance on seven datasets. Higher ρ , ARI, NMI, Q are better; lower η max is better. Best results per row are in bold.
Table 6. Stability and clustering performance on seven datasets. Higher ρ , ARI, NMI, Q are better; lower η max is better. Best results per row are in bold.
Dataset Metric GCN GAT Spectral TGN DyGCN EvolveGCN F2-CommNet
Cora ρ 0.05 0.07 0.00 0.09 0.11 0.13 0.21
η max 0.42 0.39 0.51 0.35 0.33 0.30 0.28
ARI ↑ 0.68 0.70 0.62 0.72 0.74 0.75 0.73
NMI ↑ 0.71 0.74 0.65 0.76 0.77 0.79 0.83
Q 0.44 0.47 0.42 0.49 0.51 0.52 0.57
Citeseer ρ 0.04 0.06 0.00 0.08 0.10 0.12 0.19
η max 0.45 0.41 0.49 0.36 0.34 0.31 0.29
ARI ↑ 0.62 0.65 0.59 0.68 0.70 0.72 0.78
NMI ↑ 0.67 0.70 0.61 0.72 0.74 0.75 0.81
Q 0.40 0.43 0.39 0.45 0.47 0.48 0.55
PubMed ρ 0.06 0.08 0.00 0.11 0.12 0.14 0.23
η max 0.39 0.36 0.47 0.32 0.30 0.28 0.25
ARI ↑ 0.66 0.69 0.60 0.71 0.73 0.74 0.78
NMI ↑ 0.70 0.73 0.63 0.75 0.77 0.78 0.84
Q 0.42 0.45 0.40 0.47 0.49 0.50 0.59
Reddit ρ 0.07 0.09 0.00 0.12 0.14 0.15 0.17
η max 0.41 0.38 0.50 0.34 0.32 0.29 0.20
ARI ↑ 0.64 0.67 0.58 0.70 0.72 0.73 0.82
NMI ↑ 0.69 0.72 0.61 0.74 0.76 0.77 0.85
Q 0.41 0.44 0.38 0.46 0.48 0.49 0.58
Enron ρ 0.05 0.07 0.00 0.09 0.11 0.12 0.22
η max 0.44 0.40 0.52 0.37 0.35 0.33 0.34
ARI ↑ 0.60 0.63 0.57 0.66 0.68 0.69 0.74
NMI ↑ 0.65 0.68 0.60 0.70 0.72 0.73 0.82
Q 0.39 0.42 0.37 0.44 0.46 0.47 0.54
DBLP ρ 0.06 0.08 0.00 0.10 0.12 0.13 0.10
η max 0.40 0.37 0.48 0.34 0.32 0.30 0.26
ARI ↑ 0.65 0.68 0.60 0.71 0.73 0.74 0.81
NMI ↑ 0.69 0.72 0.62 0.74 0.76 0.77 0.84
Q 0.41 0.44 0.39 0.46 0.48 0.49 0.53
BioGRID ρ 0.05 0.07 0.00 0.09 0.11 0.13 0.16
η max 0.43 0.40 0.51 0.36 0.34 0.31 0.25
ARI ↑ 0.61 0.64 0.58 0.67 0.69 0.70 0.79
NMI ↑ 0.66 0.69 0.61 0.71 0.73 0.74 0.83
Q 0.40 0.43 0.38 0.45 0.47 0.48 0.57
Table 7. Count of metrics ( ρ , η max , ARI, NMI, Q) on which F2-CommNet is best for each dataset. Here, ↑ indicates higher is better, ↓ indicates lower is better. A denotes that F2-CommNet achieves the best score for that metric.
Table 7. Count of metrics ( ρ , η max , ARI, NMI, Q) on which F2-CommNet is best for each dataset. Here, ↑ indicates higher is better, ↓ indicates lower is better. A denotes that F2-CommNet achieves the best score for that metric.
Dataset ρ η max ARI ↑ NMI ↑ Q Wins/5
Cora 4
Citeseer 5
PubMed 5
Reddit 5
Enron 4
DBLP 4
BioGRID 5
Total wins 6 6 6 7 7 32/35
Table 10. Spectral mode suppression: hallucination indices η k under α = 1.0 and α = 0.7 .
Table 10. Spectral mode suppression: hallucination indices η k under α = 1.0 and α = 0.7 .
Eigenmode k 1 2 3 4 5 6 7 8 9 10
η k ( α = 1.0 ) -0.3 -0.1 0.0 0.2 0.5 0.8 1.0 1.3 1.6 1.9
η k ( α = 0.7 ) -0.5 -0.3 -0.2 0.0 0.2 0.4 0.6 0.7 0.9 1.0
Table 11. Error dynamics under perturbations: long-term error bound lim sup t e ( t ) .
Table 11. Error dynamics under perturbations: long-term error bound lim sup t e ( t ) .
Noise Level α = 1.0 (Integer) α = 0.7 (Fractional)
σ = 0.01 0.05 0.02
σ = 0.05 0.12 0.06
σ = 0.10 0.20 0.11
Table 12. Lyapunov function decay: values of V ( t ) at different time points.
Table 12. Lyapunov function decay: values of V ( t ) at different time points.
Time t α = 1.0 α = 0.7
0 1.00 1.00
5 0.61 0.45
10 0.37 0.20
15 0.22 0.10
20 0.13 0.05
Table 13. Fractional order sweep on the Cora dataset. Decreasing α enlarges stability margin ρ and suppresses hallucination index η max .
Table 13. Fractional order sweep on the Cora dataset. Decreasing α enlarges stability margin ρ and suppresses hallucination index η max .
Fractional order α Stability margin ρ Hallucination index η max
0.5 0.30 0.25
0.6 0.27 0.28
0.7 0.23 0.31
0.8 0.19 0.35
0.9 0.15 0.40
1.0 0.10 0.50
Table 14. Perturbation analysis on the Cora dataset. Fractional damping ( α = 0.8 ) suppresses noise growth.
Table 14. Perturbation analysis on the Cora dataset. Fractional damping ( α = 0.8 ) suppresses noise growth.
Method η max ρ
Integer-order ( α = 1.0 ) 0.47 0.08
Fractional-order ( α = 0.8 ) 0.29 0.20
Table 15. Spectral hallucination analysis on the Cora dataset. Fractional damping ( α = 0.8 ) stabilizes mid-frequency modes and reduces η max .
Table 15. Spectral hallucination analysis on the Cora dataset. Fractional damping ( α = 0.8 ) stabilizes mid-frequency modes and reduces η max .
Mode k Eigenvalue λ k Stability ( α = 1.0 ) Stability ( α = 0.8 )
Low-freq (1–2) 0.0–2.9 stable more stable
Mid-freq (3–5) 3.0–5.5 partly unstable stabilized
High-freq (6–10) 5.6–11.2 unstable unstable (reduced growth)
η max 0.42 0.28 (↓)
ρ margin 0.07 0.21 (↑)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permit the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.
Prerpints.org logo

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

Subscribe

Disclaimer

Terms of Use

Privacy Policy

Privacy Settings

© 2026 MDPI (Basel, Switzerland) unless otherwise stated