Preprint (Review). This version is not peer-reviewed.

Matignon-Based Stability and Weight Synchronization of a Fractional Time Delay Neural Network Model

Submitted: 21 September 2025. Posted: 07 October 2025.


Abstract
Artificial neural networks (ANNs) are powerful models inspired by the structure and function of the human brain. They are widely used for tasks such as classification, prediction, and pattern recognition. This study examines the stability, dynamic behavior, and synchronization of fractional-order neural networks with time delays.
Subject: Physical Sciences - Other

Introduction

Previous studies have considered a variety of neuron models with different delays, and the analysis of neural networks continues to attract many researchers. Applications span biological processes, automatic control, brain modeling, sensor technology, computer vision, and more [1,2,3]. Since Hopfield's work in 1984, many different classes of neural networks have been presented, and the dynamic properties of different types of neural connections have found a wide range of applications in a variety of fields [4,5,6], including pattern recognition, control procedures, artificial intelligence, image processing, and medical science. In artificial neural networks, time delays often arise from the finite speed of signal transmission between neurons. To better reflect the reality of artificial neural networks [7], delayed neural networks are considered a much better model than traditional neural networks without delay. Delayed neural networks exhibit much richer dynamics, including periodic phenomena, chaotic behavior, and instability [8,9,10]. There is currently great interest in studying how time delays affect the behavior of neural networks, and many neural network models are under construction and study, with some interesting research published. Several researchers [11] have studied the stability of traveling wave solutions in neural network models with distributed delays, and others [12,13] have addressed the problem of neural network synchronization.
As a result, interest in studying delays in neural networks has grown in recent years, and many important results on BAM-type architectures have been obtained. Global stabilization has been studied in quaternion-valued neural networks with time delays via impulsive control [14,15], and modified control schemes have been investigated to handle complex delays in BAM structures. Reliable criteria have been derived to ensure the global exponential stability and existence of periodic solutions of quaternion-valued networks with delays [16,17,18]. Stability analysis of almost periodic solutions has been carried out for discontinuous BAM networks with the D-operator, and anti-periodic solutions of neural networks with delays and impulses have also been considered. The stochastic stability of BAM network models with Markovian jumping and delays has likewise been explored [19,20,21].
Although the works mentioned above address some delay-related dynamic problems, they concentrate on integer-order models. In recent years, several researchers have argued that fractional-order differential equations are a better instrument for capturing the real-world behavior of dynamical systems, since they can describe memory and hereditary properties of change over time [22]. Fractional calculus has been widely employed in areas such as biology, electromagnetic waves, electrical engineering, neuroscience, and financial management [23,24,25]. Fractional-order neural networks with delays have recently been the subject of dedicated research. Researchers [26] examined the global stability of impulsive fractional-order delayed neural networks, and Mittag-Leffler synchronization has been investigated for fractional-order octonion-valued neural networks [27]. The effect of leakage delay on the Hopf bifurcation of fractional-order quaternion-valued neural networks has also been studied [28]. Such networks have proved very useful in speech, sensing, robotics, pattern recognition, and visual processing [29,30].
The foundations of fractional calculus are traditionally traced to the correspondence between Leibniz and L'Hôpital in 1695. Many natural phenomena are described more accurately by fractional calculus than by ordinary calculus [31]. In 1832, fractional operators were applied to certain mathematical problems, and in the 1890s Oliver Heaviside published and developed an operational treatment of fractional operators in a series of works. A principal motivation for moving beyond integer-order models was the limitations of classical calculus [32]. A classic example is the fractional-order relation between voltage and current in a semi-infinite lossy transmission line [33,34]. Fractional derivatives can be approximated by a variety of numerical methods, and fractional calculus appears in research spanning robotics, chemical reactions, biology, automation, control theory, chaos theory, and fractal structures. A system whose fractional orders are not rational multiples of one another is called an incommensurate fractional-order system [35,36].
Most previous studies address integer-order models with delays in neural networks, or fractional-order models with arbitrary common orders. This article provides a method for calculating incommensurate fractional-orders from the equilibrium points of the model. Numerical analysis of the neural network model is then carried out within the stability domain determined by these fractional-orders. Incommensurate fractional-orders help the numerical solutions converge quickly and accurately, and applying them to neural network models opens the way for new research directions. For the fractional-orders calculated from the neural network weights, the system remains synchronized and stable; the graphical description of the synchronized states of the network shows stable, convergent signals. This article also investigates the stability of these networks as the time delay evolves. In a fractional system, calculating the incommensurate fractional-orders is not an easy task.

Stability Analysis

Before discussing the stability of the neural network model, it is necessary to calculate the incommensurate fractional-orders $q_i \neq q_j$, where $i \neq j$. Fixing the values of the parameters of the neural network model leads to a fractional-order system of the following form:
$$
\begin{aligned}
D^{q_1} w_1(t) &= -12.2\,w_1 + 0.21\,w_4(t-0.001) + 2.6\,w_5(t-0.001) + 2.6\,w_6(t-0.001),\\
D^{q_2} w_2(t) &= -12.2\,w_2 + 0.21\,w_5(t-0.001) + 2.6\,w_6(t-0.001) + 2.6\,w_4(t-0.001),\\
D^{q_3} w_3(t) &= -12.2\,w_3 + 0.21\,w_6(t-0.001) + 2.6\,w_4(t-0.001) + 2.6\,w_5(t-0.001),\\
D^{q_4} w_4(t) &= -12.2\,w_4 + 0.21\,w_1(t-0.001) + 2.6\,w_2(t-0.001) + 2.6\,w_3(t-0.001),\\
D^{q_5} w_5(t) &= -12.2\,w_5 + 0.21\,w_2(t-0.001) + 2.6\,w_3(t-0.001) + 2.6\,w_1(t-0.001),\\
D^{q_6} w_6(t) &= -12.2\,w_6 + 0.21\,w_3(t-0.001) + 2.6\,w_1(t-0.001) + 2.6\,w_2(t-0.001).
\end{aligned}
$$
(1.1)
Fix the values of the included parameters as $\alpha = 12.2$, $\sigma = 0.001$, $a = 0.21$, and $b = 2.6$. The singular (equilibrium) points of the model can be found using the Jacobian matrix. Based on these equilibrium points, the upper bounds of the incommensurate fractional-orders are calculated as $q_1 = 0.965$, $q_2 = 0.780$, $q_3 = 0.800$, $q_4 = 0.785$, $q_5 = 0.789$, and $q_6 = 1.58$.
The stability region is defined by the Matignon stability criterion [37]:
$$
|\arg(\lambda_i)| > \frac{q_i \pi}{2} \quad \text{for all } i,
$$
(1.2)
where $\lambda_i$ are the eigenvalues and $q_i$ the incommensurate fractional-orders.
The corresponding fractional-orders calculated from inequality (1.2) are $q_1 = 0.965$, $q_2 = 0.780$, $q_3 = 0.800$, $q_4 = 0.785$, $q_5 = 0.789$, and $q_6 = 1.58$. The incommensurate fractional-orders $q_1$ through $q_5$ are physical because they lie in the stable region of the first Riemann sheet, whereas $q_6$ does not lie in the stable region and is therefore non-physical. Working within these bounds makes it possible to improve the reliability and stability of neural networks for a variety of practical purposes.
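The Matignon-type check of inequality (1.2) can be sketched in a few lines. The eigenvalues below are illustrative placeholders, not the characteristic roots of the delayed system (1.1); the point is only the mechanics of the sector test $|\arg(\lambda)| > q\pi/2$.

```python
import numpy as np

def matignon_stable(eigenvalues, q):
    """True when every eigenvalue satisfies |arg(lambda)| > q*pi/2 (criterion (1.2))."""
    return bool(np.all(np.abs(np.angle(eigenvalues)) > q * np.pi / 2))

# Illustrative complex-conjugate pair with arg close to +/- 2.03 rad (assumed values)
lam = np.array([-1.0 + 2.0j, -1.0 - 2.0j])
for q in (0.965, 1.58):
    print(q, matignon_stable(lam, q))   # 0.965 passes the test, 1.58 does not
```

For this placeholder pair, an order of 0.965 lies inside the stable sector while 1.58 does not, mirroring the qualitative conclusion drawn above for $q_1$ and $q_6$.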
Figure 1. Stability Region for Fractional-Order.

Mathematical Formulation

The neural network system is of the form [38]:
$$
\begin{aligned}
\dot{x}_m(t) &= -\mu_m x_m(t) + \sum_{n=1}^{p} c_{nm}\, g_m\big(y_n(t-\tau_{nm})\big) + I_m, \quad m = 1,2,3,\dots,l,\\
\dot{y}_n(t) &= -\nu_n y_n(t) + \sum_{m=1}^{l} d_{mn}\, h_n\big(x_m(t-\nu_{mn})\big) + J_n, \quad n = 1,2,3,\dots,p.
\end{aligned}
$$
(1.3)
where $\mu_m$ and $\nu_n$ represent the stability of the internal neuron activities on the I-layer and J-layer, respectively; $\tau_{nm}$ and $\nu_{mn}$ are time delays; and $c_{nm}$, $d_{mn}$ ($m = 1,2,\dots,l$; $n = 1,2,\dots,p$) denote the connection weights between the neurons of the two layers. The neurons on the I-layer, whose states are denoted by $x_m(t)$, receive the external inputs $I_m$ together with the outputs of the J-layer neurons passed through the activation functions $g_m$, while the neurons on the J-layer, whose states are denoted by $y_n(t)$, receive the external inputs $J_n$ together with the outputs of the I-layer neurons passed through the activation functions $h_n$.
The schematic diagram in Figure 2 illustrates the main mechanism of an artificial neuron, the basic structural unit of a neural network. It starts with input signals $x_1, x_2, x_3, x_4, x_5, x_6$, each associated with a corresponding weight $w_1, w_2, w_3, w_4, w_5, w_6$. An artificial neuron consists of two key parts: a summation part and a function part. Each input $x_n$ (for $n = 1, 2, \dots, 6$) is connected to its respective weight $w_n$, so $w_1$ is attached to $x_1$, $w_2$ to $x_2$, and so on. The summation part computes the weighted sum $x_1 w_1 + x_2 w_2 + \dots + x_n w_n$ and forwards it to the function part, which applies an activation function $g$ to generate a particular output for the given input. The arrows in the diagram indicate the flow of the input data through the summation and activation stages to the final output $y_n$. This process models how artificial neurons mimic the human mind by integrating several signals and making decisions.
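The summation and function parts described above can be sketched directly. The input and weight values are arbitrary example numbers, and tanh is used purely as a placeholder activation; the text does not fix a specific $g$ here.

```python
import numpy as np

def neuron_output(x, w, g=np.tanh):
    """Single artificial neuron: weighted sum x1*w1 + ... + x6*w6, then activation g."""
    s = np.dot(x, w)          # summation part
    return g(s)               # function part

x = np.array([0.5, -0.2, 0.1, 0.8, -0.5, 0.3])   # example inputs x1..x6
w = np.array([0.4, 0.7, -0.1, 0.2, 0.9, -0.3])   # corresponding weights w1..w6
y = neuron_output(x, w)
print(y)
```

With a bounded activation such as tanh, the output stays in $(-1, 1)$ regardless of the weighted sum, which is one common reason for inserting the function part after the summation.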
Figure 2. Schematic Diagram.
This part formulates the model of the artificial neural network. The model for the delayed neural network is given by [39]:
$$
\begin{aligned}
\frac{dw_1}{dt} &= -\alpha w_1(t) + g\big(w_4(t-\sigma)\big) + h\big(w_5(t-\sigma)\big) + h\big(w_6(t-\sigma)\big),\\
\frac{dw_2}{dt} &= -\alpha w_2(t) + g\big(w_5(t-\sigma)\big) + h\big(w_6(t-\sigma)\big) + h\big(w_4(t-\sigma)\big),\\
\frac{dw_3}{dt} &= -\alpha w_3(t) + g\big(w_6(t-\sigma)\big) + h\big(w_4(t-\sigma)\big) + h\big(w_5(t-\sigma)\big),\\
\frac{dw_4}{dt} &= -\alpha w_4(t) + g\big(w_1(t-\sigma)\big) + h\big(w_2(t-\sigma)\big) + h\big(w_3(t-\sigma)\big),\\
\frac{dw_5}{dt} &= -\alpha w_5(t) + g\big(w_2(t-\sigma)\big) + h\big(w_3(t-\sigma)\big) + h\big(w_1(t-\sigma)\big),\\
\frac{dw_6}{dt} &= -\alpha w_6(t) + g\big(w_3(t-\sigma)\big) + h\big(w_1(t-\sigma)\big) + h\big(w_2(t-\sigma)\big),
\end{aligned}
$$
(1.4)
where $\alpha > 0$ is the training parameter, and $g$ and $h$ are the activation functions.
In general, consider $g(u) = a u + a_1 u^3$ and $h(u) = b u + b_1 u^3$, where $a$, $a_1$, $b$, and $b_1$ are all real. Linearizing about the origin, system (1.4) takes the form:
$$
\begin{aligned}
\frac{dw_1}{dt} &= -\alpha w_1(t) + a\,w_4(t-\sigma) + b\,w_5(t-\sigma) + b\,w_6(t-\sigma),\\
\frac{dw_2}{dt} &= -\alpha w_2(t) + a\,w_5(t-\sigma) + b\,w_6(t-\sigma) + b\,w_4(t-\sigma),\\
\frac{dw_3}{dt} &= -\alpha w_3(t) + a\,w_6(t-\sigma) + b\,w_4(t-\sigma) + b\,w_5(t-\sigma),\\
\frac{dw_4}{dt} &= -\alpha w_4(t) + a\,w_1(t-\sigma) + b\,w_2(t-\sigma) + b\,w_3(t-\sigma),\\
\frac{dw_5}{dt} &= -\alpha w_5(t) + a\,w_2(t-\sigma) + b\,w_3(t-\sigma) + b\,w_1(t-\sigma),\\
\frac{dw_6}{dt} &= -\alpha w_6(t) + a\,w_3(t-\sigma) + b\,w_1(t-\sigma) + b\,w_2(t-\sigma).
\end{aligned}
$$
(1.5)
The system (1.5) can be modified for incommensurate fractional-orders as:
$$
\begin{aligned}
\frac{d^{q_1} w_1}{dt^{q_1}} &= -\alpha w_1(t) + a\,w_4(t-\sigma) + b\,w_5(t-\sigma) + b\,w_6(t-\sigma),\\
\frac{d^{q_2} w_2}{dt^{q_2}} &= -\alpha w_2(t) + a\,w_5(t-\sigma) + b\,w_6(t-\sigma) + b\,w_4(t-\sigma),\\
\frac{d^{q_3} w_3}{dt^{q_3}} &= -\alpha w_3(t) + a\,w_6(t-\sigma) + b\,w_4(t-\sigma) + b\,w_5(t-\sigma),\\
\frac{d^{q_4} w_4}{dt^{q_4}} &= -\alpha w_4(t) + a\,w_1(t-\sigma) + b\,w_2(t-\sigma) + b\,w_3(t-\sigma),\\
\frac{d^{q_5} w_5}{dt^{q_5}} &= -\alpha w_5(t) + a\,w_2(t-\sigma) + b\,w_3(t-\sigma) + b\,w_1(t-\sigma),\\
\frac{d^{q_6} w_6}{dt^{q_6}} &= -\alpha w_6(t) + a\,w_3(t-\sigma) + b\,w_1(t-\sigma) + b\,w_2(t-\sigma),
\end{aligned}
$$
(1.6)
where $q_i$ ($i = 1, 2, \dots, 6$) are the incommensurate fractional-orders.

Numerical Solution

For the numerical calculation of fractional-order derivatives, the required relation can be derived from the Grunwald-Letnikov definition of the fractional-order derivative. The explicit numerical approximation of the $q$-th derivative at the points $kh$ ($k = 1, 2, 3, \dots$) has the following form [40]:
$$
{}_{(k - L_m/h)} D_{t_k}^{q} S(t) \approx h^{-q} \sum_{i=0}^{k} (-1)^i \binom{q}{i} S(t_{k-i}),
$$
(1.7)
where $L_m$ is the “memory length”, $t_k = kh$, $h$ is the time step of the calculation, and $(-1)^i \binom{q}{i} = C_i^{(q)}$ ($i = 0, 1, 2, \dots$) are the binomial coefficients. For their calculation we can use the following recurrence [40]:
$$
C_0^{(q)} = 1, \qquad C_i^{(q)} = \left(1 - \frac{1+q}{i}\right) C_{i-1}^{(q)}.
$$
(1.8)
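Recurrence (1.8) is cheap to evaluate and avoids computing factorials directly. The sketch below implements it and cross-checks the result against the closed form $(-1)^i \binom{q}{i}$ using a hand-rolled generalized binomial coefficient (names are illustrative).

```python
import numpy as np

def gl_coefficients(q, n):
    """Coefficients C_0..C_n from recurrence (1.8): C_0 = 1, C_i = (1 - (1+q)/i) C_{i-1}."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for i in range(1, n + 1):
        c[i] = (1.0 - (1.0 + q) / i) * c[i - 1]
    return c

def gen_binom(q, i):
    """Generalized binomial coefficient binom(q, i) for real q."""
    out = 1.0
    for j in range(i):
        out *= (q - j) / (j + 1)
    return out

q = 0.8
c = gl_coefficients(q, 5)
closed = [(-1) ** i * gen_binom(q, i) for i in range(6)]
print(np.allclose(c, closed))   # the recurrence matches the closed form
```

The ratio test on the closed form, $(-1)^i\binom{q}{i} \big/ (-1)^{i-1}\binom{q}{i-1} = (i-1-q)/i = 1 - (1+q)/i$, confirms the two expressions agree term by term.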
Consider the general fractional differential equation
$$
{}_{a}D_t^{q}\, y(t) = S\big(y(t), t\big).
$$
(1.9)
The numerical solution can be expressed as:
$$
y(t_k) = S\big(y(t_k), t_k\big)\, h^{q} - \sum_{i=v}^{k} C_i^{(q)}\, y(t_{k-i}).
$$
(1.10)
For the memory term expressed by the sum, the short-memory principle can be used: the lower summation index is $v = 1$ when the full history is kept, and it increases when the memory is truncated to the most recent $L_m/h$ terms.
In addition, a convenient numerical solution of the fractional-order system (1.1), based on the Grunwald-Letnikov method, takes the following form:
$$
W_1(t_k) = \big[-12.2\,W_1(t_{k-1}) + 0.21\,W_4(t_{k-1}-0.001) + 2.6\,W_5(t_{k-1}-0.001) + 2.6\,W_6(t_{k-1}-0.001)\big]\, h^{q_1} - \sum_{i=v}^{k} C_i^{(q_1)} W_1(t_{k-i})
$$
(1.11)
$$
W_2(t_k) = \big[-12.2\,W_2(t_{k-1}) + 0.21\,W_5(t_{k-1}-0.001) + 2.6\,W_6(t_{k-1}-0.001) + 2.6\,W_4(t_{k-1}-0.001)\big]\, h^{q_2} - \sum_{i=v}^{k} C_i^{(q_2)} W_2(t_{k-i})
$$
(1.12)
$$
W_3(t_k) = \big[-12.2\,W_3(t_{k-1}) + 0.21\,W_6(t_{k-1}-0.001) + 2.6\,W_4(t_{k-1}-0.001) + 2.6\,W_5(t_{k-1}-0.001)\big]\, h^{q_3} - \sum_{i=v}^{k} C_i^{(q_3)} W_3(t_{k-i})
$$
(1.13)
$$
W_4(t_k) = \big[-12.2\,W_4(t_{k-1}) + 0.21\,W_1(t_{k-1}-0.001) + 2.6\,W_2(t_{k-1}-0.001) + 2.6\,W_3(t_{k-1}-0.001)\big]\, h^{q_4} - \sum_{i=v}^{k} C_i^{(q_4)} W_4(t_{k-i})
$$
(1.14)
$$
W_5(t_k) = \big[-12.2\,W_5(t_{k-1}) + 0.21\,W_2(t_{k-1}-0.001) + 2.6\,W_3(t_{k-1}-0.001) + 2.6\,W_1(t_{k-1}-0.001)\big]\, h^{q_5} - \sum_{i=v}^{k} C_i^{(q_5)} W_5(t_{k-i})
$$
(1.15)
$$
W_6(t_k) = \big[-12.2\,W_6(t_{k-1}) + 0.21\,W_3(t_{k-1}-0.001) + 2.6\,W_1(t_{k-1}-0.001) + 2.6\,W_2(t_{k-1}-0.001)\big]\, h^{q_6} - \sum_{i=v}^{k} C_i^{(q_6)} W_6(t_{k-i})
$$
(1.16)
where $k = 1, 2, \dots, N$, and $\big(W_1(0), W_2(0), W_3(0), W_4(0), W_5(0), W_6(0)\big)$ is the starting point. The binomial coefficients $C_i^{(q_j)}$ are calculated according to Eq. (1.8). All simulations were performed with the time step $h = 0.00001$.
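The scheme above can be sketched end to end. This is a toy run, not a reproduction of the paper's results: for speed it uses a coarser step than the paper's $h = 10^{-5}$, a short horizon, a constant initial history, and $q_6$ reduced below its upper bound so the demo stays in the stable regime (all of these are assumptions for illustration).

```python
import numpy as np

alpha, a, b, sigma = 12.2, 0.21, 2.6, 0.001
# q6 is set to 0.96 here (the paper's upper bound is 1.58) so the demo stays stable
q = np.array([0.965, 0.780, 0.800, 0.785, 0.789, 0.960])
h = 1e-4                          # coarser than the paper's 1e-5, for a fast demo
d = int(round(sigma / h))         # delay expressed in time steps
N = 400                           # number of time steps

# Grunwald-Letnikov coefficients per state, recurrence (1.8)
C = np.empty((6, N + 1))
C[:, 0] = 1.0
for i in range(1, N + 1):
    C[:, i] = (1.0 - (1.0 + q) / i) * C[:, i - 1]

# Coupling of (1.1): state m takes the 0.21 term from pattern[m][0]
# and the 2.6 terms from the other two listed states
pattern = [(3, 4, 5), (4, 5, 3), (5, 3, 4), (0, 1, 2), (1, 2, 0), (2, 0, 1)]

W = np.zeros((N + 1, 6))
W[0] = 0.1                        # constant initial history (assumed start point)
for k in range(1, N + 1):
    Wd = W[max(k - 1 - d, 0)]     # delayed state, clamped to the initial history
    for m in range(6):
        ja, jb1, jb2 = pattern[m]
        f = -alpha * W[k - 1, m] + a * Wd[ja] + b * Wd[jb1] + b * Wd[jb2]
        mem = np.dot(C[m, 1:k + 1], W[k - 1::-1, m])  # sum of C_i * W(t_{k-i})
        W[k, m] = f * h ** q[m] - mem
print(W[-1])                      # final neuron states
```

The memory sum here keeps the full history ($v = 1$); applying the short-memory principle would simply truncate `C[m, 1:k+1]` and the matching slice of `W` to the most recent $L_m/h$ terms.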

Discussion

The evolution of the neuron state $w_1(t)$ over approximately 2300 time units is depicted in Figure 3(a). Initially, up to $t = 200$, the state fluctuates between 0 and 2. By $t = 600$ the amplitude increases by about 45 units, indicating heightened neuron activity. The variation stabilizes around 250 units, peaking approximately 40 units after the transition, influenced by the fractional-orders $q_i$ affecting the system's behavior. From $t = 0$ to 2400, $w_6(t)$ oscillates with the amplitude increasing from $\pm 0.2$ to $\pm 1.5$ by $t = 2300$. The signal contains frequencies between 0.02 Hz and 0.12 Hz, indicating propagation and resonance, which highlights the neural network's dynamic adaptability. Figure 3(b) reveals a distinctive repeated modulation pattern and smooth pulsating behavior, suggesting steady interaction frequencies in the fractional-order neural network. The oscillations of $w_5(t)$ in Figure 3(c) vary from $-7$ to 7 over 0 to 2400 time units. At $t = 2300$, periodic amplitude modulation occurs, with peaks rising from about 1 to 7. Interference patterns reveal beat frequencies from multiple sinusoidal components at 0.02 Hz and 0.08 Hz. From $t = 0$ to $t = 2000$ in Figure 3(d), the state variable $w_2(t)$ exhibits oscillations with amplitudes between 0.5 and 3.5. The fluctuations are erratic for $t < 500$, peaking at 1.2 and around 0.01 Hz. The variation stabilizes at about 3 units after $t = 500$, forming an almost sinusoidal envelope with a frequency of about 0.004 Hz, reflecting the transition to the synchronized state. Figure 3(e) shows the state variable $w_3(t)$, which fluctuates between $-4.5$ and 4.5 over 2400 units. The amplitude modulation passes from about 1 to about 4.5 around $t = 2200$, forming an envelope shape. The oscillation frequency varies from 0.02 Hz to 0.1 Hz, revealing the memory effects and nonlinear feedback typical of fractional-order neural networks.

The envelope period of about 400 units indicates synchronization regime changes significant for modeling brain dynamics. In Figure 3(f), the oscillations of $w_4(t)$ range from $-5$ to 5, peaking at nearly 5 by $t = 2300$. The wave shape shows envelope-like modulation with several frequency components exhibiting the irregular interactions common in fractional-order neural networks. The dominant oscillation frequency lies between 0.015 Hz and 0.08 Hz, indicating the transition from an inactive oscillating neural state to a controlled one.
In Figure 4(a), $w_2(t)$ versus $w_3(t)$ shows a bounded elliptical trajectory centered at the origin, fluctuating between $\pm 3$ and $\pm 2$, indicating stable behavior. This closed-loop pattern demonstrates synchronization between neuron states influenced by the fractional-order dynamics, and the system's smooth evolution in a bounded phase space aligns with the stability criteria. The phase portrait of $w_2(t)$ against $w_1(t)$ in Figure 4(b) illustrates a bounded elliptical pattern centered at the origin, with $w_2(t)$ ranging from $-1.5$ to 1.5 and $w_1(t)$ from $-2.5$ to 2.5, indicating stable, periodic oscillations under the fractional-order dynamics. Figure 4(c) displays a dense elliptical shape for $w_6(t)$ versus $w_5(t)$, indicating confined states and system stability; the different dynamical scales lead to regular oscillations without chaotic behavior.
Figure 3. Time Evolution of w i Stable Oscillations with Complex Amplitude Modulation.
Figure 4. Weighted Synchronization of State Variables w 2 ( t ) , w 3 ( t ) , w 4 ( t ) , w 5 ( t )   and w 6 ( t ) .
The phase portrait of $w_6(t)$ versus $w_5(t)$ in Figure 5(a) displays a dense elliptical shape with values between $-15$ and 15, indicating strong correlation and synchronization. The fine oscillation in this pattern suggests multiple frequency components interacting nonlinearly, characteristic of fractional-order neural networks influenced by memory effects with a fractional-order $q$ around 0.9. In Figure 5(b), $w_4(t)$ against $w_3(t)$ presents a confined elliptical pattern between $-5$ and 5, indicating bounded trajectories and hence stability without divergence. With multiple interacting frequencies, the system operates in a stable, synchronized regime rather than exhibiting chaotic behavior.
Figure 5. Weighted Synchronization of State Variables w 3 ( t )

Conclusion

The neuron states $w_i$ become unstable whenever the incommensurate fractional-orders $q_i$ exceed their upper bounds. A congested stability within the interval $(-1.5, 1.5)$ and a chaotic effect beyond this interval are reported for the neuron state $w_2(t)$ synchronized against $w_3(t)$. Reducing the time delay $\sigma$ gives the state variable $w_4(t)$ more relaxation time to stabilize compared with the state variable $w_3(t)$. Excluding the two neuron states $w_5(t)$ and $w_6(t)$ enhances the stability of the system, and the enhanced stability in Figure 5(b) is explained by the addition of the extra parameter $b$. Setting the parameter $\alpha$ to 12.2 kept the training of the system consistent and allowed the model to converge successfully without ambiguity.

Nomenclature

Symbols    Representations
$D^{q_i}$    Fractional-order derivative
$q_i$    Incommensurate fractional-orders
$a, b$    Real numbers
$t$    Time
$\sigma$    Time delay
$w_i$    State variables (neuron states)
$\alpha$    Training parameter
$g, h$    Activation functions
$c_{nm}, d_{mn}$    Connection weights between neurons
$\mu_m, \nu_n$    Stability of internal neuron activities

References

  1. Li, Y.; Shen, S. Almost automorphic solutions for Clifford-valued neutral-type fuzzy cellular neural networks with leakage delays on time scales. Neurocomputing 2020, 417, 23–35. [Google Scholar]
  2. Xiu, C.; Zhou, R.; Liu, Y. New chaotic memristive cellular neural network and its application in a secure communication system. Chaos, Solitons & Fractals 2020, 141, 110316. [Google Scholar]
  3. Ji, L.; Chang, M.; Shen, Y.; Zhang, Q. Recurrent convolutions of binary-constrained cellular neural network for texture recognition. Neurocomputing 2020, 387, 161–171. [Google Scholar] [CrossRef]
  4. Kumar, R.; Das, S. Exponential stability of inertial BAM neural network with time-varying impulses and mixed time-varying delays via matrix measure approach. Communications in Nonlinear Science and Numerical Simulation 2020, 81, 105016. [Google Scholar] [CrossRef]
  5. Xu, C.; Liao, M.; Li, P.; Liu, Z.; Yuan, S. New results on pseudo almost periodic solutions of quaternion-valued fuzzy cellular neural networks with delays. Fuzzy Sets and Systems 2021, 411, 25–47. [Google Scholar] [CrossRef]
  6. Kobayashi, M. Complex-valued Hopfield neural networks with real weights in synchronous mode. Neurocomputing 2021, 423, 535–540. [Google Scholar] [CrossRef]
  7. Cui, W.; Wang, Z.; Jin, W. Fixed-time synchronization of Markovian jump fuzzy cellular neural networks with stochastic disturbance and time-varying delays. Fuzzy Sets and Systems 2021, 411, 68–84. [Google Scholar] [CrossRef]
  8. Huang, C.; Su, R.; Cao, J.; Xiao, S. Asymptotically stable high-order neutral cellular neural networks with proportional delays and D-operators. Mathematics and Computers in Simulation 2020, 171, 127–135. [Google Scholar] [CrossRef]
  9. Meng, B.; Wang, X.; Zhang, Z.; Wang, Z. Necessary and sufficient conditions for normalization and sliding mode control of singular fractional-order systems with uncertainties. Science China Information Sciences 2020, 63, 1–10. [Google Scholar] [CrossRef]
  10. Hsu, C.H.; Lin, J.J. Stability of traveling wave solutions for nonlinear cellular neural networks with distributed delays. Journal of Mathematical Analysis and Applications 2019, 470, 388–400. [Google Scholar] [CrossRef]
  11. Li, Y.; Qin, J. Existence and global exponential stability of periodic solutions for quaternion-valued cellular neural networks with time-varying delays. Neurocomputing 2018, 292, 91–103. [Google Scholar] [CrossRef]
  12. Tang, R.; Yang, X.; Wan, X. Finite-time cluster synchronization for a class of fuzzy cellular neural networks via non-chattering quantized controllers. Neural Networks 2019, 113, 79–90. [Google Scholar] [CrossRef]
  13. Wang, W. Finite-time synchronization for a class of fuzzy cellular neural networks with time-varying coefficients and proportional delays. Fuzzy Sets and Systems 2018, 338, 40–49. [Google Scholar] [CrossRef]
  14. Wang, S.; Zhang, Z.; Lin, C.; Chen, J. Fixed-time synchronization for complex-valued BAM neural networks with time-varying delays via pinning control and adaptive pinning control. Chaos Solitons & Fractals 2021, 153, 111583. [Google Scholar]
  15. Zhao, R.; Wang, B.; Jian, J. Global stabilization of quaternion-valued inertial BAM neural networks with time-varying delays via time-delayed impulsive control. Mathematics and Computers in Simulation 2022, 202, 223–245. [Google Scholar] [CrossRef]
  16. Li, Y.; Qin, J. Existence and global exponential stability of periodic solutions for quaternion-valued cellular neural networks with time-varying delays. Neurocomputing 2018, 292, 91–103. [Google Scholar] [CrossRef]
  17. Kong, F.; Zhu, Q.; Wang, K.; Nieto, J.J. Stability analysis of almost periodic solutions of discontinuous BAM neural networks with hybrid time-varying delays and D-operator. Journal of the Franklin Institute 2019, 356, 11605–11637. [Google Scholar] [CrossRef]
  18. Xu, C.; Zhang, Q. On the antiperiodic solutions for Cohen-Grossberg shunting inhibitory neural networks with time-varying delays and impulses. Neural Computation 2014, 26, 2328–2349. [Google Scholar] [CrossRef]
  19. Ali, M.S.; Yogambigai, J.; Saravanan, S.; Elakkia, S. Stochastic stability of neutral-type Markovian-jumping BAM neural networks with time-varying delays. Journal of Computational and Applied Mathematics 2019, 349, 142–156. [Google Scholar] [CrossRef]
  20. Cong, E.Y.; Han, X.; Zhang, X. Global exponential stability analysis of discrete-time BAM neural networks with delays: A mathematical induction approach. Neurocomputing 2020, 379, 227–235. [Google Scholar] [CrossRef]
  21. Ayachi, M. Measure-pseudo almost periodic dynamical behaviors for BAM neural networks with D operator and hybrid time-varying delays. Neurocomputing 2022, 486, 160–173. [Google Scholar] [CrossRef]
  22. Shi, J.; He, K.; Fang, H. Chaos, Hopf bifurcation, and control of a fractional-order delay financial system. Mathematics and Computers in Simulation 2022, 194, 348–364. [Google Scholar] [CrossRef]
  23. Xiao, J.; Wen, S.; Yang, X.; Zhong, S. New approach to global Mittag-Leffler synchronization problem of fractional-order quaternion-valued BAM neural networks based on a new inequality. Neural Networks 2020, 122, 320–337. [Google Scholar] [CrossRef]
  24. Xu, C.; Mu, D.; Pan, Y.; Aouiti, C.; Pang, Y.; Yao, L. Probing into bifurcation for fractional-order BAM neural networks concerning multiple time delays. Journal of Computational Science 2022, 62, 101701. [Google Scholar] [CrossRef]
  25. Xu, C.; Liao, M.; Li, P.; Guo, Y.; Liu, Z. Bifurcation properties for fractional-order delayed BAM neural networks. Cognitive Computation 2021, 13, 322–356. [Google Scholar] [CrossRef]
  26. Wang, F.; Yang, Y.; Xu, X.; Li, L. Global asymptotic stability of impulsive fractional-order BAM neural networks with time delay. Neural Computing and Applications 2017, 28, 345–352. [Google Scholar] [CrossRef]
  27. Ye, R.; Liu, X.; Zhang, H.; Cao, J. Global Mittag-Leffler synchronization for fractional-order BAM neural networks with impulses and multiple variable delays via delayed-feedback control strategy. Neural Processing Letters 2019, 49, 1–18. [Google Scholar] [CrossRef]
  28. Xu, C.; Liu, Z.; Aouiti, C.; Li, P.; Yao, L.; Yan. New exploration on bifurcation for fractional-order quaternion-valued neural networks involving leakage delays. Cognitive Neurodynamics 2022, 16, 1233–1248. [Google Scholar] [CrossRef] [PubMed]
  29. Xiao, J.; Guo, X.; Li, Y.; Wen, S.; Shi, K.; Tang, Y. Extended analysis on the global Mittag-Leffler synchronization problem for fractional-order octonion-valued BAM neural networks. Neural Networks 2022, 154, 491–507. [Google Scholar] [CrossRef]
  30. Popa, C.A. Mittag-Leffler stability and synchronization of neutral-type fractional-order neural networks with leakage delay and mixed delays. Journal of the Franklin Institute 2023, 360, 327–355. [Google Scholar] [CrossRef]
  31. Ci, J.; Guo, Z.; Long, H.; Wen, S.; Huang, T. Multiple asymptotic periodicities of fractional-order delayed neural networks under state-dependent switching. Neural Networks 2023, 157, 11–25. [Google Scholar] [CrossRef]
  32. Shah, D.K.; Vyawahare, V.A.; Sadanand, S. Artificial neural network approximation of special functions: design, analysis, and implementation. International Journal of Dynamics and Control 2025, 13, 1–23. [Google Scholar] [CrossRef]
  33. Admon, M.R.; Senu, N.; Ahmadian, A.; Majid, Z.A.; Salahshour, S. A new and modern scheme for solving fractal-fractional differential equations based on a deep feedforward neural network with multiple hidden layers. Mathematics and Computers in Simulation 2024, 218, 311–333. [Google Scholar] [CrossRef]
  34. Maurya, S.S.; Kannan, J.B.; Patel, K.; Dutta, P.; Biswas, K.; Santhanam, M.S.; U. D. Asymmetric dynamical localization and precision measurement of the micromotion of a Bose-Einstein condensate. Physical Review A 2024, 110, 053307. [Google Scholar] [CrossRef]
  35. Joshi, D.D.; Bhalekar, S.; Gade, P.M. Stability analysis of fractional differential equations with delay. Chaos: An Interdisciplinary Journal of Nonlinear Science 2024, 34. [Google Scholar]
  36. Chettouh, B. (2024). Stability, Bifurcations and Control in Fractional-order Chaotic Systems (Doctoral dissertation, Université Mohamed Khider (Biskra-Algérie)).
  37. Ur Rahman, H.; Shuaib, M.; Ismail, E.A.; Li, S. Enhancing medical ultrasound imaging through fractional mathematical modeling of ultrasound bubble dynamics. Ultrasonics Sonochemistry 2023, 100, 106603. [Google Scholar] [CrossRef]
  38. Gopalsamy, K.; He, X.Z. Delay-dependent stability in bidirectional associative memory networks. IEEE Transactions on Neural Networks 1994, 5, 998–1002. [Google Scholar] [CrossRef] [PubMed]
  39. Zhang, C.; Zheng, B.; Wang, L. Multiple Hopf bifurcations of a symmetric BAM neural network model with delay. Applied Mathematics Letters 2009, 22, 616–622. [Google Scholar] [CrossRef]
  40. Vinagre, B.M.; Chen, Y.Q.; Petráš, I. Two direct Tustin discretization methods for a fractional-order differentiator/integrator. Journal of the Franklin Institute 2003, 340, 349–362. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.