1. Introduction
The profound interconnection between Fourier analysis and dynamical systems has historically yielded fundamental insights into both mathematical theory and physical applications. Although Fourier transforms are widely employed in frequency-domain analysis, their potential to preserve the geometric invariants of Hamiltonian systems remains underexplored. This issue is particularly significant in modern interdisciplinary applications, where nonlinear dynamics intersect with data science. In such contexts, spectral techniques must balance structural fidelity with the extraction of intricate features—a challenge that conventional methodologies often do not fully address.
Building on classical results that link Poisson structures with non-commutative Fourier transforms, this work extends these principles to higher-dimensional phase spaces, thereby broadening foundational theorems to encompass more complex geometric contexts. These theoretical advancements underlie the development of adaptive spectral algorithms designed to preserve symplectic two-forms while detecting emergent bifurcations and chaotic behaviors. Rather than representing mere technical refinements, these algorithms offer a conceptual synthesis that bridges geometric mechanics with spectral analysis to tackle longstanding challenges in dynamical diagnostics.
To illustrate the versatility of this framework, we present case studies in quantum state tomography, signal processing, and neural Hamiltonian networks. These applications demonstrate not only the robustness of our approach but also its compatibility with conventional FFT methodologies. By emphasizing the intrinsic geometry of dynamical systems, this work refines spectral analysis techniques and provides fresh perspectives on understanding complexity in both natural and engineered systems.
2. State of the Art
Over the past century, dynamical systems and spectral analysis have become increasingly intertwined. Poincaré’s work [1] revealed chaos, Marsden [2] unified analysis and geometry, and Arnold [3], along with Folland [4], established key links between Hamiltonian systems and spectral methods.
While the Fourier transform remains fundamental in frequency analysis, it struggles to capture transient phenomena. To address this, multiscale techniques have emerged. Daubechies’ wavelet theory [5] improved time-frequency localization, while Lichtenberg and Lieberman [6] demonstrated how these methods reveal hierarchical structures in chaotic dynamics. Despite these advances, fully resolving rapidly evolving signals remains a challenge.
Moreover, chaos theory highlights the limitations of linear spectral tools in non-integrable regimes. A hybrid approach, combining geometric invariants and statistical methods, provides a more effective framework for analyzing complex dynamics.
This work aims to develop a unified framework that integrates Hamiltonian dynamics, spectral analysis, and algebraic structures. By combining these perspectives, we enhance the understanding of dynamical systems and enable new applications in fields such as quantum computing and machine learning.
3. Notations and Preliminaries
To ensure terminological and conceptual consistency throughout this paper, we introduce the notations, definitions, and fundamental results that will serve as a reference for the subsequent sections.
3.1. Phase Spaces and Symplectic Forms
Let $(M, \omega)$ be a symplectic manifold, where $M$ is a smooth manifold and $\omega$ is a closed, nondegenerate 2-form on $M$. A canonical example is the classical phase space $M = \mathbb{R}^{2n}$, with coordinates $(q_1, \dots, q_n, p_1, \dots, p_n)$, equipped with the standard symplectic form:
$$\omega = \sum_{i=1}^{n} dq_i \wedge dp_i.$$
3.2. Poisson Brackets
On the symplectic manifold $(M, \omega)$, the Poisson bracket of two functions $f, g \in C^{\infty}(M)$ is defined as:
$$\{f, g\} = \omega(X_f, X_g) = \sum_{i=1}^{n} \left( \frac{\partial f}{\partial q_i} \frac{\partial g}{\partial p_i} - \frac{\partial f}{\partial p_i} \frac{\partial g}{\partial q_i} \right),$$
where the Hamiltonian vector fields $X_f$ and $X_g$ are given by:
$$\iota_{X_f} \omega = df, \qquad \iota_{X_g} \omega = dg.$$
The Poisson bracket satisfies the following fundamental properties: antisymmetry, $\{f, g\} = -\{g, f\}$; the Jacobi identity, $\{f, \{g, h\}\} + \{g, \{h, f\}\} + \{h, \{f, g\}\} = 0$; and the Leibniz rule, $\{f, gh\} = \{f, g\}\, h + g\, \{f, h\}$.
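As a concrete numerical illustration (a sketch under our own simplifying choices, not part of the formal development), the canonical bracket on $\mathbb{R}^2$ can be approximated by central differences; the helper `pb` below is hypothetical and verifies the canonical relation $\{q, p\} = 1$ together with antisymmetry.

```python
# Numerical Poisson bracket on R^2 (n = 1) via central differences.
# Illustrative sketch only; the function names are ours.

def pb(f, g, q, p, h=1e-5):
    """Approximate {f, g}(q, p) = df/dq dg/dp - df/dp dg/dq."""
    dfq = (f(q + h, p) - f(q - h, p)) / (2 * h)
    dfp = (f(q, p + h) - f(q, p - h)) / (2 * h)
    dgq = (g(q + h, p) - g(q - h, p)) / (2 * h)
    dgp = (g(q, p + h) - g(q, p - h)) / (2 * h)
    return dfq * dgp - dfp * dgq

Q = lambda q, p: q                      # coordinate function q
P = lambda q, p: p                      # coordinate function p
H = lambda q, p: 0.5 * (q**2 + p**2)    # harmonic-oscillator Hamiltonian

print(pb(Q, P, 0.3, 0.7))   # canonical relation {q, p} = 1
print(pb(H, H, 0.3, 0.7))   # antisymmetry forces {H, H} = 0
```

The same finite-difference check extends directly to the Jacobi identity, at the cost of nested difference quotients.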
3.3. Fourier Transform
The Fourier transform of a function $f \in L^1(\mathbb{R}^n)$ is given by:
$$\hat{f}(\xi) = \int_{\mathbb{R}^n} f(x)\, e^{-2\pi i \langle x, \xi \rangle}\, dx.$$
Within our study, we emphasize the role of the Fourier transform in the Hamiltonian framework, particularly in preserving dynamical invariants.
3.4. Proofs and Useful Results
As an illustrative example, we recall the proof of the nondegeneracy of the standard symplectic form on $\mathbb{R}^{2n}$.
Proposition 1. The canonical symplectic form $\omega = \sum_{i=1}^{n} dq_i \wedge dp_i$ on $\mathbb{R}^{2n}$ is nondegenerate.
Proof. To verify nondegeneracy, we consider a vector field
$$X = \sum_{i=1}^{n} \left( a_i \frac{\partial}{\partial q_i} + b_i \frac{\partial}{\partial p_i} \right)$$
and compute its interior product with $\omega$:
$$\iota_X \omega = \sum_{i=1}^{n} \left( a_i\, dp_i - b_i\, dq_i \right).$$
If $\iota_X \omega = 0$, then for all $i$, $a_i = b_i = 0$. Thus, $X = 0$, confirming that $\omega$ is nondegenerate. □
This foundational discussion establishes the common framework upon which we will develop the Hamiltonian formalism and Fourier transform techniques, ensuring consistency in notation and concepts throughout the paper.
4. Theoretical Foundations
In this section, we establish the essential theoretical framework underpinning our approach. We articulate two complementary perspectives: the Hamiltonian formalism, which structures the dynamics of the systems under study, and the algebraic structure, which reveals the underlying symmetries and their role in spectral analysis. Together, these perspectives provide a robust and coherent foundation for our developments.
4.1. Hamiltonian Formalism
We consider phase spaces denoted $(M, \omega)$, where $M$ is a smooth manifold of dimension $2n$ equipped with a nondegenerate symplectic 2-form $\omega$. In the classical setting $M = \mathbb{R}^{2n}$, the canonical symplectic structure is given by:
$$\omega = \sum_{i=1}^{n} dq_i \wedge dp_i.$$
This structure plays a fundamental role in describing the evolution of dynamical systems [2].
Phase Spaces and Dynamic Invariants: The symplectic form ensures the conservation of invariants during Hamiltonian evolution. We illustrate these concepts with concrete examples, emphasizing the role of symplectic geometry in system dynamics.
- Poisson Brackets: For smooth functions $f, g \in C^{\infty}(M)$, the Poisson bracket is defined as
$$\{f, g\} = \sum_{i=1}^{n} \left( \frac{\partial f}{\partial q_i} \frac{\partial g}{\partial p_i} - \frac{\partial f}{\partial p_i} \frac{\partial g}{\partial q_i} \right),$$
where the Hamiltonian vector fields $X_f$ and $X_g$ are defined by $\iota_{X_f} \omega = df$ and $\iota_{X_g} \omega = dg$. This bracket satisfies antisymmetry, the Jacobi identity, and the Leibniz rule, making it central to the preservation of system invariants.
Connection with the Fourier Transform: The Hamiltonian formalism naturally integrates with spectral analysis. Specifically, the Fourier transform (FFT) can be interpreted as an operation that, under certain conditions, preserves dynamic invariants, highlighting a fundamental link between system dynamics and frequency analysis [3].
4.2. Algebraic Structure
Complementing the dynamical perspective, we examine the algebraic structure underlying spectral analysis.
By integrating these two perspectives—the Hamiltonian formalism and algebraic structures—we establish a unified theoretical framework supporting the analytical and numerical developments presented in the remainder of this paper.
5. Dynamic Analysis
This section develops a unified framework for characterizing dynamical behavior, synthesizing the algebraic architecture of Fourier transforms with the temporal evolution of physical systems. Moving beyond a mere enumeration of techniques, we craft a cohesive narrative that distinguishes integrable from non-integrable regimes and interrogates the intricate interplay of transitions and bifurcations governing the onset of chaos.
5.1. Integrable Systems
Integrable systems derive their analytical tractability from the existence of $n$ mutually commuting invariants $F_1, \dots, F_n$, which satisfy the Poisson commutativity condition:
$$\{F_i, F_j\} = 0, \qquad 1 \le i, j \le n.$$
This property facilitates a canonical reparametrization using action-angle coordinates $(I, \theta)$, simplifying both phase-space trajectory analysis and spectral decomposition. The structural coherence of these systems provides an ideal setting for developing geometrically consistent diagnostic tools.
5.1.1. Adaptive Fourier Transform
Building on the inherent regularity of action-angle coordinates, we propose an adaptive Fourier transform defined as
$$\hat{f}(k) = \int_{\mathbb{T}^n} f(\theta)\, e^{-i \langle k, \theta \rangle}\, d\mu(\theta),$$
where $d\mu$ is a symplectically compatible volume measure on the invariant torus $\mathbb{T}^n$ [3]. By aligning spectral resolution with the geometry of phase space, this method detects subtle perturbations in dynamical invariants—such as adiabatic variations in angular momentum—offering high sensitivity for integrable regimes.
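The effect can be illustrated with a minimal sketch: a plain discrete Fourier transform of a weakly perturbed angle variable, sampled uniformly on the torus (the uniform measure standing in for the symplectically compatible measure above). All parameters here are our own illustrative choices; the perturbation appears as sidebands around the unperturbed rotation frequency.

```python
import cmath
import math

# DFT of a perturbed rotation theta(t) = omega t + eps sin(Omega t),
# sampled uniformly on the torus. A weak periodic perturbation of the
# angle shows up as sidebands around the carrier frequency.

N = 128
carrier, mod, eps = 8, 3, 0.3   # carrier bin, perturbation bin, strength
signal = [cmath.exp(1j * (2 * math.pi * carrier * n / N
                          + eps * math.sin(2 * math.pi * mod * n / N)))
          for n in range(N)]

def dft_bin(x, k):
    """Magnitude of the k-th DFT coefficient, normalized by N."""
    n = len(x)
    return abs(sum(xm * cmath.exp(-2j * math.pi * k * m / n)
                   for m, xm in enumerate(x))) / n

print(dft_bin(signal, carrier))        # dominant line, ~ J0(eps) ~ 0.978
print(dft_bin(signal, carrier + mod))  # sideband, ~ J1(eps) ~ 0.148
```

The sideband amplitudes follow the Bessel-function expansion of $e^{i\epsilon \sin \phi}$, so even a small modulation of the angle leaves a clear spectral fingerprint.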
5.1.2. KAM Stability
The Kolmogorov-Arnold-Moser (KAM) theorem establishes that invariant tori persist under small perturbations $\epsilon H_1$ to an integrable Hamiltonian $H_0(I)$, provided the unperturbed frequency vector $\omega_0 = \partial H_0 / \partial I$ satisfies a Diophantine condition
$$|\langle k, \omega_0 \rangle| \ge \frac{\gamma}{|k|^{\tau}} \quad \text{for all } k \in \mathbb{Z}^n \setminus \{0\}$$
(for some $\gamma > 0$ and $\tau > n - 1$) and $\epsilon$ is sufficiently small [8]. Such resilience not only highlights the robustness of quasi-periodic motion but also guides the design of spectral tools for identifying near-integrable structures in chaotic environments.
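The Diophantine condition can be checked numerically for a concrete frequency vector. The snippet below (an illustrative check with $\tau = 1$ and the threshold chosen by us) scans integer vectors $k$ and confirms that the golden-mean frequency vector $\omega_0 = (1, \varphi)$ stays uniformly away from resonances, which is why golden-mean tori are famously the last to break up.

```python
import math

# Numerical check of a Diophantine condition |<k, w0>| >= gamma / |k|^tau
# (tau = 1) for the golden-mean frequency vector w0 = (1, phi).

phi = (1 + math.sqrt(5)) / 2
w0 = (1.0, phi)

def worst_resonance(kmax):
    """Smallest |<k, w0>| * |k|_inf over nonzero integer vectors with |k|_inf <= kmax."""
    best = float("inf")
    for k1 in range(-kmax, kmax + 1):
        for k2 in range(-kmax, kmax + 1):
            if (k1, k2) == (0, 0):
                continue
            val = abs(k1 * w0[0] + k2 * w0[1]) * max(abs(k1), abs(k2))
            best = min(best, val)
    return best

# the product stays bounded away from zero, so gamma > 0 exists for tau = 1
print(worst_resonance(50))
```

The near-resonant vectors found by the scan are exactly the Fibonacci pairs, the continued-fraction convergents of $\varphi$.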
While the KAM theorem provides a powerful framework for understanding the persistence of ordered motion in nearly integrable systems, many physical systems exhibit a more intricate interplay between regularity and chaos. In such non-integrable systems, the Hamiltonian flow can be decomposed into regular and chaotic components, offering a nuanced perspective on their dynamical behavior.
5.2. Non-Integrable Systems
Non-integrable systems exhibit a fascinating duality, intertwining ordered and chaotic dynamics. To disentangle this complexity, we decompose the Hamiltonian flow $\Phi_t$ as
$$\Phi_t = \Phi_t^{\mathrm{reg}} \circ \Phi_t^{\mathrm{chaos}},$$
where $\Phi_t^{\mathrm{reg}}$ captures predictable trajectories and $\Phi_t^{\mathrm{chaos}}$ encodes stochasticity. This decomposition enables targeted spectral analysis of each dynamical component, revealing hidden patterns in mixed regimes.
5.2.1. Multiscale Fourier Transform
To resolve transient and multiscale features in chaotic signals, our approach employs a windowed Fourier transform:
$$\tilde{f}(\xi, t) = \int_{\mathbb{R}} f(\tau)\, w_{\sigma}(\tau - t)\, e^{-i \xi \tau}\, d\tau,$$
where the adaptive window $w_{\sigma}$ scales with $\sigma$ to localize frequency content [5]. This technique bridges micro-scale fluctuations (e.g., Lyapunov exponent variations) with macro-scale instabilities, uncovering hierarchical structures in chaotic attractors.
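A minimal realization of this idea is sketched below, assuming a Gaussian window and a synthetic test signal of our own choosing: sliding the window along a signal whose frequency switches halfway through recovers the local dominant frequency on each side of the switch.

```python
import cmath
import math

# Minimal windowed (short-time) Fourier transform: a Gaussian window of
# width sigma slides along the signal, and at each position we read off
# the dominant frequency bin. Signal and parameters are illustrative.

N = 256
# frequency switches from bin 5 to bin 20 halfway through the record
x = [math.sin(2 * math.pi * (5 if n < N // 2 else 20) * n / N) for n in range(N)]

def dominant_bin(x, center, sigma, kmax=40):
    """Frequency bin with the largest windowed-DFT magnitude at `center`."""
    n = len(x)
    w = [math.exp(-0.5 * ((m - center) / sigma) ** 2) for m in range(n)]
    best_k, best_mag = 0, -1.0
    for k in range(1, kmax):
        c = sum(x[m] * w[m] * cmath.exp(-2j * math.pi * k * m / n)
                for m in range(n))
        if abs(c) > best_mag:
            best_k, best_mag = k, abs(c)
    return best_k

print(dominant_bin(x, 64, 20))    # early window: bin 5
print(dominant_bin(x, 192, 20))   # late window: bin 20
```

A global Fourier transform of the same signal would show both lines at once, with no indication of when the switch occurred; the window restores that temporal information.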
5.2.2. Analysis of Chaotic Structures
The analysis of chaotic regimes necessitates a multi-faceted diagnostic approach. A key metric for quantifying erratic dynamics is the spectral persistence, defined as:
$$P(\xi) = \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} \left| \tilde{f}(\xi, t) \right|^2 dt.$$
This metric measures the time-averaged spectral intensity. To complement this spectral approach, techniques from [9] are employed to reconstruct invariant manifolds. Furthermore, resonance analysis, such as that used to identify overlapping resonances in galactic dynamics, helps reveal nonlinear interactions that amplify chaotic behavior. Collectively, these methods decompose chaotic dynamics into measurable components, linking microscopic instabilities to macroscopic disorder.
5.3. Transitions and Bifurcations
The transition from ordered to chaotic behavior in dynamical systems is often characterized by bifurcations—critical thresholds where the system’s qualitative dynamics undergo a significant change. Rather than relying solely on rigid analytical protocols, our framework integrates the analysis of spectral evolution with the study of geometric invariants to provide a comprehensive description of the emergence of instabilities.
5.3.1. Bifurcation Detection
The sensitivity of the system’s spectrum to parametric variations is quantified by the following bifurcation indicator:
$$B(\mu) = \frac{\left\| \hat{f}_{\mu + \delta\mu} - \hat{f}_{\mu} \right\|}{\delta\mu},$$
where $\mu$ represents a control parameter and $\delta\mu$ denotes an incremental perturbation. This metric effectively detects bifurcations by identifying abrupt reconfigurations in the system’s spectrum, even in weakly perturbed systems. For example, it can be used to detect Hopf bifurcations in fluid flows.
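The mechanism behind such an indicator can be sketched on a simple discrete-time stand-in (our own choice, not a system studied in this paper): the logistic map, which period-doubles near $r \approx 3.449$. Before the bifurcation the attractor is a 2-cycle and carries spectral power only at the half frequency; after it, a new line appears at the quarter frequency, which is exactly the kind of abrupt spectral reconfiguration the indicator is designed to detect.

```python
import cmath
import math

# Spectral bifurcation indicator on a discrete-time stand-in:
# the logistic map x -> r x (1 - x), which period-doubles near r = 3.449.

def spectrum(r, N=64, burn=500):
    """Normalized DFT magnitudes of a post-transient logistic-map orbit."""
    x = 0.4
    for _ in range(burn):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(N):
        x = r * x * (1 - x)
        orbit.append(x)
    return [abs(sum(o * cmath.exp(-2j * math.pi * k * n / N)
                    for n, o in enumerate(orbit))) / N
            for k in range(N)]

def indicator(r, dr=0.05):
    """|| spectrum(r + dr) - spectrum(r) || / dr, cf. the bifurcation metric."""
    s0, s1 = spectrum(r), spectrum(r + dr)
    return sum((a - b) ** 2 for a, b in zip(s0, s1)) ** 0.5 / dr

print(spectrum(3.4)[16])   # quarter-frequency line absent before the bifurcation
print(spectrum(3.5)[16])   # quarter-frequency line present after
print(indicator(3.42))     # indicator spanning the bifurcation
```

The appearance of subharmonic lines under parameter variation is the discrete analogue of the spectral signature of a period-doubling bifurcation in a flow.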
5.3.2. Transition Analysis
The transition between different dynamical regimes is governed by complex interactions that can be analyzed using three complementary approaches. Local spectral analysis examines how Fourier spectra, or other spectral transforms, evolve as system parameters change. This allows for the detection of gradual transitions and subtle instabilities. For instance, analyzing the emission spectrum of a laser can reveal shifts between different operating modes. By continuously monitoring spectral characteristics such as frequency shifts and amplitude variations, this method helps identify early signs of bifurcations.
Invariant tracking provides a complementary perspective by focusing on conserved quantities, such as energy or angular momentum, to detect critical stability thresholds. In molecular dynamics, for example, monitoring the total energy of a system can indicate phase transitions. When these invariants exhibit abrupt changes or deviate beyond a predefined threshold—using bifurcation theory criteria or stability analysis—they signal the onset of qualitative dynamical changes, often linked to phase transitions or symmetry breaking.
Finally, bifurcation cascade analysis studies sequences of successive instabilities that progressively alter the system’s behavior, potentially leading to turbulence or chaos. This approach combines bifurcation diagrams, Poincaré maps, and fractal dimension measures to capture nonlinear interactions where small perturbations accumulate and trigger large-scale structural changes.
By integrating these three methods—local spectral analysis, invariant tracking, and bifurcation cascade analysis—we gain a comprehensive understanding of how ordered behavior deteriorates, improving our ability to predict and control complex dynamical systems.
6. Numerical Methods
The exploration of complex Hamiltonian dynamics necessitates numerical methods that preserve geometric structure while ensuring computational fidelity. This section details symplectic integration schemes tailored to these requirements, emphasizing their stability, accuracy, and scalability. Conventional numerical integrators, such as explicit Runge-Kutta methods, often introduce artificial dissipation, distorting long-term Hamiltonian evolution. In contrast, symplectic schemes inherently conserve geometric invariants, making them indispensable for accurate simulations. A paradigmatic example is the Verlet-Störmer scheme, widely used in celestial mechanics [10]. For a separable Hamiltonian $H(q, p) = \tfrac{1}{2} p^{\top} p + V(q)$ and step size $h$, this method proceeds through three stages:
$$p_{n+1/2} = p_n - \frac{h}{2} \nabla V(q_n), \qquad q_{n+1} = q_n + h\, p_{n+1/2}, \qquad p_{n+1} = p_{n+1/2} - \frac{h}{2} \nabla V(q_{n+1}),$$
where $p$ and $q$ denote momentum and position, respectively. Despite its simplicity, this scheme excels in energy conservation over astronomically long simulations and naturally aligns with spectral trajectory analysis [11]. For enhanced precision in chaotic regimes, symplectic Runge-Kutta methods (e.g., Gauss-Legendre) solve implicit systems of the form:
$$Y_i = y_n + h \sum_{j=1}^{s} a_{ij}\, f(Y_j), \qquad y_{n+1} = y_n + h \sum_{i=1}^{s} b_i\, f(Y_i),$$
where the coefficients $(a_{ij}, b_i)$ satisfy $b_i a_{ij} + b_j a_{ji} = b_i b_j$ and thus ensure symplecticity [12]. These methods capture intricate nonlinear behaviors—such as homoclinic tangles in pendulum arrays—but incur higher computational costs due to the implicit equation solving. To validate these methods, we apply them to benchmark problems: the restricted three-body problem (testing quasi-periodic orbit stability) and the nonlinear Schrödinger equation (evaluating norm and spectral conservation). These tests confirm the superior performance of symplectic schemes in preserving both geometric and spectral integrity.
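The three-stage Verlet-Störmer scheme above can be sketched in a few lines. The test problem (a harmonic oscillator with $H = (p^2 + q^2)/2$), the step size, and the step count are our own choices; the point is that the energy error stays bounded instead of drifting, as expected of a symplectic integrator.

```python
# Kick-drift-kick Verlet-Störmer integrator for H = p^2/2 + V(q),
# applied to the harmonic oscillator V(q) = q^2/2 (illustrative test case).

def verlet(q, p, grad_v, h, steps):
    """Three-stage leapfrog; returns the list of (q, p) states."""
    traj = [(q, p)]
    for _ in range(steps):
        p -= 0.5 * h * grad_v(q)   # half kick
        q += h * p                 # drift
        p -= 0.5 * h * grad_v(q)   # half kick
        traj.append((q, p))
    return traj

grad_v = lambda q: q               # V(q) = q^2 / 2
traj = verlet(1.0, 0.0, grad_v, h=0.01, steps=10_000)

energy = [0.5 * (p * p + q * q) for q, p in traj]
drift = max(abs(e - energy[0]) for e in energy)
print(drift)   # bounded energy error, O(h^2)
```

A non-symplectic integrator of the same order would show a secular growth of this error over comparable horizons.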
A rigorous evaluation of numerical errors is essential to validate our computational results. Our error analysis focuses on three key aspects: stability, convergence, and computational complexity. While traditional stability analysis relies on the Courant–Friedrichs–Lewy (CFL) condition, symplectic integrators also necessitate an examination of Lyapunov stability [13]. In chaotic regimes, we numerically compute the spectrum of Lyapunov exponents using Benettin’s method to provide a quantitative measure of stability. The global error of an integrator is a crucial metric. For a method of order $p$ with step size $h$, the error typically scales as:
$$E(h) \le C\, h^{p}.$$
We compare different schemes by determining the constant $C$ in scenarios where analytical solutions are available, thereby ensuring that our numerical methods are both accurate and reliable. We also evaluate the computational cost associated with each method. For example, the Gauss-Legendre schemes require solving implicit systems whose cost is superlinear in the number of stages $s$, whereas explicit methods like the Verlet-Störmer scheme typically exhibit linear complexity in $s$. This balance between precision and efficiency is a central theme in our discussion.
To illustrate the practical effects of numerical errors, we present graphical visualizations of trajectories and Poincaré sections. Specifically, by analyzing the Fourier spectrum of the errors, we can identify the dominant frequencies that characterize perturbations. For example, we integrate a chaotic orbit in a Hénon-Heiles potential:
$$V(x, y) = \frac{1}{2}\left(x^2 + y^2\right) + x^2 y - \frac{y^3}{3}.$$
Examining the Fourier transform of the resulting trajectories allows us to quantify the drift introduced by different integration schemes, thus verifying their compatibility with the system’s dynamics.
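A complementary way to quantify this drift, sketched below with an initial condition and step size of our own choosing, is to integrate the same Hénon-Heiles orbit with the symplectic leapfrog scheme and with explicit Euler and compare their energy errors directly.

```python
# Energy drift comparison for an orbit in the Henon-Heiles potential
# V(x, y) = (x^2 + y^2)/2 + x^2 y - y^3/3:
# symplectic leapfrog versus (non-symplectic) explicit Euler.

def grad_v(x, y):
    return (x + 2 * x * y, y + x * x - y * y)

def energy(x, y, px, py):
    return (0.5 * (px * px + py * py)
            + 0.5 * (x * x + y * y) + x * x * y - y ** 3 / 3)

def leapfrog(x, y, px, py, h):
    gx, gy = grad_v(x, y)
    px, py = px - 0.5 * h * gx, py - 0.5 * h * gy
    x, y = x + h * px, y + h * py
    gx, gy = grad_v(x, y)
    return x, y, px - 0.5 * h * gx, py - 0.5 * h * gy

def euler(x, y, px, py, h):
    gx, gy = grad_v(x, y)
    return x + h * px, y + h * py, px - h * gx, py - h * gy

def drift(step, h=0.01, n=5000):
    x, y, px, py = 0.1, 0.0, 0.0, 0.35   # bounded low-energy orbit
    e0 = energy(x, y, px, py)
    worst = 0.0
    for _ in range(n):
        x, y, px, py = step(x, y, px, py, h)
        worst = max(worst, abs(energy(x, y, px, py) - e0))
    return worst

print(drift(leapfrog), drift(euler))   # symplectic drift is far smaller
```

The Euler energy error grows steadily, while the leapfrog error oscillates within a narrow band, mirroring the spectral comparison described in the text.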
7. Applications
Beyond theoretical development and numerical experimentation, we illustrate our approach through a variety of case studies that highlight its practical relevance. These applications span diverse fields, including quantum mechanics, signal processing, and machine learning, and they underscore the versatility of our methods.
7.1. Quantum Mechanics
Quantum systems benefit significantly from techniques grounded in Hamiltonian structures and nonlocal transforms. In our work, we focus on two key aspects:
- Coherent State Simulation and Wave Packet Dynamics: Coherent states, which minimize Heisenberg’s uncertainty, are fundamental in quantum optics and information. Their evolution is governed by the Schrödinger equation:
$$i\hbar\, \frac{\partial \psi}{\partial t} = \hat{H}\, \psi.$$
By employing fast Fourier transform (FFT) methods, particularly through pseudo-spectral schemes [14], we achieve efficient spectral solutions that better conserve the system’s geometric properties compared to standard time-discretization methods.
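The pseudo-spectral idea can be sketched with a split-step scheme: a potential half-step applied in position space, a kinetic step applied in Fourier space, and another potential half-step. The grid size, potential, initial state, and the naive $O(N^2)$ DFT below are our own simplifying choices (a production code would use an FFT); since both sub-steps are unitary, the discrete norm is conserved to machine precision.

```python
import cmath
import math

# Split-step pseudo-spectral sketch for i dpsi/dt = -1/2 psi'' + V psi
# (hbar = m = 1) on a periodic grid, with a unitary O(N^2) DFT.

N, L, dt = 64, 20.0, 0.01
dx = L / N
xs = [-L / 2 + i * dx for i in range(N)]
ks = [2 * math.pi / L * (i if i < N // 2 else i - N) for i in range(N)]

def dft(x, sign):
    """Unitary DFT (sign = -1) / inverse DFT (sign = +1)."""
    n = len(x)
    return [sum(x[m] * cmath.exp(sign * 2j * math.pi * k * m / n)
                for m in range(n)) / math.sqrt(n) for k in range(n)]

psi = [cmath.exp(-(x - 1.0) ** 2) for x in xs]      # Gaussian wave packet
norm0 = sum(abs(a) ** 2 for a in psi) * dx
psi = [a / math.sqrt(norm0) for a in psi]

V = [0.5 * x * x for x in xs]                       # harmonic trap
for _ in range(50):
    psi = [a * cmath.exp(-0.5j * dt * v) for a, v in zip(psi, V)]       # V half-step
    phat = dft(psi, -1)
    phat = [a * cmath.exp(-0.5j * dt * k * k) for a, k in zip(phat, ks)]  # kinetic step
    psi = dft(phat, +1)
    psi = [a * cmath.exp(-0.5j * dt * v) for a, v in zip(psi, V)]       # V half-step

print(sum(abs(a) ** 2 for a in psi) * dx)   # norm stays 1 to machine precision
```

Exact norm conservation is precisely the geometric property referred to in the text: each sub-step is a unitary, hence symplectic, map on the discretized wave function.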
- Entanglement Analysis via Nonlocal Transforms: Quantum entanglement, a cornerstone in cryptography and quantum teleportation, can be studied using nonlocal transforms. For instance, the Von Neumann entropy,
$$S(\rho) = -\mathrm{Tr}\left( \rho \log \rho \right),$$
is evaluated using global transformation bases such as quantum wavelets [15]. By integrating Fourier analysis with Hamiltonian techniques, our approach yields a more nuanced understanding of quantum correlations.
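For a single qubit the entropy has a closed form via the eigenvalues of the $2 \times 2$ density matrix, which the sketch below (with example states of our own choosing) makes concrete: a pure state has $S = 0$, while the reduced state of a maximally entangled Bell pair is maximally mixed with $S = \ln 2$.

```python
import math

# Von Neumann entropy S(rho) = -Tr(rho log rho) for a single qubit,
# computed from the eigenvalues of a 2x2 Hermitian density matrix.

def qubit_entropy(rho):
    """Entropy of rho = [[a, b], [conj(b), d]] via its eigenvalues."""
    a, b, d = rho[0][0], rho[0][1], rho[1][1]
    tr, det = a + d, a * d - abs(b) ** 2
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    eigs = [(tr + disc) / 2, (tr - disc) / 2]
    return -sum(l * math.log(l) for l in eigs if l > 1e-12)

pure = [[1.0, 0.0], [0.0, 0.0]]    # |0><0|, no entanglement
mixed = [[0.5, 0.0], [0.0, 0.5]]   # reduced state of a Bell pair
print(qubit_entropy(pure))         # 0.0
print(qubit_entropy(mixed))        # ln 2 ~ 0.6931
```

The entropy of the reduced state is exactly the entanglement measure invoked in the text: it vanishes for product states and reaches $\ln 2$ for a maximally entangled qubit pair.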
7.2. Signal Processing
The integration of Hamiltonian principles into signal processing has opened new avenues for innovation in data compression and filtering. Our work focuses on two primary directions, each leveraging the geometric structure of Hamiltonian systems to enhance traditional techniques.
Hamiltonian Filtering and Invariant Preservation: Conventional filters, such as Kalman or Wiener filters, are optimized to minimize mean square error. However, in complex systems—ranging from spectral analysis to speech recognition—preserving the underlying dynamic features is crucial. Hamiltonian-inspired filtering strategies retain key invariants, such as spectral energy and higher-order moments, leading to improved temporal stability and reduced numerical artifacts [12]. By embedding symplectic structure into the filtering process, these methods ensure that the geometric properties of the signal are preserved, even in nonlinear or non-stationary regimes.
Spectral Compression via Hamilton-Jacobi Structures: While the standard Fourier transform excels at identifying frequency characteristics in stationary signals, non-stationary or multiresolution signals demand more adaptive techniques. By solving the Hamilton-Jacobi equation,
$$\frac{\partial S}{\partial t} + H\!\left(q, \frac{\partial S}{\partial q}\right) = 0,$$
we construct tailored transforms that optimize the separation of frequency components in an extended phase space. This approach is particularly effective in applications such as Doppler radar processing and physiological signal analysis [16], where the ability to resolve transient features is critical.
7.3. Machine Learning
The fusion of Hamiltonian principles with machine learning represents a rapidly evolving frontier, offering fresh perspectives on optimization, data-driven modeling, and representation learning. Traditional deep learning methods typically operate in Euclidean spaces, often neglecting the inherent geometric constraints of the systems they model. By incorporating symplectic structures and Hamiltonian dynamics, we can enhance the interpretability, efficiency, and stability of learning algorithms. Hamiltonian Neural Networks (HNNs) [17] offer a paradigm shift in modeling dynamical systems within deep learning frameworks. Rather than approximating trajectories through black-box architectures, HNNs learn an underlying Hamiltonian function $H_{\theta}(q, p)$ directly from data, ensuring that the system’s evolution adheres to fundamental physical laws. Specifically, the governing equations are:
$$\dot{q} = \frac{\partial H_{\theta}}{\partial p}, \qquad \dot{p} = -\frac{\partial H_{\theta}}{\partial q}.$$
This approach offers several key advantages: energy conservation (unlike standard recurrent or feedforward networks, HNNs inherently preserve conserved quantities, such as total energy, leading to more physically consistent predictions); generalization to unseen dynamics (by encoding the system’s underlying physics in the learned Hamiltonian, HNNs can extrapolate more accurately to new initial conditions and longer time horizons); and reduced data requirements (the incorporation of prior physical knowledge often allows HNNs to achieve accurate modeling with fewer training examples compared to purely data-driven methods). HNNs have found application in a wide range of disciplines, including molecular dynamics (predicting long-term atomic configurations while preserving physical constraints, such as momentum conservation), robotics and control (learning energy-efficient motion trajectories that respect the constraints of mechanical systems), and astrodynamics (modeling planetary orbits and satellite trajectories with greater accuracy than conventional neural architectures).
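The HNN training objective can be illustrated in miniature. Below, a quadratic ansatz $H_{\theta}(q, p) = a q^2 + b p^2 + c\, q p$ stands in for a neural network (all parameter names, data, and hyperparameters are our own illustrative choices), and gradient descent fits $(a, b, c)$ so that $(\dot{q}, \dot{p}) = (\partial H / \partial p, -\partial H / \partial q)$ matches derivative data generated by the true Hamiltonian $H = (q^2 + p^2)/2$.

```python
# Toy HNN: fit H_theta(q, p) = a q^2 + b p^2 + c q p by gradient descent on
# the mismatch between Hamilton's equations and observed derivative data.

# training data: phase-space points with true derivatives qdot = p, pdot = -q
pts = [(q / 2, p / 2) for q in range(-2, 3) for p in range(-2, 3)]
data = [(q, p, p, -q) for q, p in pts]

a, b, c, lr = 0.0, 0.0, 0.0, 0.01
for _ in range(2000):
    ga = gb = gc = 0.0
    for q, p, qd, pd in data:
        rq = (2 * b * p + c * q) - qd    # residual in dH/dp - qdot
        rp = (2 * a * q + c * p) + pd    # residual in -dH/dq - pdot (sign folded in)
        ga += 2 * rp * 2 * q             # dLoss/da
        gb += 2 * rq * 2 * p             # dLoss/db
        gc += 2 * (rq * q + rp * p)      # dLoss/dc
    a, b, c = a - lr * ga, b - lr * gb, c - lr * gc

print(round(a, 3), round(b, 3), round(c, 3))   # recovers 0.5, 0.5, 0.0
```

A real HNN replaces the quadratic ansatz with a neural network and the hand-written gradients with automatic differentiation, but the loss has exactly this structure.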
Standard optimization algorithms in deep learning, such as Stochastic Gradient Descent (SGD) and Adam, assume that parameter updates follow Euclidean trajectories in weight space. However, many high-dimensional problems exhibit an underlying geometric structure that these methods fail to exploit. A more principled approach leverages symplectic integration techniques within optimization. Specifically, a symplectic gradient update is expressed as:
$$\theta_{k+1} = \theta_k + \eta\, J\, \nabla_{\theta} \mathcal{L}(\theta_k),$$
where $\eta$ is the step size and $J$ is a skew-symmetric Poisson structure that preserves the symplectic form. This approach offers several advantages: long-term stability (symplectic optimizers mitigate numerical dissipation, leading to more stable convergence properties); enhanced exploration of the loss landscape (by respecting Hamiltonian flows, these methods naturally explore low-energy manifolds in parameter space, reducing the likelihood of becoming trapped in poor local minima); and efficient training for physical systems (when applied to models learning from dynamical data, symplectic optimization ensures that learned representations maintain consistency with the underlying physics). Recent studies have demonstrated the effectiveness of symplectic optimization in diverse applications, including geometric deep learning (incorporating Poisson structures in Graph Neural Networks (GNNs) to enhance feature propagation on manifolds), physics-informed neural networks (PINNs) (improving the stability of training PDE-constrained neural networks using Hamiltonian-inspired optimizers), and reinforcement learning (applying symplectic updates to policy gradient methods for better modeling of continuous control tasks in robotics).
Beyond supervised learning, Hamiltonian methods are increasingly being integrated into generative modeling frameworks. Traditional generative adversarial networks (GANs) and variational autoencoders (VAEs) typically operate in Euclidean latent spaces, often struggling to capture the intricate phase-space structures inherent in physical data. By enforcing a Hamiltonian structure on the latent space, we introduce a more physically meaningful approach to generative modeling. A
Hamiltonian Variational Autoencoder (H-VAE) extends the standard VAE framework by learning a latent space endowed with symplectic structure. Instead of encoding data into arbitrary latent variables, the model ensures that the representations follow Hamiltonian dynamics:
$$\dot{z} = J\, \nabla_z H(z), \qquad z = (q, p).$$
This approach yields several advantages: structured latent representations (unlike conventional VAEs, which often yield entangled feature spaces, Hamiltonian representations naturally separate conserved quantities from transient dynamics); improved sample efficiency (the phase-space constraints enable the model to capture the underlying distribution with fewer training samples); and enhanced interpretability (in scientific applications, latent variables correspond to physically meaningful quantities such as energy, angular momentum, or entropy). Such models have found promising applications in astronomical data generation (simulating galactic dynamics with phase-space constraints to ensure realistic structure formation), molecular design (encoding molecular configurations in a Hamiltonian latent space to improve chemical property prediction), and dynamical system identification (learning generative models that can predict and reconstruct complex dynamical phenomena with greater fidelity than traditional black-box approaches).
The integration of Hamiltonian structures into machine learning is still in its nascent stages, yet it has already demonstrated significant advantages in stability, efficiency, and interpretability. Several exciting research avenues remain open: quantum-inspired machine learning (extending Hamiltonian deep learning to quantum circuits and variational quantum algorithms), manifold-based optimization (developing optimization techniques that explicitly exploit symplectic and Poisson geometries in high-dimensional learning problems), and hybrid models for partial observability (combining Hamiltonian and stochastic representations to model real-world systems where noise and missing data play a role). As computational power and theoretical understanding continue to advance, the synergy between Hamiltonian dynamics and machine learning promises to bridge fundamental mathematics with cutting-edge artificial intelligence, leading to novel applications across physics, engineering, and beyond.
8. Conclusion and Perspectives
This work is part of an ongoing dialogue at the intersection of Hamiltonian dynamics, numerical analysis, and machine learning. The integration of geometric structures into computational frameworks has already demonstrated its effectiveness in improving the stability, interpretability, and accuracy of models across numerous fields. However, several fundamental questions remain open, requiring further investigation.
While symplectic integrators and nonlocal transforms provide powerful tools for the analysis of complex systems, their practical implementation still poses significant computational challenges. Advanced numerical schemes, such as symplectic Runge-Kutta integrators and adaptive Fourier transforms, involve solving implicit equations in high-dimensional phase spaces. This complexity leads to high computational costs for large-scale systems. The integration of optimization strategies, such as parallel computing and reduced-order models, within Hamiltonian frameworks represents a promising avenue to overcome this limitation while ensuring the preservation of geometric invariants.
The experimental validation of machine learning techniques, particularly Hamiltonian neural networks (HNNs) and symplectic optimization, also constitutes a key challenge. Although these approaches possess interesting theoretical properties, their effectiveness on real-world systems, especially in the presence of noise or in complex industrial environments, remains to be demonstrated. Additional efforts are needed to assess their robustness and generalization on diverse datasets.
Another emerging area of research concerns the application of Hamiltonian methods to quantum cryptography. By leveraging symplectic invariants, nonlocal transforms could potentially improve quantum key distribution (QKD). However, their practical implementation still faces several obstacles, notably resistance to decoherence and adversarial attacks in real quantum channels. Combining Hamiltonian algorithms with advanced quantum error correction techniques could open new perspectives for cryptographic security.
Furthermore, many physical and biological systems deviate from purely Hamiltonian dynamics. Adapting methods developed for conservative systems to dissipative contexts is a major challenge. This particularly concerns open quantum systems governed by Lindblad dynamics and classical systems including dissipation effects. Extending Hamiltonian tools to these non-conservative frameworks will require theoretical reformulation and numerical innovations but could lead to significant advances in diverse fields, such as biological modeling and material science.
In the long term, the convergence between physics, mathematics, and artificial intelligence will play a key role in the development of new approaches for the analysis and control of complex dynamical systems. By addressing these challenges, we can fully exploit the potential of methods inspired by Hamiltonian mechanics, thus opening the way for fundamental and applied advances in disciplines as varied as physics, engineering, and machine learning.
References
- Poincaré, H. Les Méthodes Nouvelles de la Mécanique Céleste; Gauthier-Villars, 1892. [Google Scholar]
- Marsden, J.E.; Ratiu, T.S. Introduction to Mechanics and Symmetry; Springer, 1999. [Google Scholar]
- Arnold, V.I. Mathematical Methods of Classical Mechanics; Springer, 1989. [Google Scholar]
- Folland, G.B. A Course in Abstract Harmonic Analysis; CRC Press, 1995. [Google Scholar]
- Daubechies, I. Ten Lectures on Wavelets; SIAM, 1992. [Google Scholar]
- Lichtenberg, A.J.; Lieberman, M.A. Regular and Chaotic Dynamics; Springer, 1992. [Google Scholar]
- Hall, T.; Theodor, A.H. An Analysis of the Limits of Variance-Reduced Methods for Machine Learning. Journal of Machine Learning Research 2015, 16, 345–398. [Google Scholar]
- Moser, J. On invariant curves of area-preserving mappings of an annulus. Nachr. Akad. Wiss. Göttingen Math.-Phys. Kl. II 1962, 1–20. [Google Scholar]
- Takens, F. Detecting strange attractors in turbulence. Lecture Notes in Mathematics 1981, 898, 366–381. [Google Scholar]
- Hairer, E.; Lubich, C.; Wanner, G. Geometric Numerical Integration: Structure-Preserving Algorithms for Ordinary Differential Equations; Springer, 2006. [Google Scholar]
- Yoshida, H. Construction of higher order symplectic integrators. Physics Letters A 1990, 150, 262–268. [Google Scholar] [CrossRef]
- Sanz-Serna, J.M.; Calvo, M.P. Numerical Hamiltonian Problems. Acta Numerica 1994, 3, 51–95. [Google Scholar]
- Benettin, G.; Galgani, L.; Giorgilli, A.; Strelcyn, J.-M. Lyapunov characteristic exponents for smooth dynamical systems and for Hamiltonian systems; a method for computing all of them. Meccanica 1980, 15, 9–20. [Google Scholar]
- Trefethen, L.N. Spectral Methods in MATLAB; SIAM: Philadelphia, 2000. [Google Scholar]
- Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information; Cambridge University Press: Cambridge, 2000. [Google Scholar]
- Mallat, S. A Wavelet Tour of Signal Processing: The Sparse Way; Academic Press: San Diego, 2009. [Google Scholar]
- Greydanus, S.; Dzamba, M.; Yosinski, J. Hamiltonian Neural Networks. arXiv 2019, arXiv:1906.01563. [Google Scholar]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).