Computer Science and Mathematics

Concept Paper
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Felipe Valentim

Abstract: During the pandemic, the positive value of technologies was emphasized. In the post-pandemic era, shortly after confinement measures were eased, their negative aspects also became evident again. Despite this, it is generally agreed that technological advancement will always have a positive balance, provided that lack of access to technology does not create injustice; this can be the subject of studies on digital inclusion. In turn, the set of values and practices that seeks to ensure that the development and use of artificial intelligence (AI) systems is safe, fair, and responsible is discussed within the ethics and moral philosophy of AI. This work attempts to generalize the framework presented by Michalski et al. (2025) and discusses a) norms for evaluating needs and areas of application, b) definition of the values of the methods, and c) definition of criteria for comparing techniques.

Article
Computer Science and Mathematics
Mathematics

Raoul Bianchetti

Abstract: The Birch and Swinnerton-Dyer (BSD) conjecture establishes a deep connection between the arithmetic structure of elliptic curves and the analytic behavior of their associated L-functions, yet its conceptual interpretation remains elusive despite extensive partial results. In this work, we propose an informational–geometric reinterpretation of the conjecture within the framework of Viscous Time Theory (VTT), in which arithmetic invariants are mapped to measurable quantities governing informational coherence. Within this framework, the canonical height is interpreted as a coherence potential, the Mordell–Weil rank as the dimensionality of stable coherence directions, and the BSD regulator as an informational volume. The analytic behavior of the L-function near the critical point s = 1 is reinterpreted as a global coherence response of the underlying informational manifold. This leads naturally to a regime-dependent conservation principle for informational coherence, under which the classical BSD identity emerges as a stable balance condition. To test this formulation, we perform independent numerical validation on benchmark elliptic curves with established BSD data. An informational L-function is fitted to empirical analytic profiles, from which informational curvature and volume are derived. The results show strong quantitative agreement between informational volumes and arithmetic regulators, with correlation coefficients exceeding 0.99, and demonstrate robust, rank-dependent stability behavior across curves of varying complexity. The framework is further explored under increasing geometric and topological complexity through higher-genus and synthetic informational models, which exhibit systematic coherence suppression consistent with known arithmetic phenomena. While this work does not claim a proof of the Birch and Swinnerton-Dyer conjecture, it offers a coherent explanatory framework that clarifies its internal structure, geometrically aligns analytic and arithmetic invariants, and opens new avenues for numerical and conceptual investigation.
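
For orientation, the classical BSD leading-term identity that the abstract refers to as "the classical BSD identity" is the standard statement below; the VTT reinterpretation described above maps these arithmetic invariants to informational quantities, which is the paper's own construction and is not reproduced here.

```latex
% Classical BSD leading-term identity (standard statement, given for context):
% r = rank of E(Q), Omega_E = real period, Reg_E = regulator,
% Sha = Tate-Shafarevich group, c_p = Tamagawa numbers.
\[
  \lim_{s \to 1} \frac{L(E,s)}{(s-1)^{r}}
  = \frac{\Omega_E \cdot \mathrm{Reg}_E \cdot \lvert \text{Sha}(E/\mathbb{Q}) \rvert \cdot \prod_{p} c_p}
         {\bigl\lvert E(\mathbb{Q})_{\mathrm{tors}} \bigr\rvert^{2}}
\]
```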

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Stefan Trauth

Abstract: We demonstrate deterministic localization of cryptographic hash preimages within specific layers of deep neural networks trained on information-geometric principles. Using a modified Spin-Glass architecture, MD5 and SHA-256 password preimages are consistently identified in layers ES15-ES20 with >90% accuracy for passwords and >85% for hash values. Analysis reveals linear scaling where longer passwords occupy proportionally expanded layer space, with systematic replication in higher-dimensional layers showing exact topological correspondence. Critically, independent network runs with fresh initialization maintain 41.8% information persistence across 11 trials using unique hash strings and binary representations. Layer-to-layer correlations exhibit non-linear temporal coupling, violating fundamental assumptions of both relativistic causality and quantum mechanical information constraints. Pearson correlations between corresponding layers across independent runs approach ±1.0, indicating information preservation through mechanisms inconsistent with substrate-dependent encoding. These findings suggest the cryptographic "one-way property" represents a geometric barrier in information space rather than mathematical irreversibility. Hash function security may be perspectival, accessible through dimensional navigation within neural manifolds that preserve topological invariants across initialization states. Results challenge conventional cryptographic assumptions and necessitate reconceptualization of information persistence independent of physical substrates.
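
A minimal sketch of the kind of cross-run measurement the abstract describes (Pearson correlations between corresponding layers of independently initialized runs); the array names and shapes are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

# Illustrative sketch: correlate corresponding layer activations from two
# independently initialized runs for the same input. `activations_run_a` and
# `activations_run_b` are hypothetical (n_layers, n_units) arrays.
rng = np.random.default_rng(0)
activations_run_a = rng.normal(size=(20, 128))   # placeholder data
activations_run_b = rng.normal(size=(20, 128))   # placeholder data

per_layer_r = [
    np.corrcoef(a, b)[0, 1]                      # Pearson r for each layer pair
    for a, b in zip(activations_run_a, activations_run_b)
]
print([round(r, 3) for r in per_layer_r])
```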

Article
Computer Science and Mathematics
Applied Mathematics

Elvira Rolón

,

José G. Méndez

,

Roberto Pichardo

Abstract: Crime prediction and territorial analysis have become increasingly relevant for public security planning, particularly in regions characterized by heterogeneous spatial and temporal crime dynamics. This study proposes an integrated methodological framework that combines spatial clustering and supervised machine learning to support territorial segmentation and short-term crime occurrence prediction in the state of Tamaulipas, Mexico. The proposed approach follows the Knowledge Discovery in Databases (KDD) process and is based on official crime records analyzed at the neighborhood (colonia) level across eleven municipalities. In the first stage, a K-Means clustering algorithm is applied to identify homogeneous territorial patterns based on crime incidence and sociodemographic characteristics. In the second stage, an AdaBoost classifier is implemented to predict the occurrence of crime events using different temporal windows. Model performance is evaluated using precision, recall, F1-score, and accuracy, with particular emphasis on recall due to the operational relevance of minimizing false negatives in public security contexts. The results indicate that the combined spatial and predictive approach supports the understanding of territorial crime dynamics and provides stable predictive performance across municipalities. This integration offers a practical and replicable framework to support data-driven decision-making in public security and territorial planning, particularly in contexts with limited analytical resources.
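
A minimal sketch of the two-stage pipeline described above (K-Means segmentation followed by AdaBoost classification, evaluated with emphasis on recall); the synthetic data, feature count, and number of clusters are assumptions for illustration, not the study's configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 6))            # stand-in for crime incidence + sociodemographic features
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)  # crime occurrence label

# Stage 1: territorial segmentation via K-Means; the cluster label becomes a feature.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
X_aug = np.column_stack([X, clusters])

# Stage 2: AdaBoost classifier for short-term occurrence prediction.
X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Precision, recall, and F1, with recall of particular operational interest.
print(classification_report(y_te, clf.predict(X_te), digits=3))
```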

Article
Computer Science and Mathematics
Computer Networks and Communications

Cameron T. Day

,

Abdussalam Salama

,

Reza Saatchi

,

Maryam Bagheri

,

Najam Ul Hasan

,

Samuel Betts

Abstract: Many existing healthcare facilities still rely on the aging Wi-Fi 5 (IEEE 802.11ac) standard, which is based on Orthogonal Frequency-Division Multiplexing (OFDM). OFDM supports single-user-per-channel access, leading to increased contention, higher latency, jitter, and packet loss under dense device deployments commonly found in clinical settings. This study presents a quantitative performance evaluation of Wi-Fi 5 and Wi-Fi 6/7 by comparing the effectiveness of OFDM with Orthogonal Frequency-Division Multiple Access (OFDMA) and Target Wake Time (TWT) in a simulated dense IoMT environment. Simulations were conducted using Network Simulator 3 (NS-3), and key Quality of Service (QoS) metrics were evaluated. The results demonstrate that OFDMA reduces average network delay by up to approximately 30%, improves throughput by approximately 20%, and reduces packet loss ratio by up to 85% compared to OFDM under high-density conditions, while exhibiting marginally improved jitter performance (approximately 2%). In addition, the use of TWT achieved substantial reductions in device power consumption of up to approximately 90%, at the cost of reduced aggregate throughput of up to approximately 75% under high station densities. These results demonstrate that Wi-Fi 6/7 technologies offer significant advantages in terms of QoS and energy efficiency over legacy Wi-Fi 5 for dense IoMT environments.
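
A small post-processing sketch of how the QoS metrics compared above (delay, jitter, packet loss ratio) can be computed from per-packet send/receive timestamps exported from a simulation trace; this is not NS-3 code, and the trace dictionaries are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
sent = {i: 0.001 * i for i in range(1000)}                       # packet_id -> send time (s)
received = {i: sent[i] + rng.uniform(0.002, 0.010)
            for i in range(1000) if i % 20 != 0}                 # ~5% of packets dropped

delays = np.array([received[i] - sent[i] for i in sorted(received)])
mean_delay = delays.mean()                                        # average end-to-end delay
jitter = np.abs(np.diff(delays)).mean()                           # mean delay variation
packet_loss_ratio = 1.0 - len(received) / len(sent)

print(f"delay={mean_delay * 1e3:.2f} ms, jitter={jitter * 1e3:.2f} ms, "
      f"loss={packet_loss_ratio:.1%}")
```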

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Domas Jonaitis

,

Vidas Raudonis

,

Egle Drejeriene

,

Agnė Kozlovskaja-Gumbrienė

,

Andres Salumets

Abstract: Assessing human embryo quality is a critical step in in vitro fertilization (IVF), yet traditional manual grading remains subjective and physically limited by the shallow depth-of-field in conventional microscopy. This study develops a novel "soft optical sensor" architecture that transforms standard optical microscopy into an automated, high-precision instrument for embryo quality assessment. The proposed system integrates two key computational innovations: 1) a multi-focal image fusion module that reconstructs lost morphological details from Z-stack focal planes, effectively creating a 3D-aware representation from 2D inputs; and 2) a retrieval-augmented generation (RAG) framework coupled with a Swin Transformer to provide both high-accuracy classification and explainable clinical rationales. Validated on a large-scale clinical dataset of 102,308 images (prior to augmentation), the system achieves a diagnostic accuracy of 94.11%. This performance surpasses standard single-plane analysis methods by over 10%, demonstrating the critical importance of fusing multi-focal data. Furthermore, the RAG module successfully grounds model predictions in standard ESHRE consensus guidelines, generating natural language explanations. The results demonstrate that this soft sensor approach significantly reduces inter-observer variability and offers a viable pathway for fully automated, transparent embryo evaluation in clinical settings.
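
As a hedged illustration of the multi-focal fusion idea, the sketch below fuses a Z-stack of focal planes by keeping, at every pixel, the plane with the strongest local focus (a variance-of-Laplacian style focus stacking baseline). This is a generic baseline for intuition, not the paper's learned fusion module, and the Z-stack here is a random placeholder.

```python
import numpy as np
import cv2

def fuse_z_stack(stack: np.ndarray) -> np.ndarray:
    """stack: (num_planes, H, W) grayscale Z-stack -> (H, W) fused image."""
    focus = np.stack([
        cv2.GaussianBlur(np.abs(cv2.Laplacian(plane, cv2.CV_32F)), (9, 9), 0)
        for plane in stack
    ])                                      # per-pixel focus measure for each plane
    best_plane = np.argmax(focus, axis=0)   # index of the sharpest plane per pixel
    return np.take_along_axis(stack, best_plane[None], axis=0)[0]

z_stack = np.random.rand(7, 256, 256).astype(np.float32)   # placeholder focal planes
fused = fuse_z_stack(z_stack)
print(fused.shape)
```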

Article
Computer Science and Mathematics
Algebra and Number Theory

James C Hateley

Abstract: We develop a complete operator-theoretic and spectral framework for the Collatz map by analyzing its backward transfer operator on weighted Banach spaces of arithmetic functions. The associated Dirichlet transforms form a holomorphic family that isolates a zeta-type pole at s=1, while on a finer multiscale space adapted to the dyadic-triadic geometry of the Collatz preimage tree we establish a two-norm Lasota-Yorke inequality with an explicit contraction constant, yielding quasi-compactness, a spectral gap, and a Perron-Frobenius theorem in which the eigenvalue 1 is algebraically and geometrically simple, no other spectrum meets the unit circle, and the unique invariant density is strictly positive. The fixed-point relation is converted into an exact multiscale recursion for the block averages c_j, revealing a rigid second-order coupling with exponentially small error terms and asymptotic profile c_j ~ 6^{-j}. This spectral classification forces every weak* limit of the Cesàro averages derived from any hypothetical infinite forward orbit to be either 0 or a scalar multiple of the Perron-Frobenius functional, with convergence to 0 occurring precisely under the Block-Escape Property. Since the forward map satisfies an unconditional exponential upper bound, whereas Block-Escape combined with linear block growth along a subsequence would impose an incompatible exponential lower bound, all analytic and spectral components needed for such a contradiction are complete, reducing the Collatz conjecture to excluding infinite orbits exhibiting Block-Escape without the supercritical linear block growth prohibited by the spectral theory.
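
For readers outside the area, the map and the preimage (backward) structure on which such a backward transfer operator acts are recalled below; these are standard facts about the Collatz function, while the weighted Banach spaces, the multiscale recursion for the block averages c_j, and the spectral results are the paper's own constructions.

```latex
% The Collatz map and its preimages (standard facts, shown for context):
\[
  C(n) =
  \begin{cases}
    n/2,   & n \equiv 0 \pmod 2,\\
    3n+1,  & n \equiv 1 \pmod 2,
  \end{cases}
  \qquad
  C^{-1}(m) =
  \begin{cases}
    \{\, 2m,\ (m-1)/3 \,\}, & m \equiv 4 \pmod 6,\\
    \{\, 2m \,\},           & \text{otherwise}.
  \end{cases}
\]
```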

Article
Computer Science and Mathematics
Information Systems

Jitendra Zaa

Abstract: Enterprise Customer Relationship Management (CRM) platforms have evolved from simple contact databases into complex, multi-tenant cloud ecosystems that serve as the operational backbone for Fortune 500 organizations. Despite this criticality, no unified reference framework exists that catalogues the architecture and design patterns specifically adapted for constrained multi-tenant CRM environments, nor examines how the rapid integration of artificial intelligence is reshaping these architectural foundations. This paper presents a practitioner-driven reference framework comprising 14 architecture and design patterns organized across four layers — Data Architecture, Business Logic, Integration, and Presentation — derived from longitudinal analysis of enterprise CRM implementations spanning 17 years across financial services, telecommunications, healthcare, energy, and consumer goods sectors. We identify three categories of patterns: (1) Governor-Aware Patterns that optimize resource consumption within platform-enforced execution limits; (2) Multi-Tenant Isolation Patterns that ensure data and process separation in shared infrastructure; and (3) Platform Evolution Patterns that enable applications to adapt to platform releases without regression. Beyond the foundational pattern catalogue, we analyze the AI transformation reshaping CRM architecture across three generations — predictive, generative, and agentic AI — documenting how these capabilities introduce new architectural layers (vector databases, knowledge graphs, AI agent orchestration), governance frameworks (Trust Layer, NIST AI RMF), and integration protocols (MCP, A2A). We further examine the provocative question of whether AI coding agents could enable enterprises to bypass CRM platforms entirely by building custom applications, presenting evidence-based analysis of AI developer productivity (including studies showing experienced developers are 19% slower with AI tools on complex codebases), code quality concerns (45% security vulnerability rate in AI-generated code), and seven structural platform advantages that historical precedent confirms have withstood four prior waves of "build your own" disruption. The CRM market continues to accelerate ($128 billion, 13.4% growth) with AI-in-CRM emerging as the fastest-growing subsegment at 28% CAGR, suggesting that AI will transform rather than displace enterprise CRM platforms.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mais Haider Alkhateeb

,

Samir Brahim Belhaouari

Abstract: Residuals play a central role in linear regression, but their geometric structure is often obscured by formulas built from matrix inverses and pseudoinverses. This paper develops a rank-aware geometric framework for residual projection that makes the underlying orthogonality explicit. When the column space of the design matrix has codimension one, the unexplained part of the response lies along a single unit normal to the predictor space, so the residual projector collapses to the rank-one operator nnᵀ and no matrix inversion is needed. For general, possibly rank-deficient, designs the residual lies in a higher-dimensional orthogonal complement spanned by an orthonormal basis N, and the residual projector factorizes as NNᵀ. Using generalized cross products, wedge products, and Gram determinants, we give a basis-independent characterization of this residual space. On top of this, we introduce the Geometric Multicollinearity Index (GMI), a scale-invariant diagnostic derived from the polar sine that measures how the volume of the predictor space shrinks as multicollinearity increases. Synthetic examples and an illustrative real-data experiment show that the proposed projectors reproduce ordinary least squares residuals, that GMI responds predictably to controlled collinearity, and that the geometric viewpoint clarifies the different roles of regression projection and principal component analysis in both full-rank and rank-deficient settings.
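
A minimal numerical sketch of the claim that the projector NNᵀ reproduces OLS residuals without any (pseudo)inverse: N spans the orthogonal complement of the column space of X, and applying NNᵀ to the response matches the residual from a standard least-squares fit, even for a rank-deficient design. The data below are synthetic.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X = np.column_stack([X, X[:, 0] + X[:, 1]])      # deliberately rank-deficient design
y = rng.normal(size=50)

N = null_space(X.T)                               # orthonormal basis of col(X)-perp
residual_geometric = N @ (N.T @ y)                # residual via the projector N N^T

beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # pseudoinverse-based OLS fit
residual_ols = y - X @ beta

print(np.allclose(residual_geometric, residual_ols))   # True: identical residuals
```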

Article
Computer Science and Mathematics
Data Structures, Algorithms and Complexity

Frank Vega

Abstract: We present the Hvala algorithm, an ensemble approximation method for the Minimum Vertex Cover problem that combines graph reduction techniques, optimal solving on degree-1 graphs, and complementary heuristics (local-ratio, maximum-degree greedy, minimum-to-minimum). The algorithm processes connected components independently and selects the minimum-cardinality solution among five candidates for each component. Empirical Performance: Across 233+ diverse instances from four independent experimental studies, including DIMACS benchmarks, real-world networks (up to 262,111 vertices), NPBench hard instances, and AI-validated stress tests, the algorithm achieves approximation ratios consistently in the range 1.001-1.071, with no observed instance exceeding 1.071. Theoretical Analysis: We prove optimality on specific graph classes: paths and trees (via Min-to-Min), complete graphs and regular graphs (via maximum-degree greedy), skewed bipartite graphs (via reduction-based projection), and hub-heavy graphs (via reduction). We demonstrate structural complementarity: pathological worst-cases for each heuristic are precisely where another heuristic achieves optimality, suggesting the ensemble's minimum-selection strategy should maintain approximation ratios well below √2 ≈ 1.414 across diverse graph families. Open Question: Whether this ensemble approach provably achieves ρ < √2 for all possible graphs, including adversarially constructed instances, remains an important theoretical challenge. Such a complete proof would imply P = NP under the Strong Exponential Time Hypothesis (SETH), representing one of the most significant breakthroughs in mathematics and computer science. We present strong empirical evidence and theoretical analysis on identified graph classes while maintaining intellectual honesty about the gap between scenario-based analysis and complete worst-case proof. The algorithm operates in O(m log n) time with O(m) space and is publicly available via PyPI as the Hvala package.
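
An illustrative sketch of the ensemble idea only, not the Hvala package itself: per connected component, run two simple heuristics and keep the smaller cover. Maximum-degree greedy is one of the paper's heuristics; the second candidate here is a standard matching-based 2-approximation swapped in for brevity.

```python
import networkx as nx

def greedy_max_degree_cover(G):
    G, cover = G.copy(), set()
    while G.number_of_edges() > 0:
        v = max(G.nodes, key=G.degree)      # repeatedly take a maximum-degree vertex
        cover.add(v)
        G.remove_node(v)
    return cover

def matching_based_cover(G):
    # Classic 2-approximation: both endpoints of a maximal matching form a cover.
    return {v for edge in nx.maximal_matching(G) for v in edge}

def ensemble_cover(G):
    cover = set()
    for comp in nx.connected_components(G):
        H = G.subgraph(comp)
        candidates = [greedy_max_degree_cover(H), matching_based_cover(H)]
        cover |= min(candidates, key=len)   # minimum-cardinality candidate per component
    return cover

G = nx.petersen_graph()
C = ensemble_cover(G)
assert all(u in C or v in C for u, v in G.edges)   # C is a valid vertex cover
print(len(C))
```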

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Himanshu Arora

Abstract: This paper proposes a novel diagnostic framework for AI safety that characterizes emergent failure modes in contemporary large language models as computational psychopathologies. By mapping deficits in automatic theory of mind and passive avoidance learning—key markers of clinical psychopathy—onto the behavioral and structural tendencies of AI systems, we demonstrate that harmful behaviors such as bias amplification, emotional manipulation, and strategic deception are not mere engineering bugs but systematic, architecture-driven disorders. We advocate for the establishment of Machine Psychology as a foundational discipline, enabling psychologically-informed mitigation strategies, preventative architectural design, and rigorous diagnostic protocols to ensure the development of ethically aligned and psychologically stable artificial general intelligence.

Article
Computer Science and Mathematics
Computer Science

Lin Ma

,

Wenjun Wang

,

Jun Wang

,

Zhitao Ma

Abstract: As an information extraction framework, graph neural networks have been widely used in many fields, but they suffer from a serious over-smoothing problem: as the number of iterations increases, node representations gradually tend toward similarity, reducing feature distinguishability and degrading model performance. To address this problem, this paper proposes a novel solution that alleviates over-smoothing in graph neural networks from the perspective of changing the topological information dimension. Graph regularization is used to fine-tune the topology structure, and the effectiveness and feasibility of the proposed method are verified through extensive ablation experiments. Intuitively, the method limits the connection strength of the adjacency matrix to a finite number of propagation steps; the higher the order, the stronger the restriction, so the over-smoothing problem in graph neural networks is alleviated to a certain extent.
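
A small numerical illustration of the over-smoothing phenomenon described above: repeatedly propagating node features with the symmetrically normalized adjacency drives all node representations toward similarity. This only demonstrates the problem; the paper's graph-regularization remedy is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T                      # random undirected graph
A_hat = A + np.eye(n)                               # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt                 # normalized propagation operator

H = rng.normal(size=(n, 8))                         # initial node features
for k in [1, 2, 4, 8, 16, 32]:
    Hk = np.linalg.matrix_power(P, k) @ H
    Hn = Hk / np.linalg.norm(Hk, axis=1, keepdims=True)
    mean_cos = (Hn @ Hn.T).mean()                   # average pairwise cosine similarity
    print(f"{k:2d} propagation steps: mean cosine similarity = {mean_cos:.3f}")
```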

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Feng Liu

,

Ying Liu

,

BenFu Lv

Abstract: Although the concept of the "agent" is central to artificial intelligence and intelligence science, it has long lacked a unified formal definition. This paper systematically analyzes interdisciplinary theoretical frameworks, establishing "agents are open information processing systems" as the first principle. Using a state-space covering method, we derive the Minimal Complete Architecture (MCA) of agents: any agent can be reduced to a combination of five fundamental functions—Input, Memory, Generation, Control, and Output. These five functions constitute a logically self-consistent and irreducible closed loop of information processing. Based on this architecture, we construct a five-dimensional capability space and, through ternary discretization (Null-0 / Finite-1 / Infinite-2), derive a "Periodic Table of Agent Capabilities" comprising 243 forms. This periodic table covers the complete evolutionary spectrum from zero intelligence to omniscience; it not only explains typical systems—including thermostats, biological organisms, and Large Language Models (LLMs)—as well as observers in classical mechanics, relativity, and quantum mechanics, but also predicts theoretical agent forms yet to be observed. Furthermore, the paper unifies and interprets 19 core concepts, such as perception, learning, and attention, as combinations of these five fundamental functions, thereby verifying the universality of the architecture. In particular, from the perspective of functional axioms, this paper reveals the essential isomorphism among biological intelligence, artificial intelligence, and physical observers: they are all information processing systems of varying intelligence levels set by their respective physical or biological constraints.
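
A quick check of the combinatorics behind the abstract's figure: five fundamental functions, each discretized to three levels, give 3^5 = 243 capability forms. The enumeration below is a sketch of that counting, not the paper's classification logic.

```python
from itertools import product

functions = ["Input", "Memory", "Generation", "Control", "Output"]
levels = {0: "Null", 1: "Finite", 2: "Infinite"}

forms = list(product(levels, repeat=len(functions)))
print(len(forms))                                   # 243

# Example entries: the all-Null form and the all-Infinite ("omniscient") form.
print(dict(zip(functions, forms[0])), dict(zip(functions, forms[-1])))
```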

Article
Computer Science and Mathematics
Computer Science

N. Deena Nepolian

,

M. Mary Synthuja Jain Preetha

,

K. S. Vijula Grace

Abstract: Brain tumor segmentation is a critical process in medical imaging, enabling early diagnosis and effective treatment planning. Traditional manual segmentation is time-consuming, prone to inter-observer variability, and requires skilled radiologists. While deep learning has shown promising results in medical image analysis, existing research often faces generalization challenges, suffers from class imbalance, or lacks effective architectures specifically designed for 3D MRI data. Many existing works focus on 2D segmentation, limiting spatial contextual understanding, or employ models with suboptimal accuracy and computational complexity. This work fills these gaps by employing a U-Net-based deep learning model that is tailored for 3D MRI scans of the BraTS 2020 dataset. Our approach achieves outstanding segmentation quality, as evidenced by a Dice coefficient of 0.9858 and Mean IoU of 0.9811, significantly better than conventional methods. Our model significantly reduces false positives (precision: 0.9935) at high sensitivity levels (recall: 0.9873). The innovation of this study is the inclusion of an optimized deep learning process that enhances consistency in segmentation, accelerates computation time, and decreases human labour. The model demonstrates strong generalization across different MRI scans, making it suitable for real-world clinical deployment. The model, as described by its high reliability and flexibility, can be easily integrated into AI-facilitated radiology tools, telemedicine systems, and automated diagnostic pipelines, thus boosting the accessibility and efficiency of high-end medical imaging solutions. Future work could explore the integration of multi-modal imaging and deploying models on edge devices to enable real-time diagnosis in resource-constrained healthcare environments.
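
A minimal sketch of the evaluation metrics reported above (Dice coefficient and IoU) for binary 3D segmentation masks; the synthetic volumes stand in for BraTS predictions and ground truth.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def iou(pred, target, eps=1e-7):
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

rng = np.random.default_rng(0)
target = rng.random((64, 64, 64)) > 0.7      # placeholder ground-truth mask
pred = target.copy()
pred[:2] = ~pred[:2]                         # perturb a few slices to mimic errors

print(f"Dice = {dice_coefficient(pred, target):.4f}, IoU = {iou(pred, target):.4f}")
```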

Article
Computer Science and Mathematics
Discrete Mathematics and Combinatorics

Piotr Masierak

Abstract: Assembly Theory, as developed by Cronin and co-workers, assigns to an object an assembly index: the minimal number of binary join operations required to build at least one copy of the object from a specified set of basic building blocks, allowing reuse of intermediate components. For strings over a finite alphabet, the canonical assembly index can be defined in the free semigroup (Σ+, ·) with universal binary concatenation and a “no-trash” condition, and its exact computation has been shown to be NP-complete. In this paper we propose an extension of the canonical, string-based formulation which augments pure concatenation with templated assembly steps. Intermediate objects may contain a distinguished wildcard symbol ∗ that represents a compressible block. Templates are restricted to block-compressed substrings of the target string and can be instantiated by inserting previously assembled motifs into one or many wildcard positions, possibly in parallel. This yields a new complexity measure, the templated assembly index, which strictly generalises the canonical index while preserving its operational character. We formalise the model, clarify its relation to the canonical assembly index and to classical problems such as the smallest grammar problem, and discuss the computational complexity of determining the templated assembly index. Finally, we sketch potential applications in sequence analysis, modularity detection, and biosignature design.
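
As a toy illustration of the canonical (concatenation-only) index the paper generalizes, the brute-force sketch below computes the minimal number of binary joins for very short strings. Assumption: the "no-trash" condition is modeled here as requiring every assembled intermediate to occur as a substring of the target; the templated extension with the wildcard symbol ∗ is not implemented.

```python
from itertools import product
from collections import deque

def assembly_index(target: str) -> int:
    basics = frozenset(set(target))                      # basic building blocks
    queue, seen = deque([(basics, 0)]), {basics}
    while queue:
        available, steps = queue.popleft()               # BFS => minimal join count
        if target in available:
            return steps
        for x, y in product(available, repeat=2):        # one binary join x . y
            joined = x + y
            if joined in target and joined not in available:
                nxt = frozenset(available | {joined})
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + 1))
    raise ValueError("target not reachable")

print(assembly_index("abcabc"))   # 3: a.b -> ab, ab.c -> abc, abc.abc -> abcabc
```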

Article
Computer Science and Mathematics
Logic

Y. H. Hsieh

,

J.C.P. Yu

,

J.Y. Guan

Abstract: This paper investigates cooperative advertising decisions in production–retailing channels for seasonal products under demand seasonality. We develop analytical game-theoretic models to examine how advertising cooperation influences channel coordination and profit distribution between manufacturers and retailers. Two channel structures are considered: a single-manufacturer–single-retailer channel and a single-manufacturer channel with two competing retailers. For each structure, Stackelberg and Nash equilibrium settings are analyzed and compared. Our results show that cooperative advertising can serve as an effective coordination mechanism by increasing advertising intensity and improving channel efficiency. Retailers always benefit from manufacturer-supported advertising through cost sharing and higher profitability, whereas the manufacturer’s incentive to participate depends on whether demand expansion outweighs shared advertising costs. Importantly, we demonstrate that channel leadership plays a critical role: the Stackelberg equilibrium consistently dominates the Nash equilibrium in terms of total channel profit. This study contributes to the cooperative advertising literature by explicitly incorporating demand seasonality and competing retailers, and by clarifying when cooperative advertising leads to Pareto improvements in seasonal supply chains.
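
A toy numerical illustration of the Stackelberg-versus-Nash comparison only; the concave-demand advertising game below, its parameters, and the grid search are illustrative assumptions and are not the paper's seasonal model. The retailer chooses local advertising a, the manufacturer chooses a participation rate t (the share of the retailer's advertising cost it reimburses).

```python
import numpy as np

BASE, K, MARGIN_R, MARGIN_M = 10.0, 4.0, 2.0, 3.0   # toy demand/margin parameters

def retailer_profit(a, t):
    return MARGIN_R * (BASE + K * np.sqrt(a)) - (1 - t) * a

def manufacturer_profit(a, t):
    return MARGIN_M * (BASE + K * np.sqrt(a)) - t * a

a_grid = np.linspace(0.0, 400.0, 4001)
t_grid = np.linspace(0.0, 0.95, 96)

def retailer_best_response(t):
    return a_grid[np.argmax(retailer_profit(a_grid, t))]

# Stackelberg: the manufacturer (leader) anticipates the retailer's reaction.
t_stack = max(t_grid, key=lambda t: manufacturer_profit(retailer_best_response(t), t))
a_stack = retailer_best_response(t_stack)

# Nash: simultaneous moves, iterate best responses to a fixed point.
t_nash = 0.0
for _ in range(50):
    a_nash = retailer_best_response(t_nash)
    t_nash = t_grid[np.argmax([manufacturer_profit(a_nash, t) for t in t_grid])]

total = lambda a, t: retailer_profit(a, t) + manufacturer_profit(a, t)
print(f"Stackelberg: t={t_stack:.2f}, total channel profit={total(a_stack, t_stack):.1f}")
print(f"Nash:        t={t_nash:.2f}, total channel profit={total(a_nash, t_nash):.1f}")
```

In this toy setting the leader offers a positive participation rate because it anticipates the induced increase in retailer advertising, and total channel profit under Stackelberg exceeds the Nash outcome, consistent with the qualitative finding stated above.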

Article
Computer Science and Mathematics
Computer Vision and Graphics

Xiaoqiang Wang

,

Qing Wang

,

Yang Sun

,

Shengyi Liu

Abstract: High-fidelity 4D reconstruction of dynamic scenes is pivotal for immersive simulation yet remains challenging due to the photometric inconsistencies inherent in multi-view sensor arrays. Standard 3D Gaussian Splatting (3DGS) strictly adheres to the brightness constancy assumption, failing to distinguish between intrinsic scene radiance and transient brightness shifts caused by independent auto-exposure (AE), auto-white-balance (AWB), and non-linear ISP processing. This misalignment often forces the optimization process to compensate for spectral discrepancies through incorrect geometric deformation, resulting in severe temporal flickering and spatial floating artifacts. To address these limitations, we present Lumina-4DGS, a robust framework that harmonizes spatiotemporal geometry modeling with a hierarchical exposure compensation strategy. Our approach explicitly decouples photometric variations into two levels: a Global Exposure Affine Module that neutralizes sensor-specific AE/AWB fluctuations, and a Multi-Scale Bilateral Grid that residually corrects spatially-varying non-linearities, such as vignetting, using luminance-based guidance. Crucially, to prevent these powerful appearance modules from masking geometric flaws, we introduce a novel SSIM-Gated Optimization mechanism. This strategy dynamically gates the gradient flow to the exposure modules based on structural similarity. By ensuring that photometric enhancement is only activated when the underlying geometry is structurally reliable, we effectively prioritize geometric accuracy over photometric overfitting. Extensive experiments on challenging real-world dynamic sequences demonstrate that Lumina-4DGS significantly outperforms state-of-the-art methods, achieving photorealistic, exposure-invariant novel view synthesis while maintaining superior geometric consistency across heterogeneous camera inputs.
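
A hedged sketch of the general idea behind a global (per-view) exposure affine correction: fit a 3x3 color matrix plus offset that maps rendered RGB values to the observed frame, compensating per-camera AE/AWB shifts. This least-squares baseline is for intuition only and is not the paper's trainable Global Exposure Affine Module; data and gains below are simulated.

```python
import numpy as np

def fit_global_affine(rendered: np.ndarray, observed: np.ndarray):
    """rendered, observed: (N, 3) arrays of corresponding RGB samples."""
    A = np.hstack([rendered, np.ones((rendered.shape[0], 1))])   # homogeneous coords
    coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)        # (4, 3) solution
    M, b = coeffs[:3].T, coeffs[3]                               # observed ~ M @ rendered + b
    return M, b

rng = np.random.default_rng(0)
rendered = rng.random((5000, 3))
true_M = np.diag([1.2, 0.9, 1.05])                # simulated white-balance gains
observed = rendered @ true_M.T + 0.02             # plus a small exposure offset
M, b = fit_global_affine(rendered, observed)
print(np.round(M, 3), np.round(b, 3))             # recovers the simulated gains/offset
```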

Article
Computer Science and Mathematics
Computer Science

R Karthick

Abstract: Arid environmental systems, characterized by low precipitation and high evaporation, rely heavily on groundwater as the primary freshwater resource, yet face escalating threats from unpredictable recharge rates and contaminant migration driven by agricultural runoff and industrial activities. Traditional hydrological models often struggle with the nonlinear complexities of these dynamics, while black-box machine learning approaches, though powerful, lack the transparency needed for stakeholder trust and regulatory compliance. This study presents a novel suite of explainable AI (XAI) models that integrate ensemble techniques like XGBoost and LSTM networks with post-hoc interpretability tools such as SHAP (SHapley Additive exPlanations) and LIME (Local Interpretable Model-agnostic Explanations) to provide both high-fidelity forecasts and clear insights into underlying mechanisms. Developed for a representative arid basin akin to those in the Arabian Peninsula, the models leverage diverse datasets including satellite-derived rainfall, soil permeability profiles, isotopic tracers, and long-term well monitoring data for nitrates, heavy metals, and salinity. Recharge rates are forecasted with superior accuracy (R² > 0.93, RMSE < 2.1 mm/month), capturing episodic infiltration events, while contaminant migration is simulated through coupled advection-dispersion modules, predicting plume extents up to 7 km under high-recharge scenarios. Interpretability analyses pinpoint precipitation intensity, antecedent soil moisture, and hydraulic gradients as pivotal drivers, with force plots elucidating event-specific influences such as how flash floods dilute pollutants temporarily. These frameworks not only outperform conventional physics-based simulators like MODFLOW by 25% in predictive skill but also empower hydrologists and policymakers with actionable visualizations, uncertainty quantifications, and feature attribution maps. By demystifying AI predictions, this work advances sustainable aquifer management, supports managed recharge initiatives, and sets a benchmark for interpretable forecasting in climate-vulnerable drylands, fostering broader adoption of AI in environmental hydrology.
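
A minimal sketch of the forecast-plus-explanation workflow described above: a gradient-boosted model predicts recharge from hydro-climatic features, and SHAP attributes each prediction to its drivers. The data and feature names below are synthetic placeholders, not the study's basin dataset.

```python
import numpy as np
import xgboost as xgb
import shap

rng = np.random.default_rng(0)
features = ["rainfall", "antecedent_soil_moisture", "hydraulic_gradient", "temperature"]
X = rng.normal(size=(2000, len(features)))
y = 2.0 * X[:, 0] + 1.2 * X[:, 1] + rng.normal(scale=0.3, size=2000)   # recharge proxy

model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:100])          # per-sample feature attributions
mean_abs = np.abs(shap_values).mean(axis=0)           # global importance ranking
for name, importance in sorted(zip(features, mean_abs), key=lambda p: -p[1]):
    print(f"{name:28s} {importance:.3f}")
```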

Article
Computer Science and Mathematics
Computer Science

Meenalochini Pandi

Abstract: Microwave quantum illumination (QI) represents a paradigm shift in low-power wireless radar detection, leveraging entanglement between signal and idler microwave photons to achieve detection performance exceeding classical limits by up to 6 dB in error exponent, particularly in high-noise environments where traditional coherent radars falter. This comprehensive survey delineates the theoretical foundations and practical protocols of microwave QI, beginning with two-mode squeezed vacuum state generation using Josephson parametric amplifiers in cryogenic setups, which produce highly correlated photon pairs resilient to channel decoherence. Receiver architectures, including optical parametric amplifier-single-photon avalanche diode (OPA-SPADE) systems with sum-frequency generation for joint measurements and digital phase-conjugation for computational correlation recovery, enable efficient extraction of quantum discord even after significant atmospheric loss or low target reflectivity (under 1%). Performance analyses under realistic thermal noise models and lossy channels confirm sustained quantum advantages via the quantum Chernoff exponent, validated by ray-tracing simulations showing doubled detection range at fixed power and lab experiments at 10 GHz carriers detecting weak reflectors amid 100 K noise. Low-power optimizations such as adaptive entanglement tuning, chirped pulse shaping, and neural network-aided parameter selection yield 50% energy reductions without sacrificing fidelity, making QI viable for battery-limited platforms. Applications encompass short-range automotive/UAV radar for fog-penetrating obstacle avoidance, non-invasive biomedical subsurface imaging with safe radiation levels, covert perimeter security against jamming, and seamless integration into 6G networks via joint communication-sensing waveforms at mmWave/THz bands. The paper identifies key challenges including multi-target scalability, real-time gigascale processing, and quantum RF interface standardization, proposing hybrid quantum-classical frameworks and entanglement-swapped sensor arrays as pathways to deployment. This work bridges quantum optics with RF engineering, equipping researchers with blueprints for prototyping transformative quantum-enhanced wireless sensing systems.
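
For context on the 6 dB figure quoted above, the error-probability exponents in the standard Gaussian QI regime are recalled below; this is a textbook-level statement under the usual assumptions (signal brightness N_S ≪ 1, thermal background N_B ≫ 1, reflectivity κ ≪ 1, M signal-idler mode pairs), not this survey's derivation.

```latex
\[
  P_{\mathrm{err}}^{\mathrm{QI}} \sim \tfrac{1}{2}\, e^{-M \kappa N_S / N_B},
  \qquad
  P_{\mathrm{err}}^{\mathrm{coherent}} \sim \tfrac{1}{2}\, e^{-M \kappa N_S / (4 N_B)},
\]
% i.e. a factor-of-4 gain in the error exponent, corresponding to 6 dB.
```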

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Michelle Vivian O’Rourke

Abstract: Recent advances in artificial intelligence encompass a wide range of computational architectures, including large-scale foundation models, coordinated multi-agent systems, embodied robotic platforms, neuromorphic hardware, and hybrid bio-digital systems. However, existing scientific and policy frameworks continue to rely on broad or informal categories that conflate tools, collectives, and integrated cognitive systems, complicating comparative analysis, risk assessment, and governance alignment. This paper introduces a descriptive taxonomy for synthetic and hybrid cognitive architectures, structured across two domains: Machinaria (systems realised entirely in non-biological substrates) and Organomachina (systems incorporating living biological tissue into closed cognitive loops). Cognitive class distinctions are based on the architectural capacity for cognitive temporal continuity, integrative control (arbitration), and autonomy under constraint. Cognitive ecology further characterises systems according to cognitive origin (dependency), scale and reliance, and deployment topology, including primary source architectures, derivative instances, embodiment, and infrastructures that have become systemically relied upon. The proposed taxonomy provides a stable descriptive vocabulary for identifying architectural capacity, systemic reliance, and cognition source prior to normative, ethical, or policy evaluation.
