Computer Science and Mathematics

Article
Computer Science and Mathematics
Computer Science

Yuxia Qian, Yiwen Liang, Lei Shang, Xinqi Dong, Yincheng Liang

Abstract: Network access control and identity verification establish a secure foundation for trust between communicating entities, but successful identity authentication alone does not guarantee secure communication. In open-network environments, it remains essential to establish a secure session key via a robust key agreement mechanism—one that prevents explicit disclosure of identity information while ensuring post-quantum security. To address these requirements, we propose a lattice-based key agreement protocol. The protocol integrates identity binding, implicit authentication, and session key establishment into a single ciphertext exchange. Furthermore, it supports secure key evolution and revocation verification through a version-control mechanism and a blockchain-maintained revocation list—thus realizing a comprehensive, post-quantum-secure key agreement scheme under reasonable computational and communication overhead.
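The single-ciphertext exchange described above rests on lattice-style noisy agreement. As an illustrative sketch only (toy parameters far below any real security level, no reconciliation, no authentication, and not the paper's actual protocol), the following Python shows why two parties end up with near-identical shared values whose gap is bounded by the noise:

```python
import random

random.seed(7)
q, n = 8192, 8                      # toy modulus and dimension (illustrative only)

def small_vec():
    # Small secret/error coefficients, as in LWE-style schemes.
    return [random.choice([-1, 0, 1]) for _ in range(n)]

a = [random.randrange(q) for _ in range(n)]   # public parameter shared by both sides

sA, eA = small_vec(), small_vec()             # Alice's secret and error
sB, eB = small_vec(), small_vec()             # Bob's secret and error

# Each side publishes a noisy product of the public vector with its secret.
pA = [(a[i] * sA[i] + eA[i]) % q for i in range(n)]
pB = [(a[i] * sB[i] + eB[i]) % q for i in range(n)]

# Both sides compute near-identical values; the dominant term a*sA*sB is common.
kA = sum(pB[i] * sA[i] for i in range(n)) % q
kB = sum(pA[i] * sB[i] for i in range(n)) % q

gap = min((kA - kB) % q, (kB - kA) % q)       # centered gap, provably at most 2*n
print(gap)
```

Real protocols close this small gap with a reconciliation step so that both sides derive an identical session key.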

Article
Computer Science and Mathematics
Computer Science

Songtao Hu, Liang Chen, Qianyue Zhang, Wenchao Liu

Abstract: The Automatic Identification System (AIS) generates massive volumes of real-world ship trajectory data, providing a critical foundation for maritime ship type classification. However, existing methods often struggle to simultaneously capture long-range temporal dependencies, maintain computational efficiency, and ensure model interpretability, which makes accurate multi-class classification challenging in real-world maritime environments. To address these limitations, this study proposes a robust and efficient hybrid framework. The proposed architecture integrates a Feature Transformer module for deep temporal feature extraction with a LightGBM model for efficient ensemble classification. Specifically, the multi-head self-attention mechanism within the Feature Transformer captures long-range dependencies in preprocessed AIS sequences to generate compact trajectory fingerprints. These deep temporal representations are then concatenated with carefully designed statistical and kinematic tabular features and fed into the LightGBM classifier for final ship type identification. To validate the proposed framework, we construct a comprehensive real-world AIS dataset consisting of 2,196 trajectories collected between 2019 and 2023, encompassing diverse ship types that reflect authentic maritime scenarios. Experimental results show that the proposed method achieves 82.42% overall accuracy and 77.35% Macro-F1, significantly outperforming comparative baseline models, including LSTM (64.85% accuracy), GRU (64.85%), vanilla Transformer (61.21%), and standalone LightGBM (59.09%). Furthermore, the hybrid model offers ultra-fast inference (1.58 ms per batch) and enhanced interpretability through SHAP-based analysis, making it highly suitable for near real-time maritime traffic monitoring and decision-support applications.
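The fusion step above amounts to concatenating the Transformer's trajectory fingerprint with hand-crafted tabular features before the gradient-boosted classifier. The sketch below is illustrative only: the embedding values, feature choices, and dimensions are invented, not taken from the paper.

```python
import math

def kinematic_features(speeds, courses):
    """Hand-crafted tabular features from an AIS trajectory (toy subset)."""
    n = len(speeds)
    mean_sog = sum(speeds) / n
    std_sog = math.sqrt(sum((s - mean_sog) ** 2 for s in speeds) / n)
    # Total absolute course change: a rough proxy for manoeuvring behaviour.
    turn = sum(abs(courses[i] - courses[i - 1]) for i in range(1, n))
    return [mean_sog, std_sog, turn]

# Stand-in for the Feature Transformer's compact fingerprint (assumed 4-dim here).
deep_embedding = [0.12, -0.53, 0.88, 0.07]

speeds = [11.2, 11.5, 11.1, 10.9, 11.3]        # SOG in knots (synthetic)
courses = [87.0, 88.5, 90.0, 91.0, 90.5]       # COG in degrees (synthetic)

# The hybrid model concatenates both views; the result feeds the LightGBM stage.
fused = deep_embedding + kinematic_features(speeds, courses)
print(len(fused))
```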

Article
Computer Science and Mathematics
Computer Science

Yanyan Jia, Siyi Wang

Abstract: Traffic sign detection in autonomous driving faces challenges including multi-scale objects, complex backgrounds, and limited edge-computing power. To address insufficient multi-scale feature representation and high false negatives for small traffic signs in YOLOv8n, this study proposes an improved algorithm integrating the VoVGSCSP module with a Multi-scale Contextual Attention (MCA) mechanism. The original C2f module is replaced with VoVGSCSP, enhancing feature representation through parallel residual branches and cross-stage connections. A lightweight neck, SlimNeck, is designed and combined with MCA, employing multi-branch pooling and dynamic weight fusion to capture geometric features and color semantics. The PAN-FPN path is optimized with cross-level connections and learnable weights for adaptive multi-scale fusion. Experiments on the GTSRB dataset show that the improved model reduces parameters to 2.66 M (an 11.6% decrease) and computational complexity to 7.49 GFLOPs, while mAP@0.5 increases from 94.7% to 96.3% and FPS improves from 82.3 to 90.6. The proposed algorithm achieves comprehensive gains in lightweighting, accuracy, and speed, demonstrating its effectiveness and practical applicability.
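The "dynamic weight fusion" of multi-scale features with learnable weights is in the same family as fast normalized fusion. A minimal sketch, assuming ReLU-ed learnable weights normalized by their sum (the weight and feature values here are invented):

```python
def fused_sum(features, weights, eps=1e-4):
    """Adaptive multi-scale fusion: clamp learnable weights to be non-negative,
    then normalize so the fused map is a convex-like combination of inputs."""
    w = [max(0.0, wi) for wi in weights]
    total = sum(w) + eps
    return [sum(wi * f[i] for wi, f in zip(w, features)) / total
            for i in range(len(features[0]))]

# Three feature maps (flattened toy vectors) from different pyramid levels.
p3, p4, p5 = [1.0, 2.0], [3.0, 1.0], [0.5, 0.5]
# A negative learned weight is clamped to zero, silencing that branch.
out = fused_sum([p3, p4, p5], weights=[0.7, 0.2, -0.1])
print(out)
```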

Article
Computer Science and Mathematics
Computer Science

Chang Chia Wei

Abstract: In the current internet era, the number of information security vulnerabilities has increased dramatically. Image and text encryption have become critical preprocessing steps in secure information transmission. Sensitive information can be transmitted through encrypted images, facilitating the implementation of various secure communication systems. This paper proposes an image encryption scheme that employs Sudoku as a cryptographic key matrix, combined with a strong diffusion mechanism to enhance pixel confusion and diffusion effects. The proposed method achieves high-level pixel scrambling through multiple rounds of iterative threshold encryption, pixel padding with random shuffling, and Sudoku-based permutation. Additionally, rotation operations are applied to further increase the difficulty of reversing the encrypted image without the key. The core keys include the iterative threshold sequence, row-column diffusion keys, and random permutation parameters, ensuring that the encryption is fully reproducible. Experimental results demonstrate that, while preserving reversibility, the proposed method achieves significant confusion and diffusion performance. For the Lena image, the method attains NPCR ≈ 99.22% and UACI ≈ 33.30%, indicating its effectiveness as a robust image encryption approach.
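Sudoku-based permutation works because every row of a valid grid is a permutation of 1-9, so it can drive a reversible pixel shuffle. A toy sketch of one such row permutation and its exact inverse (the grid construction and 9×9 block size are illustrative, not the paper's full scheme):

```python
# A valid Sudoku grid from the classic shift formula: every row (and column)
# contains each of 1..9 exactly once, so each row defines a permutation.
grid = [[(r * 3 + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]

def permute(block):
    # Output column c takes the pixel from input column (grid value - 1).
    return [[block[r][grid[r][c] - 1] for c in range(9)] for r in range(9)]

def unpermute(block):
    # Invert by scattering each pixel back to its original column.
    out = [[0] * 9 for _ in range(9)]
    for r in range(9):
        for c in range(9):
            out[r][grid[r][c] - 1] = block[r][c]
    return out

image = [[r * 9 + c for c in range(9)] for r in range(9)]   # toy 9x9 "image"
restored = unpermute(permute(image))
print(restored == image)
```

The round trip is exact, which is why the full scheme can scramble aggressively while preserving reversibility.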

Article
Computer Science and Mathematics
Computer Science

Ganglong Duan, Haonan Sun, Sijia Zhong, Hongquan Xue

Abstract: In precision mold manufacturing, the machining of HRC52 hardened steel causes severe tool wear and high noise in multi-source sensor signals, making accurate remaining useful life (RUL) prediction challenging. To address this, we propose a hybrid model that integrates a one-dimensional deep convolutional network (DCNN), low-resolution self-attention (LRSA) with 1D-2D spatiotemporal reconstruction, and a multi-channel bidirectional long short-term memory network (McBiLSTM). A Gaussian smoothing filter is first applied to denoise the 50 kHz signals, followed by physical-period sliding windows for feature extraction. A multi-strategy fusion pooling layer (mean, max, and last-quarter features) further improves prediction accuracy. Using the PHM 2010 milling cutter dataset under leave-one-out cross-validation, the proposed model achieves a mean absolute percentage error (MAPE) of 1.45% and a root mean square error (RMSE) of 2.76 mm, reducing prediction error by up to 75.6% compared to Transformer, LSTM, and GRU baselines. These results demonstrate that the model effectively extracts degradation features even during the accelerated wear stage, offering a reliable solution for tool health monitoring and predictive maintenance under complex cutting conditions.
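The preprocessing pipeline (Gaussian smoothing followed by sliding windows) can be sketched in a few lines; the kernel width, window size, and synthetic signal below are illustrative assumptions, not the paper's settings:

```python
import math

def gaussian_smooth(signal, sigma=2.0):
    """Denoise with a truncated Gaussian kernel (radius 3*sigma), edges clamped."""
    radius = int(3 * sigma)
    kernel = [math.exp(-0.5 * (k / sigma) ** 2) for k in range(-radius, radius + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(signal)):
        acc = sum(kernel[k + radius] * signal[min(max(i + k, 0), len(signal) - 1)]
                  for k in range(-radius, radius + 1))
        out.append(acc / norm)
    return out

def sliding_windows(signal, width, step):
    """Fixed-period windows, one feature vector extracted per window downstream."""
    return [signal[i:i + width] for i in range(0, len(signal) - width + 1, step)]

# Synthetic "force" signal: slow trend plus alternating high-frequency noise.
noisy = [math.sin(0.1 * t) + 0.3 * ((-1) ** t) for t in range(200)]
smooth = gaussian_smooth(noisy)
windows = sliding_windows(smooth, width=50, step=25)
print(len(windows))
```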

Article
Computer Science and Mathematics
Computer Science

R. Senthilkumar

Abstract: Soft robotic grippers excel in unstructured manipulation but suffer catastrophic failure rates (72%) when grasping deformable organics, fabrics, and mixed debris due to hyperchaotic pneumatic dynamics. This paper introduces the first Lyapunov stability controller for soft robotics, deploying real-time maximal Lyapunov exponent estimation (λ_MLE) from fibre-optic strain sensor arrays running at 100 Hz on Intel Loihi 2 neuromorphic chips. The system reconstructs 12D phase space embeddings via Takens' theorem, detecting chaos onset 187 ms early during dual-material transitions (tomato → bolt), enabling pre-emptive damping that transforms strange attractors into stable limit cycles. Experimental validation across USDA organic datasets (tomatoes, grapes, leafy greens) and MRF waste streams demonstrates 94.2% grasp success (a 3.7× improvement over PID baselines) with 2.3× faster cycles (2.1 grips/second) and 67% energy savings. Neuromorphic acceleration achieves 187 μs latency for 12D divergence computation, 28× faster than GPU methods. Field deployments confirm robustness: agricultural harvesting sustains 3 clusters/minute, waste sorting handles mixed-material chaos, and medical tissue manipulation achieves sub-micron precision under arterial pulsatility. Theoretical contributions include event-triggered Lyapunov redesign guaranteeing exponential stability (λ_1 < -0.1) despite 24 dB vibration and 47% moisture variance. Phase space visualization reveals the Kaplan-Yorke dimension collapsing from 8.2D hyperchaos to 2.1D stable manifolds, providing online stability margins. This work establishes chaos quantification as a foundational primitive for next-generation soft robotics, transforming nonlinearity from failure mode to control parameter across agriculture, recycling, and minimally invasive surgery.
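Maximal Lyapunov exponent estimation is the core primitive here. As a self-contained illustration (using the one-dimensional logistic map rather than the paper's 12D phase-space embedding), the exponent is the long-run average of log |f'(x)| along a trajectory; a positive value signals chaos, which is exactly the condition a controller like this reacts to:

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, burn=100, n=20000):
    """Largest Lyapunov exponent of x -> r*x*(1-x), estimated as the
    time-average of log |f'(x)| = log |r*(1 - 2x)| along the orbit."""
    x = x0
    for _ in range(burn):               # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n

lam = lyapunov_logistic()
print(lam)   # analytic value for r=4 is ln 2 ~ 0.693; positive => chaotic
```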

Article
Computer Science and Mathematics
Computer Science

Taehyun Yang, Eunhye Kim, Zhongzheng Xu, Fumeng Yang

Abstract: Generative AI tools have lowered barriers to producing branded social media images and captions, yet small-business owners (SBOs) still struggle to create on-brand posts without access to professional designers or marketing consultants. Although these tools enable fast image generation from text prompts, aligning outputs with a brand’s intended look and feel remains a demanding, iterative creative task. In this position paper, we explore how SBOs navigate iterative content creation and how AI-assisted systems can support SBOs’ content creation workflow. We conducted a preliminary study with 12 SBOs who independently manage their businesses and social media presence, using a questionnaire to collect their branding practices, content workflows, and use of generative AI alongside conventional design tools. We identified three recurring challenges: (1) translating brand “feel” into effective prompts, (2) difficulty revisiting and comparing prior image generations, and (3) difficulty making sense of changes between iterations to steer refinement. Based on these findings, we present a prototype that scaffolds brand articulation, supports feedback-informed exploration, and maintains a traceboard of branching image iterations. Our work illustrates how traces of the iterative process can serve as workflow support that helps SBOs keep track of explorations, make sense of changes, and refine content. CCS Concepts: Human-centered computing → Human-computer interaction (HCI).

Article
Computer Science and Mathematics
Computer Science

V. Thamilarasi

Abstract: The convergence of Neuro-Symbolic AI, Edge Computing, and Reinforcement Learning heralds a transformative era in autonomous engineering design, addressing longstanding challenges in optimization efficiency, real-time responsiveness, and interpretability. Traditional design workflows suffer from siloed neural pattern recognition lacking logical rigor, centralized cloud dependencies creating latency bottlenecks, and heuristic optimization struggling with multi-objective trade-offs in vast design spaces. This paper introduces an integrated framework that synergistically combines these paradigms to create self-sustaining, end-to-end autonomous pipelines for complex engineering applications from aerospace structures to precision manufacturing. Neuro-Symbolic AI fuses deep neural networks for perceptual feature extraction with symbolic reasoning engines enforcing hard constraints and generating auditable proofs, enabling systems that both discover novel configurations and validate them against domain physics. Edge Computing decentralizes inference across device-fog-cloud hierarchies, achieving sub-10ms decision cycles critical for real-time applications like robotic assembly or smart grid stability. Reinforcement Learning optimization engines navigate continuous state-action spaces representing design variables, iteratively refining solutions through shaped rewards aligned with Pareto-optimal engineering objectives such as minimizing mass while maximizing strength-to-weight ratios. The proposed architecture orchestrates these components via directed acyclic graphs of containerized microservices, with federated synchronization ensuring data consistency across distributed nodes and human-in-the-loop interfaces providing strategic oversight for safety-critical decisions.
Mathematical formulations ground the system: hybrid loss functions balance learning objectives, edge partitioning is optimized, and multi-agent RL decomposes collaborative design tasks. Deployed on resource-constrained edge platforms, this framework demonstrates 8-12× acceleration in design cycle times, 25-35% improvements in structural efficiency, and full traceability satisfying aerospace certification standards (DO-178C). By eliminating manual iteration bottlenecks while preserving human insight where needed, the system redefines engineering practice, enabling rapid innovation across domains requiring concurrent optimization of performance, manufacturability, sustainability, and cost.

Article
Computer Science and Mathematics
Computer Science

P. Selvaprasanth

Abstract: Distributed modern software platforms spanning microservices, serverless functions, and edge computing face unprecedented security threats from stealthy adversaries exploiting encrypted data flows and behavioural camouflage. Conventional defences require decryption for analysis, exposing sensitive information in untrusted cloud environments. This paper proposes an innovative framework integrating homomorphic encryption (HE) with automated threat hunting to enable privacy-preserving threat detection at scale. Using levelled BFV schemes from OpenFHE, we perform computations directly on ciphertexts for anomaly scoring and behavioural profiling, while our hunting engine employs graph neural networks and isolation forests to hypothesize and pursue attacker patterns across distributed logs without plaintext exposure. The architecture deploys as Kubernetes-native operators, processing 10,000 encrypted events per second with 92% detection accuracy on MITRE-emulated scenarios, outperforming traditional UEBA by 35% in F1 score and reducing analysis latency from hours to seconds. Evaluations on AWS EKS clusters demonstrate sub-200ms query times for homomorphic aggregations, with noise management via bootstrapping optimizations. Case studies in fintech pipelines reveal thwarted supply-chain compromises and insider data exfiltrations. By revolutionizing secure computation in dynamic ecosystems, our solution bridges cryptography and AI-driven hunting, offering deployable resilience against evolving threats while complying with GDPR and zero-trust mandates. Future work extends to fully homomorphic deep learning for adaptive adversary modelling.
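To see why homomorphic aggregation lets an untrusted node combine scores it cannot read, here is a deliberately insecure additive toy (a masked one-time-pad stand-in, not BFV or OpenFHE): ciphertexts add, and only the holder of the combined key can open the aggregate.

```python
import random

random.seed(42)
N = 2 ** 61 - 1      # working modulus for the toy scheme (not a real HE parameter)

def encrypt(m, key):
    return (m + key) % N            # additively masked "ciphertext"

def decrypt(c, key):
    return (c - key) % N

# Per-event anomaly scores from three collectors, each masked with its own key.
scores = [17, 3, 42]
keys = [random.randrange(N) for _ in scores]
cts = [encrypt(m, k) for m, k in zip(scores, keys)]

# The untrusted aggregator sums ciphertexts without seeing any plaintext score.
ct_sum = sum(cts) % N
# Only the analyst holding the combined key can open the aggregate.
total = decrypt(ct_sum, sum(keys) % N)
print(total)   # 62
```

Real BFV supports both addition and multiplication on ciphertexts, with noise growth managed by bootstrapping as the abstract notes.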

Review
Computer Science and Mathematics
Computer Science

Divyasree Bellary

Abstract: Decentralized applications (DApps) represent a paradigm shift in software architecture, leveraging blockchain technology and distributed consensus mechanisms to eliminate single points of failure and centralized control. As the adoption of DApps accelerates across sectors such as finance, supply chain, healthcare, and governance, ensuring their functional correctness and behavioral reliability has become a critical engineering challenge. Unlike traditional software, DApps operate in adversarial, permissionless environments where smart contracts execute autonomously and immutably on distributed nodes, making post-deployment correction extremely costly or impossible. This review systematically examines the landscape of functional testing methodologies tailored for decentralized applications, analyzing their suitability, limitations, and practical applicability in modern DApp development workflows. We survey research spanning smart contract verification, consensus protocol testing, oracle interaction validation, cross-chain interoperability testing, and user-layer functional testing of Web3 interfaces. The review identifies four dominant testing paradigms: (1) unit testing of smart contract functions, (2) integration testing of DApp components, (3) property-based testing using formal specifications, and (4) end-to-end simulation on testnets. Through comparative analysis across 13 seminal studies, we evaluate each approach along dimensions of automation feasibility, coverage depth, gas efficiency awareness, and scalability to complex DApp ecosystems. Our findings indicate that while static analysis and symbolic execution tools such as Mythril, Slither, and Manticore offer strong vulnerability detection, they address security properties more than functional correctness. 
Conversely, framework-based testing tools like Hardhat, Truffle, and Foundry provide adequate unit-level coverage but struggle with cross-contract orchestration and event-driven logic verification. A critical gap exists in testing oracle-dependent and DAO governance workflows. This review concludes with a synthesis of best practices, open research challenges, and a directional roadmap for developing holistic functional testing frameworks suited to the evolving complexity of decentralized systems.

Article
Computer Science and Mathematics
Computer Science

D. Sneha

Abstract: Blockchain networks now underpin mission-critical services in finance, healthcare, supply-chain logistics, and digital governance, yet production deployments continue to suffer severe resilience failures ranging from Byzantine consensus violations to cross-chain bridge exploits that have collectively caused losses exceeding $2 billion. The root cause is a critical tooling gap: existing frameworks such as BlockBench and Hyperledger Caliper evaluate only crash-fault performance and provide neither adversarial fault modelling nor automated remediation guidance, leaving operators without a rigorous means of holistic resilience assessment prior to deployment. This paper presents the Blockchain Resilience Analysis System (BRAS), a five-layer, platform-agnostic framework that unifies real-time network topology monitoring, multi-class adversarial fault injection, composite resilience scoring, closed-loop adaptive consensus reconfiguration, and structured reporting within a single repeatable pipeline. BRAS introduces the Resilience Index (RI), a mathematically grounded composite metric that aggregates four sub-dimensions—network connectivity, throughput stability, mean-time-to-recovery (MTTR), and Byzantine fault tolerance ratio—into a single interpretable score calibrated to operator-defined service-level objectives. An Adaptive Reconfiguration Module (ARM) monitors the RI stream and autonomously adjusts consensus timeout parameters and peer-connection policies when the RI drops below a configurable threshold, closing the feedback loop between fault detection and remediation without manual intervention. Experimental evaluation on a 20-node Hyperledger Fabric testnet and a 15-node Ethereum Proof-of-Authority network demonstrates that BRAS achieves a 34% reduction in MTTR under simulated eclipse attacks and reduces false-positive fault detections by 28% relative to threshold-only monitoring baselines.
The RI metric exhibits strong correlation (r = 0.91, p < 0.001) with independently measured system availability across 50 fault campaigns, validating its predictive utility. BRAS is the first framework to simultaneously address network-layer, consensus-layer, and application-layer resilience threats under a unified, vendor-agnostic architecture, offering both a rigorous theoretical foundation and a deployable implementation blueprint for blockchain resilience engineering.
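A composite score like the RI can be sketched as a weighted sum of normalized sub-metrics, with the ARM as a simple threshold rule. The weights, SLO value, and 1.5× timeout back-off below are invented for illustration; the paper's actual calibration is not specified here.

```python
def resilience_index(connectivity, throughput_stability, mttr_s, bft_ratio,
                     weights=(0.25, 0.25, 0.25, 0.25), mttr_slo_s=120.0):
    """Composite RI in [0, 1]: every sub-metric normalized so higher is better.
    MTTR is inverted against an operator-defined SLO (score 0 once exceeded)."""
    mttr_score = max(0.0, 1.0 - mttr_s / mttr_slo_s)
    subs = (connectivity, throughput_stability, mttr_score, bft_ratio)
    return sum(w * s for w, s in zip(weights, subs))

def arm_step(ri, timeout_s, threshold=0.6):
    """Toy Adaptive Reconfiguration Module: widen the consensus timeout when
    the RI stream degrades below the configured threshold."""
    return timeout_s * 1.5 if ri < threshold else timeout_s

ri = resilience_index(connectivity=0.9, throughput_stability=0.8,
                      mttr_s=30.0, bft_ratio=0.33)
print(ri, arm_step(ri, timeout_s=2.0))
```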

Article
Computer Science and Mathematics
Computer Science

Sindhuja A

Abstract: Wildfire spread prediction demands hyper-local accuracy at scales unattainable by traditional physics-based models or coarse satellite observations. This paper introduces a novel Digital Twin AI framework leveraging 5G IoT mesh networks to deliver real-time, 10m×10m resolution fire propagation forecasts with 5-60-minute lead times. Deployed across 1,200 self-healing sensor nodes, the system fuses multi-modal environmental data (thermal anomalies, 3D wind profiles, dynamic fuel moisture) at 100 Hz through graph attention networks, feeding physics-informed neural twins synchronized via unscented Kalman filtering. The edge-optimized prediction engine combines convolutional cellular automata with graph neural networks, achieving 42% IoU improvement over FARSITE baselines while executing 8.2ms inference cycles on Jetson Orin NPUs. Federated learning across mesh nodes enables continuous adaptation without compromising operational privacy, while INT4 quantization and RTOS scheduling guarantee sub-10ms end-to-end latency critical for first responder activation. The framework scales linearly to 10K nodes, reduces false alerts by 73%, and maintains 99.999% uptime through dynamic routing around fire-damaged sensors. This work establishes a new paradigm for autonomous wildfire intelligence, transforming reactive response into proactive hyper-local containment.
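The cellular-automaton core of such a predictor can be illustrated on a tiny grid. This sketch deliberately omits wind weighting, fuel moisture, and all learned components; the three states and 4-neighbourhood rule are assumptions for demonstration.

```python
def step(grid):
    """One CA update: an unburnt fuel cell ignites when any 4-neighbour burns.
    States: 0 = no fuel, 1 = fuel, 2 = burning. (A real model would weight
    neighbours by wind direction and fuel moisture instead of this 0/1 rule.)"""
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 1:
                continue
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == 2:
                    nxt[r][c] = 2
                    break
    return nxt

grid = [[1] * 5 for _ in range(5)]   # 5x5 patch of cells (think 10 m x 10 m each)
grid[2][0] = 2                       # ignition at the west edge
for _ in range(3):
    grid = step(grid)
burning = sum(row.count(2) for row in grid)
print(burning)   # all cells within Manhattan distance 3 of the ignition: 14
```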

Article
Computer Science and Mathematics
Computer Science

Vidhata Phani Datta Seethepalli

Abstract: Community reintegration of formerly incarcerated individuals is one of the most pressing challenges confronting criminal justice systems worldwide. High recidivism rates, fragmented service delivery, stigma, and inadequate coordination among correctional agencies, social service providers, and communities collectively undermine successful reintegration outcomes. Artificial intelligence (AI) offers transformative potential to address these systemic deficiencies through data-driven risk assessment, personalised service matching, and continuous behavioural monitoring. However, no comprehensive, ethically grounded architectural framework currently exists that integrates these capabilities into a unified community reintegration platform. This paper proposes the AI-based Community Reintegration Integration Platform (AI-CRIP), a five-layer architectural framework designed to support the full reintegration lifecycle—from prerelease assessment through post-release community stabilisation. The proposed framework integrates machine learning-based risk classification, natural language processing (NLP) for needs extraction, K-nearest neighbour (KNN) service matching, predictive recidivism analytics, blockchain-based audit trails, and a human-in-the-loop caseworker review mechanism. A formal pseudo-algorithm details the core plan-generation pipeline, demonstrating how structured offender profiles are transformed into personalised, milestone-driven reintegration plans. The framework is evaluated against fifteen representative studies from the existing literature spanning risk assessment models, digital reintegration tools, fairness in algorithmic decision-making, and technology-assisted supervision. The proposed architecture advances the state of the art by synthesising these disparate research threads into a coherent, deployable platform that prioritises fairness, transparency, and individual dignity. 
Critically, while AI-based tools such as emotive robots, digital avatars, and immersive virtual reality environments have emerged as low-stakes social surrogates for individuals experiencing isolation and withdrawal, they remain limited in their capacity to cultivate genuine human intimacy. Lasting reintegration therefore demands that technological aids be balanced by structural reforms addressing work-life balance, social inclusion, and community belonging, recognising that even highly personalised AI cannot substitute for the human connection that effective rehabilitation ultimately requires. Key technical, ethical, and policy challenges—including algorithmic bias, data privacy, digital inclusion, and stakeholder trust—are also discussed, with directions for future empirical validation. This work contributes a blueprint for practitioners, policymakers, and technology developers seeking to harness AI responsibly in post-carceral rehabilitation.
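The KNN service-matching step in the pipeline can be sketched as nearest-neighbour ranking over need profiles. The services, the four need dimensions, and all scores below are invented for illustration and carry no real-world weighting:

```python
import math

# Hypothetical service profiles: (housing, employment, substance-use,
# mental-health) relevance on a 0-1 scale.
services = {
    "transitional_housing": (1.0, 0.2, 0.1, 0.1),
    "job_training":         (0.1, 1.0, 0.0, 0.1),
    "recovery_program":     (0.2, 0.1, 1.0, 0.4),
}

def match(profile, k=2):
    """Rank services by Euclidean distance to the individual's needs vector."""
    ranked = sorted(services, key=lambda name: math.dist(profile, services[name]))
    return ranked[:k]

needs = (0.9, 0.6, 0.1, 0.2)   # strong housing need, moderate employment need
print(match(needs))
```

In the proposed platform this ranking would feed the caseworker review step rather than being applied automatically.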

Article
Computer Science and Mathematics
Computer Science

Sibananda Behera, Namita Panda, Sudhansu Shekhar Patra

Abstract: Software-Defined Networking (SDN) enables flexible traffic engineering but consumes substantial energy: constant flow-rule updates impose control-plane overhead, and the forwarding hardware itself draws significant power. Current energy-aware SDN methods mostly rely on static or greedy optimisation, which can cause excessive Ternary Content-Addressable Memory (TCAM) updates and unstable rule churn as traffic changes over time. This article introduces a Dynamic Flow Rule Placement (DFRP) framework for real-time energy optimisation in SDN that jointly reduces network energy usage, TCAM update costs, and rule churn. The framework uses a convex relaxation method to make binary decisions on switches, links, and rule placement, together with a minimum-edit rounding scheme that allows only small rule changes between time slots. To further reduce control-plane instability, batch scheduling and receding horizon optimisation (RHO) are used in combination: the system plans over predicted traffic for future time slots but executes only the actions for the current slot. Experiments are carried out on two real-world dynamic SNDlib topologies, Germany50 and Nobel-Germany, using 288 five-minute traffic matrices over a one-day period. Comparative results against static and greedy baselines show that DFRP saves approximately 30% energy while cutting TCAM update overhead and rule churn by approximately 20%, consistently across both networks. DFRP is therefore well suited to large-scale networks with dynamic traffic, enabling stable and energy-efficient SDN operation.
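The minimum-edit rounding idea (keep the previous on/off decision unless the relaxed solution clearly crosses a band) can be sketched as hysteresis rounding; the 0.7/0.3 thresholds below are illustrative assumptions, not the paper's values:

```python
def minimum_edit_round(fractional, previous, on=0.7, off=0.3):
    """Hysteresis rounding of relaxed 0-1 decisions: a switch changes state
    only when its fractional value clears a band, which bounds the number of
    TCAM edits (rule churn) between consecutive time slots."""
    decisions = []
    for x, prev in zip(fractional, previous):
        if prev == 0 and x >= on:
            decisions.append(1)          # confidently power the switch on
        elif prev == 1 and x <= off:
            decisions.append(0)          # confidently sleep the switch
        else:
            decisions.append(prev)       # ambiguous: keep last decision, zero edits
    return decisions

prev = [1, 1, 0, 0]
frac = [0.55, 0.25, 0.72, 0.45]          # relaxed LP solution for the next slot
now = minimum_edit_round(frac, prev)
churn = sum(a != b for a, b in zip(prev, now))
print(now, churn)
```

A plain 0.5-threshold rounding of the same fractions would flip three switches here; the hysteresis band flips only two.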

Article
Computer Science and Mathematics
Computer Science

Shulin Yuan, Bowen He

Abstract: Edge computing requires safe, efficient, and generalizable online decision-making, yet existing methods suffer from reactive constraint handling, fragmented scheduling frameworks, and poor generalization. We propose PACE, a unified framework shifting from reactive remediation to proactive anticipation. PACE integrates a Proactive Constrained Policy Optimizer with preemptive penalty and constraint-aware intrinsic rewards, a Nested Index Scheduler with closed-form policies for preemptive and non-preemptive AoI minimization, and a Generalizable Multi-Objective Offloading Network with histogram encoding and masking for single-policy generalization. Experiments on safe locomotion, MEC scheduling, and multi-objective offloading show PACE achieves highest returns with strict constraint satisfaction, reduces AoI by up to 61.84%, and attains near-optimal Pareto performance within 0.3% of the upper bound using a single policy.

Article
Computer Science and Mathematics
Computer Science

Thangamari D

Abstract: Insider threats pose a persistent and evolving challenge to contemporary software ecosystems, where privileged users can exploit access for malicious purposes, often evading traditional perimeter-based defences. This paper introduces a novel hybrid framework that synergistically integrates zero-knowledge proofs (ZKPs) and behavioural analytics to detect and mitigate such threats with enhanced privacy and precision. ZKPs enable secure authentication and data verification without revealing sensitive information, ensuring compliance with privacy regulations like GDPR while thwarting unauthorized access. Complementarily, our behavioural analytics engine employs advanced machine learning models, including graph neural networks and unsupervised anomaly detection (e.g., isolation forests), to profile user behaviours across software pipelines, identifying deviations indicative of insider malice. The proposed architecture is deployed in a microservices-based ecosystem, demonstrating scalability via containerized components on Kubernetes. Extensive evaluations on benchmark datasets (e.g., CERT Insider Threat) and simulated enterprise environments yield a 95% detection accuracy, with 40% fewer false positives than state-of-the-art methods like UEBA systems. Latency remains under 50ms for real-time operations, preserving performance in high-throughput scenarios. Our framework outperforms baselines by 25% in F1-score, validated through rigorous ablation studies. By bridging cryptographic privacy with AI-driven intelligence, this work advances proactive security for modern software, offering deployable solutions against sophisticated insiders. Future extensions explore quantum-resistant ZKPs for post-quantum resilience.
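ZKP-based authentication can be illustrated with a classic Schnorr proof of knowledge made non-interactive via Fiat-Shamir: the prover demonstrates possession of a secret exponent without revealing it. The group size and parameters below are toys for demonstration, not the paper's construction (which targets quantum-resistant ZKPs).

```python
import hashlib
import random

random.seed(5)
p = 2 ** 127 - 1                     # Mersenne prime; demo group Z_p^* (toy size)
g = 3

x = random.randrange(2, p - 1)       # prover's secret (e.g., a credential key)
y = pow(g, x, p)                     # public value registered with the verifier

# --- Schnorr proof of knowledge of x, non-interactive via Fiat-Shamir ---
r = random.randrange(2, p - 1)
t = pow(g, r, p)                                      # prover's commitment
c = int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % p
s = (r + c * x) % (p - 1)                             # response; exponents mod p-1

# Verifier checks g^s == t * y^c (mod p) and learns nothing about x.
ok = pow(g, s, p) == (t * pow(y, c, p)) % p
print(ok)
```

The check passes because g^s = g^r · g^(c·x) = t · y^c (mod p), yet the transcript (t, c, s) reveals no information about x on its own.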

Article
Computer Science and Mathematics
Computer Science

Amit Patole

Abstract: The rapid proliferation of large language model (LLM) powered multi-agent systems creates a non-trivial combinatorial optimization problem: routing heterogeneous tasks to the most cost-effective model tier while maintaining quality guarantees. Current production systems rely on static lookup tables, which over-provision expensive models and waste computational budget. We formalize the LLM Cascade Routing Problem (LCRP) as a Quadratic Unconstrained Binary Optimization (QUBO) problem and solve it using the Quantum Approximate Optimization Algorithm (QAOA). We benchmark QAOA against greedy heuristics and simulated annealing using both Google Cirq simulation and real IBM Quantum hardware (156-qubit Heron processors). Experiments across three IBM backends (ibm_fez, ibm_kingston, ibm_marrakesh) on problem instances from 6 to 18 qubits reveal three key findings: (i) shallow QAOA circuits (p=1, depth 52) achieve 15.4% valid assignment rate on real hardware versus 0.8% for deeper circuits (p=2, depth 101), demonstrating that NISQ noise favors shallow ansatze; (ii) hardware constraint satisfaction degrades steeply with problem size, dropping from 37-43% at 6 qubits to 0.2-0.3% at 18 qubits; and (iii) results are reproducible across all three backends with consistent valid rates within plus or minus 1.5%. To our knowledge, this is the first quantum computing formulation of the LLM model routing problem. We provide an open-source implementation and discuss the projected quantum advantage horizon.
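A tiny LCRP-style instance shows the QUBO shape described above: linear routing costs plus penalty terms enforcing one-hot assignment and a quality floor, brute-forced here in place of QAOA sampling. The costs, quality values, and penalty weight are invented for illustration.

```python
from itertools import product

# Two tasks, two model tiers. x[t][m] = 1 assigns task t to tier m (one-hot rows).
cost = [[1.0, 5.0],      # task 0: cheap tier vs premium tier price
        [2.0, 6.0]]      # task 1
quality = [[0.6, 0.9],   # expected quality of tier m on task t
           [0.8, 0.95]]
q_min, penalty = 0.75, 10.0

def qubo_energy(bits):
    x = [bits[0:2], bits[2:4]]
    e = 0.0
    for t in range(2):
        e += sum(cost[t][m] * x[t][m] for m in range(2))        # routing cost
        e += penalty * (sum(x[t]) - 1) ** 2                     # one-hot constraint
        e += penalty * sum(x[t][m] * max(0.0, q_min - quality[t][m])
                           for m in range(2))                   # quality floor
    return e

# Exhaustive search stands in for QAOA sampling at this size (4 binary variables).
best = min(product((0, 1), repeat=4), key=qubo_energy)
print(best)
```

Here the quality shortfall of task 0's cheap tier (0.6 < 0.75) costs less than upgrading to the premium tier, so the minimum-energy assignment keeps both tasks on the cheap tier; raising `penalty` flips that trade-off.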

Article
Computer Science and Mathematics
Computer Science

Aleksandra Ivanov, Lazar Stošić, Olja Krčadinac, Vladimir Đokić, Dragana Đokić

Abstract: This paper presents a mathematical formalization of human–computer interaction under a zero-distance constraint, introducing a degenerate formulation of Fitts’s Law. In classical models, movement time depends logarithmically on spatial distance and target size. By enforcing D→0, the Index of Difficulty converges to zero, and movement time reduces to a constant equal to the physiological intercept, yielding a constant-time interaction model. A rigorous ε–δ limit analysis proves convergence, while an optimization formulation shows that zero-distance interaction achieves the global minimum of latency. From a control-theoretic perspective, the model eliminates nonlinear dependencies and produces a time-invariant system. The framework is empirically validated on a teleoperated mobile robotic platform using a haptic Touch-Release protocol. Experimental results show a reduction in total response latency from approximately 1040 ms to 450 ms (≈56%). Cryptographically secured telemetry (AES-256) ensures data integrity and reproducibility. The proposed model establishes a new paradigm of constant-time human–computer interaction, with implications for optimization and control in cyber-physical systems and safety-critical applications.
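The degenerate limit is easy to verify numerically from the Shannon form of Fitts's Law. In this sketch the intercept a = 0.45 s echoes the reported 450 ms latency, while the slope b and target width are illustrative assumptions:

```python
import math

def movement_time(D, W, a=0.45, b=0.1):
    """Shannon form of Fitts's Law: MT = a + b * log2(D/W + 1), in seconds.
    The log term is the Index of Difficulty (ID)."""
    return a + b * math.log2(D / W + 1)

W = 0.02                             # 2 cm target (assumed)
for D in (0.5, 0.05, 0.005, 1e-9):   # drive the movement distance toward zero
    print(round(movement_time(D, W), 4))
# As D -> 0, ID = log2(D/W + 1) -> 0, so MT -> a: the physiological intercept,
# i.e. the constant-time regime the paper formalizes via its eps-delta analysis.
```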

Article
Computer Science and Mathematics
Computer Science

Krish Mithra Nagamothu

Abstract: As blockchain technology evolves from specialized financial tools to foundational infrastructure for Web3, the necessity for rigorous performance validation becomes paramount. Stress testing—defined as the evaluation of system stability under extreme workloads—is critical for identifying bottlenecks in consensus mechanisms and peer-to-peer communication. This survey provides an exhaustive analysis of web-based stress testing frameworks. Unlike traditional CLI-based tools, web-based frameworks provide real-time telemetry and distributed orchestration capabilities essential for modern decentralized applications. We categorize existing literature into three generations of benchmarking, evaluate ten prominent frameworks based on a multi-dimensional rubric, and identify significant research gaps including the lack of standardized cross-chain stress protocols and AI-integrated anomaly detection. This work aims to provide a roadmap for researchers and DevOps engineers to select and implement robust testing environments for enterprise-grade blockchain deployments.

Article
Computer Science and Mathematics
Computer Science

Gajji Shivateja

Abstract: Progressive Web Applications (PWAs) have emerged as a transformative paradigm in modern software engineering, combining the reach of the web with the capabilities of native applications. Simultaneously, decentralized systems—anchored by blockchain technology, distributed ledger frameworks, and peer-to-peer networking protocols—are reshaping trust architectures across industries ranging from finance and healthcare to supply chain and digital identity. Despite the clear synergies between these two technological pillars, the intersection of PWAs and decentralized systems remains relatively underexplored in the academic literature. This survey addresses that gap by systematically reviewing and analyzing the convergence of PWA design principles with decentralized infrastructure paradigms. We examine how service workers, Web App Manifests, push notifications, and IndexedDB offline storage can be effectively integrated with blockchain nodes, smart contracts, IPFS-based content storage, and decentralized identity (DID) frameworks to produce resilient, censorship-resistant, and user-centric applications. We survey thirteen seminal works spanning cross-platform application development, blockchain architecture, decentralized identity management, IoT integration, and distributed application (DApp) design. Our analysis reveals recurring challenges including transaction latency, key management complexity, offline consistency under Byzantine fault conditions, and the tension between decentralization purity and user experience expectations. We further synthesize findings through a structured comparative analysis across six dimensions: focus area, PWA feature utilization, blockchain integration depth, reported performance metrics, and identified limitations. Based on this synthesis, we identify open research directions and propose guidelines for practitioners seeking to build production-grade PWA-based DApp frontends.
This survey contributes a consolidated reference for researchers and engineers working at the intersection of web engineering and decentralized computing.



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

© 2026 MDPI (Basel, Switzerland) unless otherwise stated