Computer Science and Mathematics


Article
Computer Science and Mathematics
Computer Science

Marco D. Ferraro, Giulia R. Conti, Lorenzo M. Bianchi

Abstract: Machine learning-based phishing detectors are vulnerable to adversarially crafted URLs that preserve malicious intent while evading lexical classifiers. This work investigates adversarial robustness for phishing URL detection and introduces a defense strategy that combines character-level adversarial training with distributional regularization. We construct an evaluation benchmark of 280,000 benign and 120,000 phishing URLs, and generate over 1.5 million adversarial variants using obfuscation rules, homoglyph substitution, and gradient-based attacks. A character-level CNN–BiLSTM classifier is trained with adversarial examples and a Wasserstein distance-based regularizer to keep internal representations of benign and phishing distributions well separated. Under strong white-box attacks, our defended model maintains an AUC of 0.958 and accuracy of 91.2%, outperforming non-robust baselines by more than 12 percentage points. The results suggest that adversarially aware training is critical for deploying phishing detectors in adversarial settings where attackers actively optimize for evasion.
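The homoglyph-substitution attack surface described above can be illustrated with a toy URL perturbation generator; the confusables table and substitution rate below are illustrative assumptions, not the paper's actual obfuscation rules.

```python
# Sketch of homoglyph-substitution URL perturbation (illustrative; the
# paper's real attack generators are not reproduced here).
import random

# Common Latin -> visually confusable Cyrillic homoglyphs (small sample)
HOMOGLYPHS = {
    "a": "\u0430",  # Cyrillic a
    "e": "\u0435",  # Cyrillic e
    "o": "\u043e",  # Cyrillic o
    "i": "\u0456",  # Cyrillic i
    "c": "\u0441",  # Cyrillic s
}

def perturb_url(url: str, rate: float = 0.3, seed: int = 0) -> str:
    """Replace a fraction of substitutable characters with homoglyphs,
    preserving the URL's visual appearance (and malicious intent)."""
    rng = random.Random(seed)
    out = []
    for ch in url:
        if ch in HOMOGLYPHS and rng.random() < rate:
            out.append(HOMOGLYPHS[ch])
        else:
            out.append(ch)
    return "".join(out)

adv = perturb_url("http://secure-login.example.com")
print(adv)
```

A character-level classifier that never saw such variants during training treats the Cyrillic code points as unknown symbols, which is exactly the gap adversarial training closes.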

Article
Computer Science and Mathematics
Computer Science

Nithya Moorthy

Abstract: This research introduces an edge-optimized reinforcement learning (RL) ecosystem engineered for sustainable logistics in the blue economy, spanning maritime shipping, automated port operations, and offshore resource transportation. At its core, the system processes vast streams of real-time data from IoT sensors embedded in vessels, buoys, and drones directly at edge nodes, bypassing cloud latency to enable instantaneous decision-making in unpredictable marine conditions like storms or currents. Carbon capture analytics, derived from spectroscopic sensors quantifying direct air capture (DAC) efficiency and CO2 sequestration rates on ships, dynamically adjusts RL reward functions to favour fuel-efficient paths that maximize emissions offsets, aligning with International Maritime Organization (IMO) mandates for net-zero operations by 2050. The framework exploits 6G networks' terabit speeds, sub-millisecond latency, and non-terrestrial network integration via low-earth-orbit satellites for seamless swarm intelligence orchestration. Autonomous agents, including unmanned surface vessels (USVs), aerial drones, and autonomous underwater vehicles (AUVs), exhibit flocking behaviours inspired by particle swarm optimization, sharing pheromone-like digital signals over holographic beamforming channels to collaboratively resolve complex tasks like dynamic routing, collision avoidance, and load redistribution. Methodologically, proximal policy optimization (PPO) algorithms facilitate stable, lightweight training on resource-constrained edge hardware, augmented by federated learning to aggregate insights across privacy-sensitive multi-operator fleets without central data pooling. Rigorous evaluations in NS-3 for 6G emulation and Gazebo for maritime physics reveal transformative gains: 42% reductions in carbon footprints, 65% lower end-to-end latency versus 5G-cloud hybrids, and 30% improvements in throughput under adverse weather.
Scalability tests with 1000+ agents confirm robustness in GPS-denied zones, while ablation studies highlight the synergistic impact of carbon feedback and swarm coordination over siloed baselines like genetic algorithms or centralized RL. By embedding quantum-safe encryption for 6G links and digital twin interfaces for predictive maintenance, this ecosystem not only decarbonizes blue economy logistics but also sets a scalable blueprint for AI-driven sustainability in cyber-physical systems worldwide.
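The carbon-aware reward shaping described above can be sketched in a few lines; the weights and sensor fields (`fuel_kg`, `co2_captured_kg`, and so on) are hypothetical stand-ins for the paper's calibrated reward terms, not its actual formulation.

```python
# Illustrative carbon-aware RL reward shaping for route selection.
# All weights and fields are assumptions for the sketch.
from dataclasses import dataclass

@dataclass
class RouteStep:
    fuel_kg: float          # fuel burned on this leg
    delay_h: float          # schedule slip in hours
    co2_captured_kg: float  # DAC sequestration measured on board

def shaped_reward(step: RouteStep,
                  w_fuel: float = 1.0,
                  w_delay: float = 0.5,
                  w_capture: float = 0.8) -> float:
    """Penalize fuel burn and delay; reward measured CO2 capture."""
    return (-w_fuel * step.fuel_kg
            - w_delay * step.delay_h
            + w_capture * step.co2_captured_kg)

# an eco-route trades a small delay for lower burn and higher capture
eco = RouteStep(fuel_kg=40.0, delay_h=1.0, co2_captured_kg=25.0)
fast = RouteStep(fuel_kg=70.0, delay_h=0.0, co2_captured_kg=10.0)
print(shaped_reward(eco), shaped_reward(fast))
```

Feeding the live capture measurement into the reward is what lets the PPO policy prefer fuel-efficient paths without a hand-tuned routing heuristic.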

Article
Computer Science and Mathematics
Computer Science

Karthiga Devi R

Abstract: This paper presents a transformer-infused semantic sensing ecosystem that integrates post-quantum signatures with 6G-enabled digital twins to enable adaptive orchestration in next-generation smart systems. Conventional IoT architectures struggle with semantic understanding across heterogeneous sensor streams, vulnerability to quantum attacks, and synchronization delays between physical and digital representations. The proposed platform deploys transformer models optimized for multi-modal sensor fusion to extract contextually rich semantic features from raw measurements, feeding these insights into digital twins synchronized over 6G networks with microsecond precision. Post-quantum lattice-based signatures ensure data integrity and authentication across the high-velocity sensing-orchestration pipeline, resisting both classical and quantum adversaries. The adaptive orchestration engine leverages twin predictions and semantic context to generate control policies that optimize system performance under dynamic conditions. Evaluation across industrial, urban, and autonomous transport scenarios demonstrates 3.8× improvement in semantic inference accuracy, 92% reduction in twin synchronization error, and 28% latency reduction compared to baseline architectures, while maintaining quantum-resistant security guarantees. The framework establishes a blueprint for secure, semantically-aware smart ecosystems capable of real-time adaptive orchestration at 6G scale.

Article
Computer Science and Mathematics
Computer Science

Meenalochini Pandi

Abstract: Unmanned aerial vehicles (UAVs), commonly known as drones, face fundamental limitations in flight duration due to battery constraints, typically restricting operations to 20-60 minutes. This paper proposes an energy-efficient drone design leveraging a solar-powered hybrid propulsion system to achieve extended endurance, targeting 24 hours of continuous flight and beyond under realistic conditions. Drawing inspiration from adaptive edge intelligence frameworks, the design integrates high-efficiency photovoltaic (PV) cells, lithium-based batteries, reinforcement learning (RL) for power management, and swarm coordination for fleet optimization. Key innovations include wing-integrated solar harvesting, RL-driven mode switching between solar-direct cruise and battery-boost climb, and bio-inspired energy sharing among drone swarms. Simulations and extrapolated prototypes demonstrate 4-5x endurance gains, with applications in persistent surveillance, environmental monitoring, and disaster response. Evaluations confirm 30-40% energy savings versus conventional designs, while maintaining payload capacity >2 kg.
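A back-of-envelope energy budget shows how wing-integrated PV stretches endurance; every figure below (battery capacity, cruise power, PV area, irradiance, efficiency) is an illustrative assumption, not a measured value from the paper's prototypes.

```python
# Toy endurance estimate for a solar-assisted drone under assumed numbers.

def endurance_hours(batt_wh: float,
                    cruise_w: float,
                    pv_area_m2: float,
                    irradiance_w_m2: float,
                    pv_eff: float) -> float:
    """Hours of cruise until the battery is empty, given PV assistance."""
    pv_w = pv_area_m2 * irradiance_w_m2 * pv_eff
    net_draw = cruise_w - pv_w
    if net_draw <= 0:
        # solar covers cruise power: endurance limited by something else
        return float("inf")
    return batt_wh / net_draw

# battery-only baseline vs wing-integrated PV under moderate sun
baseline = endurance_hours(200, 180, 0.0, 800, 0.22)
solar    = endurance_hours(200, 180, 0.9, 800, 0.22)
print(round(baseline, 2), round(solar, 2))
```

Even under these modest assumptions, offsetting most of the cruise draw multiplies endurance several-fold, which is the arithmetic behind the 4-5x gains claimed above.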

Article
Computer Science and Mathematics
Computer Science

Rosen Ivanov

Abstract: This article presents a scalable IoT-based architecture for continuous and passive monitoring of patient behavior in home environments, designed as a technical foundation for future dementia risk assessment systems. The architecture integrates: (1) wearable BLE sensors with infrared room-level localization capability; (2) edge-computing gateways with local preprocessing and machine learning capability; (3) a three-channel data stream optimizing latency, bandwidth, and information completeness; and (4) a federated learning framework enabling model development without data sharing between multiple institutions. Technical validation in two apartments (3 participants, 7 days) demonstrated: 97.6% room-level localization accuracy using infrared beacons; 41.2% network bandwidth reduction through intelligent compression; less than 7 seconds end-to-end latency for 99.5% of critical events; and 98.5% deduplication accuracy in multi-gateway configurations. A proof-of-concept federated learning simulation confirms architectural feasibility of collaborative model training while preserving privacy, achieving convergence in five rounds with 1.4 MB communication per institution. Cost analysis shows ~€490 for initial implementation and ~€55 monthly operation, representing 5-10 times lower costs than existing research systems (ORCATECH, SENDA). The development ensures the technical and economic feasibility of continuous home monitoring for behavioral analysis. Clinical validation of diagnostic capabilities through longitudinal studies with validated cognitive assessments remains a task for future work.

Article
Computer Science and Mathematics
Computer Science

Ji-Hye Oh, Hyun-Seok Park

Abstract: This study examines how different programming paradigms are associated with learning experiences and cognitive challenges as encountered by non-computer science novice learners. Using a case-study approach situated within specific instructional contexts, we integrate survey data from undergraduate students with large-scale public question-and-answer data from Stack Overflow to explore paradigm-related difficulty patterns. Four instructional contexts—C, Java, Python, and Prolog—were examined as pedagogical instantiations of imperative, object-oriented, functional-style, and logic-based paradigms using text clustering, word embedding models, and interaction-informed complexity metrics. The analysis identifies distinct patterns of learning challenges across paradigmatic contexts, including difficulties related to low-level memory management in C-based instruction, abstraction and design reasoning in object-oriented contexts, inference-driven reasoning in Prolog-based instruction, and recursion-related challenges in functional-style programming tasks. Survey responses exhibit tendencies that are broadly consistent with patterns observed in public Q&A data, supporting the use of large-scale community-generated content as a complementary source for learner-centered educational analysis. Based on these findings, the study discusses paradigm-aware instructional implications for programming education tailored to non-major learners within comparable educational settings. The results provide empirical support for differentiated instructional approaches and offer evidence-informed insights relevant to curriculum-oriented teaching and future research on adaptive learning systems.

Article
Computer Science and Mathematics
Computer Science

Huachang Su, Yekang Zhao, Wenrui Zhang, Hongling Zhang, Shitao Huang, Sheng Zhong, Xiaoyang Zhou

Abstract: Edge drones continuously collect sensitive information such as telemetry data during missions, making it difficult to apply centralized model training directly due to privacy protection, security compliance, and regulatory constraints. Although federated learning (FL) can avoid sharing raw data, existing federated learning schemes based solely on homomorphic encryption (HE) still face security risks in drone scenarios, such as gradient inversion, membership inference, and malicious update injection. To address this, we propose a secure and verifiable edge federated learning framework for parameter-efficient model adaptation in drone scenarios. The framework introduces homomorphic encryption for model updates on the device side to protect the privacy of updates before transmission and aggregation. Simultaneously, on the server side, decryption, aggregation, and verification are performed through a remotely attested Trusted Execution Environment (TEE), thereby limiting the server's access to plaintext updates and reducing the feasibility of gradient inversion and membership inference attacks at the system level. Furthermore, an aggregate signature mechanism is introduced to batch-verify the identity and update integrity of participating nodes, effectively preventing malicious or tampered updates from entering aggregation, thus overcoming the shortcomings of existing HE-FL schemes in terms of poisoning resistance and verifiability. Experimental results show that, while ensuring security and verifiability, the proposed method improves model accuracy by 3% compared with the baseline scheme, while maintaining better performance in terms of computation and communication overhead, thus verifying the practicality and deployability of the framework in resource-constrained UAV edge environments.
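The privacy goal of such a pipeline (the server learns only the aggregate, never any individual drone's update) can be illustrated with a minimal pairwise-masking secure aggregation sketch; this is a stand-in for intuition only, not the paper's HE + TEE construction.

```python
# Pairwise-masking secure aggregation sketch: each pair (i, j) of clients
# shares a random mask that cancels in the sum, so only the aggregate of
# the model updates is recoverable.
import random

def mask_updates(updates, seed=42):
    n = len(updates)
    rng = random.Random(seed)   # stands in for a pairwise shared secret
    masked = [list(u) for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(len(updates[0])):
                m = rng.uniform(-1, 1)
                masked[i][k] += m   # client i adds the shared mask
                masked[j][k] -= m   # client j subtracts it: cancels in sum
    return masked

# three drones' gradient updates (toy 2-parameter model)
updates = [[0.1, 0.2], [0.3, -0.1], [0.05, 0.4]]
masked = mask_updates(updates)
aggregate = [sum(col) for col in zip(*masked)]
print(aggregate)  # ≈ [0.45, 0.5], the true sum; individual rows are hidden
```

Each masked row looks random on its own, yet the column sums equal the true aggregate, which is the same guarantee the HE layer provides before the TEE decrypts.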

Article
Computer Science and Mathematics
Computer Science

Jaime Sayago-Heredia, Tatiana Landivar, Roberto Vásconez, Wilson Chango-Sailema

Abstract: This study develops a spatio-temporal forecasting artifact for road traffic accidents in Ecuador, addressing a critical limitation in existing predictive approaches that rely predominantly on point error metrics without validating the statistical assumptions underlying forecast uncertainty. Motivated by pronounced territorial heterogeneity in accident incidence and the need for reliable decision-support tools, the research proposes a multiregional modeling framework that integrates statistical residual validation to enhance the robustness of road safety planning. Using a dataset of 27,648 monthly observations covering all 24 provinces from 2014 to 2025, the study applies the Prophet model within a Design Science Research paradigm and a CRISP-DM implementation cycle. Separate provincial models are estimated with a 24-month forecasting horizon, and methodological rigor is ensured through systematic residual diagnostics using the Shapiro–Wilk test for normality and the Ljung–Box test for temporal independence. Empirical results indicate that the Prophet-based artifact outperforms a naïve seasonal benchmark in 70.8% of the provinces, demonstrating excellent predictive accuracy in structurally stable regions such as Tungurahua (MAPE = 10.9%). At the same time, the framework enables the identification of critical emerging risks in provinces such as Santo Domingo and Cotopaxi, where projected increases exceed 49% despite acceptable point forecasts. The findings confirm that point accuracy alone does not guarantee the validity of confidence intervals and that residual validation is essential for trustworthy uncertainty quantification. Overall, the proposed approach provides a robust foundation for a predictive surveillance system capable of supporting differentiated, evidence-based road safety policies in territorially heterogeneous contexts.
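The naïve seasonal benchmark and the MAPE criterion used to judge the provincial models take only a few lines to state; the monthly series below is synthetic for illustration, not Ecuadorian accident data.

```python
# Seasonal-naive benchmark and MAPE, the point-accuracy yardstick used to
# compare against the Prophet models.

def seasonal_naive_forecast(history, horizon, season=12):
    """Repeat the last observed season over the forecast horizon."""
    last_season = history[-season:]
    return [last_season[h % season] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

# synthetic monthly accident counts with pure yearly seasonality
history = [100 + 10 * (m % 12) for m in range(48)]
actual  = [100 + 10 * (m % 12) for m in range(48, 60)]
fc = seasonal_naive_forecast(history, horizon=12)
print(round(mape(actual, fc), 2))  # perfectly seasonal series -> 0.0
```

Beating this benchmark says nothing about interval calibration, which is why the study pairs it with Shapiro–Wilk and Ljung–Box residual diagnostics.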

Article
Computer Science and Mathematics
Computer Science

Steven Coleman, Daniel Wilson

Abstract: The paradigm shift toward cloud-based big data analytics has empowered organizations to derive actionable insights from massive datasets through scalable, on-demand computational resources. However, the migration of sensitive data to third-party cloud environments introduces profound privacy concerns, ranging from unauthorized data access to the risk of re-identification in multi-tenant architectures. This paper provides a comprehensive evaluation of current Privacy-Preserving Mechanisms (PPMs), systematically analyzing their efficacy in safeguarding data throughout its lifecycle—at rest, in transit, and during computation. The evaluation covers a broad spectrum of Privacy-Enhancing Technologies (PETs), including Differential Privacy (DP), Homomorphic Encryption (HE), Secure Multi-Party Computation (SMPC), and Trusted Execution Environments (TEEs). We examine the inherent trade-offs between data utility and privacy protection, specifically addressing the “utility-privacy” bottleneck where high levels of noise injection or encryption complexity often degrade the accuracy and performance of analytical models. Furthermore, the study explores the integration of Federated Learning as a decentralized approach to privacy, allowing for collaborative model training without the need for raw data movement. Critical challenges are identified, such as the scalability of cryptographic protocols in high-volume data streams and the regulatory pressures imposed by global standards like the GDPR and the EU AI Act. By synthesizing current industry practices with academic research, this paper highlights the gap between theoretical privacy models and their practical implementation in production-grade cloud infrastructures. The discourse concludes with a strategic roadmap for future research, emphasizing the need for Post-Quantum Cryptography (PQC) and automated privacy-orchestration frameworks.
This comprehensive review serves as a foundational reference for researchers and system architects aiming to design resilient, privacy-centric cloud analytical systems that maintain compliance without sacrificing computational efficiency.
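The utility-privacy bottleneck discussed above is easiest to see with the textbook Laplace mechanism for a counting query; this is a standard sketch of DP, not tied to any specific system in the review.

```python
# Laplace mechanism for a sensitivity-1 counting query: smaller epsilon
# means stronger privacy but noisier (less useful) answers.
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample Laplace(0, scale) via the inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float, seed: int = 0) -> float:
    """A counting query has sensitivity 1, so the noise scale is 1/epsilon."""
    rng = random.Random(seed)
    return true_count + laplace_noise(1.0 / epsilon, rng)

# weaker privacy (large epsilon) keeps the released count near the truth
print(dp_count(1000, epsilon=10.0))
```

The expected absolute error is 1/ε, so halving ε doubles the noise: that inverse relationship is the "utility-privacy" trade-off in its simplest form.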

Article
Computer Science and Mathematics
Computer Science

Selvaprasanth P

Abstract: This paper proposes an innovative vision-language model (VLM) driven predictive platform that synergistically integrates swarm robotics coordination with post-quantum digital signatures to enable fully autonomous navigation for green vessels, eco-friendly ships leveraging hybrid renewable propulsion systems such as biofuels, hydrogen fuel cells, and wind-assisted technologies. Traditional maritime navigation systems struggle with dynamic oceanic conditions, including unpredictable weather patterns, high-traffic congestion, and escalating cyber threats, which compromise supply chain efficiency and sustainability goals. The proposed framework addresses these challenges by deploying a multimodal VLM core that fuses real-time visual data from onboard LiDAR, infrared cameras, and radar with textual inputs from AIS (Automatic Identification System) broadcasts, satellite weather forecasts, and nautical charts. This fusion generates interpretable probabilistic predictions of future states, such as wave-induced trajectory deviations or collision risks, enabling proactive rerouting that minimizes hydrodynamic drag and emissions. Swarm robotics augments individual vessel autonomy through decentralized fleets of unmanned surface vehicles (USVs) that dynamically form protective convoys or scouting formations, optimizing collective energy use via bio-inspired particle swarm optimization conditioned on VLM outputs. To safeguard against quantum computing vulnerabilities inherent in classical RSA or ECC protocols, the platform embeds lightweight Dilithium or Falcon post-quantum signatures for authenticating sensor streams, cargo manifests, and inter-vessel commands, ensuring non-repudiation even under harvest-now-decrypt-later attacks.
Validation occurs within high-fidelity maritime digital twins that simulate full-scale operations, incorporating computational fluid dynamics for vessel hydrodynamics and stochastic perturbations for resilience testing. Extensive simulations demonstrate transformative performance: fuel savings exceed 38% via predictive eco-routing, collision avoidance precision reaches 97.2% in dense fog scenarios, and cryptographic overhead remains below 5% bandwidth utilization on edge-constrained USVs. Supply chain resilience improves markedly, with recovery time from simulated disruptions reduced by 52%, fortifying global logistics against climate volatility and geopolitical risks. This work pioneers the end-to-end integration of VLMs, swarms, and quantum-safe primitives, laying a robust foundation for scalable, secure, and sustainable autonomous maritime ecosystems aligned with the UN Sustainable Development Goals.
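The bio-inspired particle swarm optimization that the USV fleet relies on can be sketched minimally; the quadratic "energy cost" objective and all hyperparameters here are toy stand-ins for the VLM-conditioned fleet objective.

```python
# Minimal particle swarm optimization (PSO) on a toy 2-D cost surface.
import random

def pso(objective, dim=2, n_particles=20, iters=100, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    gbest = min(pbest, key=objective)[:]        # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.4 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

# toy "collective energy cost" minimized at formation point (1, 2)
cost = lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2
best = pso(cost)
print([round(x, 3) for x in best])
```

The same cognitive/social update rule scales to real fleet objectives; only the objective function and the communication of `gbest` over the swarm links change.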

Review
Computer Science and Mathematics
Computer Science

Shubham Singh

Abstract: The rapid advancement of deep learning technologies has markedly influenced numerous sectors, particularly agriculture. This survey paper presents an exhaustive review of contemporary trends, challenges, and future perspectives in the application of deep learning to agricultural tasks. By meticulously analyzing 95 research papers published in 2020, this review categorizes studies based on application areas, deep learning methodologies, data sources, targeted crops, and utilized frameworks. The findings highlight the predominance of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), with Keras and TensorFlow emerging as the most frequently employed frameworks. Primary data sources include camera and satellite imagery. Key applications explored encompass plant disease detection, weather forecasting, crop yield prediction, and plant classification. Additionally, the paper underscores performance metrics and model accuracies, with disease detection models frequently surpassing 95% accuracy. Challenges such as data availability, model generalization, and computational costs are critically examined, alongside potential future directions for integrating emerging technologies to enhance agricultural productivity and sustainability. This survey aims to provide researchers and practitioners with a comprehensive understanding of the landscape of deep learning applications in agriculture, highlighting areas ripe for future research.

Article
Computer Science and Mathematics
Computer Science

Ran Zhang, Yongchao Shen, Qianru Wu

Abstract: To address insufficient privacy protection and limited sharing of industrial Internet security situation data, a situation element extraction model integrating federated learning and deep learning is proposed. The model combines deep residual networks, bidirectional long short-term memory networks, and a Transformer architecture, extracting features from network security situation data across multiple dimensions, including local features, temporal characteristics, and global correlations. Under the federated learning architecture, each participant performs data processing and model updates locally, transmitting model parameters through security mechanisms to reduce unnecessary data sharing and flow. The experimental results show that this method further improves situation element extraction performance while protecting data privacy.

Article
Computer Science and Mathematics
Computer Science

Abel Yeboah-ofori, Awo Aidam Amenyah

Abstract: Background: Child sexual exploitation and abuse have long been a global phenomenon. However, with increasing dependency on digital transformation, mobile devices, and the internet, the emphasis has shifted to child online sexual exploitation and abuse (COSEA), leading to an exponential growth of perpetrators. A 2020 report indicated a 200% increase in child sex abuse forums that are linked to the internet. Existing literature has emphasized child protection challenges, online attacks, and using surveys and questionnaires to gather and draw inferences regarding grooming tactics and thematic analysis. Social issues, such as the lack of reporting platforms, limited sharing of threat information, cyber awareness, and social engagement and support, pose serious challenges for children, parents, and law enforcement. Several papers exist that have used the term Online Child Sexual Exploitation and Abuse (OCSEA). However, our paper considers Child Online Sexual Exploitation and Abuse (COSEA) as we explore and look at it from the challenges of a child going online and accessing the internet. Methods: The paper explores COSEA challenges and examines how perpetrators deploy MITRE Tactics, Techniques, and Procedures (TTPs) against victims to understand attack motives and establish potential attributions for cyber threat intelligence gathering and cyber profiling. The paper acknowledges existing research by considering the changing threat landscape and the evolving attack surface. It aims to contribute to the body of knowledge on adversarial TTPs and current trends, and to understand the threat actor’s mindset and motives. Results: The results demonstrate that analyzing TTPs facilitates the establishment of attributions and the determination of the adversary’s intents, motives, opportunities, and methods. The novelty contributions of this research are threefold.
First, we explore existing challenges in online child abuse and exploitation by identifying and discussing what constitutes child abuse and exploitation, how COSEA manifests, and the attack methods used by perpetrators to exploit their victims. Second, we used the MITRE TTP and subjective judgment approach to identify the TTPs and determine how these factors make the child complicit. Finally, we discuss the strategies required to address the challenges and the stakeholder role in mitigating COSEA. Conclusion: The paper has considered TTPs from a technical perspective to understand the perpetrator's motives. The paper considers factors that could influence the victim, such as money, societal norms, and deterrence, including education, laws, regulations, and recommendations for threat information-sharing platforms and collaborations among stakeholders.

Article
Computer Science and Mathematics
Computer Science

Rahul Sharma, Steven Coleman

Abstract: The digital transformation of university administrative services is essential for managing the complexities of modern campus logistics. Student accommodation management, in particular, requires a robust infrastructure to handle high volumes of sensitive data, financial transactions, and real-time resource allocation. This research details the Design and Development of a Cloud-Based Student Accommodation Management Application, a solution designed to replace inefficient legacy systems with a scalable Software-as-a-Service (SaaS) model. The platform utilizes cloud-native architecture to provide a seamless interface for students while offering powerful automation tools for administrators. Evaluation of the system focuses on performance metrics and user-centric validation. Results demonstrate that the integration of cloud computing significantly improves operational transparency and reduces the latency associated with manual housing assignments. This study provides a technical framework for deploying enterprise-level cloud applications within the higher education sector.

Review
Computer Science and Mathematics
Computer Science

Saravana Srinivasan, Pedro Paiva, Aditi Dharmadhikari, Lyall Sathishkumar, Christian Nwobu, Ningyue Mao, Guilherme Hollweg, Xuan Zhou, Xiao Zhang

Abstract: As demand for EVs (Electric Vehicles), WSNs (Wireless Sensor Networks), and IoT (Internet of Things) devices continues to grow, efficient battery health monitoring has emerged as a critical requirement. Conventional BMS (Battery Management System) designs rely on wired, centralized architectures, which are not only costly and less scalable but also highly prone to operational failures. To mitigate these inherent drawbacks, recent studies have shifted toward exploring wireless, low-power, and contactless alternatives. This paper reviews emerging sensing solutions and machine learning techniques for battery state and health estimation. It also examines WBMS (Wireless Battery Management System) advancements from theoretical frameworks to prototypes, covering health monitoring, cycle/discharge tracking, thermal management, and second-life reuse. Additionally, we discuss integrating techniques such as EIS (electrochemical impedance spectroscopy) and ultrasonic sensing with IoT systems and advanced machine learning models. Furthermore, it explores innovative diagnostic approaches and highlights algorithmic frameworks for real-time diagnostics. Overall, this work provides a comprehensive view of intelligent, wireless battery monitoring technologies and identifies key challenges and research opportunities for scalable deployment in cyber-physical systems.

Article
Computer Science and Mathematics
Computer Science

R Karthick

Abstract: The adoption of static and dynamic code analysis techniques within modern software development environments is critical for early vulnerability detection and comprehensive quality assurance. Static code analysis scrutinizes source code without execution to uncover potential defects, security vulnerabilities, and coding standard violations early in the lifecycle. Dynamic code analysis complements this by examining the software's runtime behavior to identify issues such as memory leaks, race conditions, and interaction faults that only manifest during execution. The integration of both methodologies into automated security toolchains within continuous integration/continuous delivery (CI/CD) pipelines enables rapid feedback, efficient remediation, and elevated code quality. This combined approach fosters a culture of proactive security and accelerates the delivery of robust, secure software applications.

Article
Computer Science and Mathematics
Computer Science

Sayed Mahbub Hasan Amiri, Prasun Goswami, Chandan Kumar Barmmon, Md Mainul Islam, Mohammad Shakhawat Hossen, Mohammad Sohel Kabir, Marzana Mithila, Naznin Akter

Abstract: The limitations of classical computing in solving complex problems in cryptography, materials science, and optimization necessitate the development of a new computational paradigm based on the principles of quantum mechanics. This article aims to analyse the current state of quantum computing hardware, evaluate the primary challenges to achieving fault tolerance, and project a realistic timeline for its practical application. The methodology involves a systematic review and comparative analysis of publicly available empirical data from peer-reviewed literature and corporate technical roadmaps, employing a framework of key performance indicators such as coherence times, gate fidelities, and qubit counts to assess leading qubit modalities, including superconducting circuits, trapped ions, and photonic systems. The analysis confirms that while superconducting qubits currently lead in scalability, with demonstrations of quantum supremacy using 53-qubit processors, trapped ion platforms maintain a significant advantage in gate fidelity and coherence times. The central finding identifies decoherence and high error rates as the fundamental barriers, necessitating that current Noisy Intermediate-Scale Quantum (NISQ) devices rely on error mitigation techniques rather than robust quantum error correction. The comparative assessment concludes that no single qubit modality yet fulfils all DiVincenzo criteria for fault tolerance simultaneously. The path to scalable quantum computing is shown to depend on the successful implementation of topological error-correcting codes like the surface code, which currently requires thousands of physical qubits to create a single stable logical qubit. 
Projections based on current progress suggest that while demonstrations of quantum utility on specific problems are imminent, fully fault-tolerant quantum computers capable of breaking RSA encryption or revolutionizing drug discovery remain a long-term endeavour, likely requiring several more decades of intensive research and development. The practical value of this research lies in its synthesized technical overview, which provides a clear, evidence-based roadmap for researchers, engineers, and policymakers to navigate the technological hurdles and strategic investments required to realize the transformative potential of quantum computing.
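The "thousands of physical qubits per logical qubit" figure follows from standard surface-code scaling heuristics; the constants below (threshold, prefactor, patch size 2d²-1, target error rate) are common ballpark figures, not values from the reviewed roadmaps.

```python
# Rough surface-code overhead arithmetic: how large a code distance d is
# needed to push the logical error rate below a target, and how many
# physical qubits that costs.

def logical_error_rate(p_phys, d, p_th=1e-2, a=0.1):
    """Heuristic p_L ~= a * (p_phys / p_th)^((d+1)/2) for code distance d."""
    return a * (p_phys / p_th) ** ((d + 1) / 2)

def physical_qubits(d):
    """Data + measurement qubits for one surface-code patch: 2*d^2 - 1."""
    return 2 * d * d - 1

# find the smallest odd distance reaching the target logical error rate
p_phys, target = 1e-3, 5e-13
d = 3
while logical_error_rate(p_phys, d) > target:
    d += 2
print(d, physical_qubits(d))
```

With physical error rates an order of magnitude below threshold, a single stable logical qubit already consumes on the order of a thousand physical qubits, which is the arithmetic behind the fault-tolerance gap the review describes.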

Article
Computer Science and Mathematics
Computer Science

Sasikala M

Abstract: Third-party library use exposes software development to serious dependency risks and license violations, which secure framework integration must address. The proposed approach embeds security mechanisms into the framework to defend against vulnerabilities introduced by third-party components, and automates Open-Source Software (OSS) governance to monitor and enforce compliance and license obligations and to control legal and operational risks. By merging secure integration practices with automated governance, organizations can reduce both security and license compliance risks and maintain a secure software supply chain. This paper presents a framework for integrating third-party libraries that minimizes dependency-related security risks and prevents license violations through automated OSS governance. The approach embeds automated dependency validation, license compliance scanning, vulnerability assessment, and continuous monitoring within DevSecOps pipelines to enable proactive enforcement of organizationally defined policies. Tests in a controlled testbed show a 75% drop in known vulnerabilities over 3 months and over 95% license compliance across different projects. The framework adds moderate build-time overhead that is generally acceptable for CI. The study finds that automated governance tools help secure the software supply chain and maintain compliance without hindering development productivity. Future research will apply artificial intelligence to predict vulnerabilities and further automate license interpretation to strengthen the effectiveness of OSS governance.

Article
Computer Science and Mathematics
Computer Science

Ade Kurniawan

Abstract: Deep learning-based Human Activity Recognition (HAR) systems using multimodal wearable sensors are increasingly deployed in safety-critical applications including healthcare monitoring, elderly care, and security authentication. However, the vulnerability of these systems to adversarial attacks remains insufficiently understood, particularly for attacks that must evade detection while manipulating multiple sensor modalities simultaneously. This paper presents STAR-RL (Stealth-aware Targeted Adversarial attack via Reinforcement Learning), a novel framework that generates effective and stealthy adversarial examples against multimodal sensor-based HAR systems. STAR-RL introduces three key innovations: (1) a multi-strategy attack engine that adaptively selects among diverse perturbation algorithms based on real-time attack progress, (2) a sensor-aware stealth mechanism that concentrates perturbations on naturally noisy sensors to minimize detection likelihood, and (3) a reinforcement learning-based meta-controller that learns optimal attack policies through interaction with the target classifier. Comprehensive experiments on the MHEALTH dataset demonstrate that STAR-RL achieves 95.20% attack success rate, substantially outperforming baseline methods including FGSM (6.00%), PGD (88.60%), and C&W (69.00%). The stealth analysis confirms that 51.35% of perturbation energy is successfully directed to weak sensors (gyroscopes and magnetometers), validating the effectiveness of the sensor-aware allocation strategy. Our findings reveal critical security vulnerabilities in production HAR systems and provide insights for developing robust defense mechanisms against adaptive adversarial threats.
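The sensor-aware stealth mechanism described above can be illustrated by how a global perturbation budget might be split across sensor channels in proportion to each channel's natural noise level, so that perturbation energy concentrates in the noisiest sensors (gyroscopes, magnetometers) where it is hardest to detect. This is a minimal sketch of the allocation idea, not the STAR-RL implementation; the function name and proportional rule are assumptions.

```python
# Sketch: split a global L2-style perturbation budget across sensors in
# proportion to each sensor's measured noise standard deviation.
import numpy as np

def allocate_budget(noise_std, total_budget):
    """Return a per-sensor budget vector that sums to total_budget,
    weighted toward naturally noisy channels."""
    noise_std = np.asarray(noise_std, dtype=float)
    weights = noise_std / noise_std.sum()  # normalize to a distribution
    return weights * total_budget
```

Under this rule a gyroscope with five times the accelerometer's noise floor receives five times the perturbation budget, matching the paper's observation that over half the perturbation energy lands on the weak sensors.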

Article
Computer Science and Mathematics
Computer Science

Yongheng Li

,

Jing Wen

,

Shaoling Liang

,

Fanqi Kong

,

Baohua Huang

Abstract: Multi-Group Homomorphic Encryption (MGHE) is a pivotal advance in secure multi-party computation, integrating the merits of Multi-Party Homomorphic Encryption (MPHE) and Multi-Key Homomorphic Encryption (MKHE) to eliminate MPHE’s fixed-party limitation and mitigate MKHE’s ciphertext expansion under dynamic enrollment. However, the efficient single-key FINAL scheme cannot be extended to multi-party scenarios, because no valid multiplication is defined for vector NTRU ciphertexts; this hinders its use in multi-group bootstrapping and limits efficiency. To address this, additive secret sharing is adopted to convert vector NTRU ciphertext multiplication into multiplication over secret shares, enabling shared bootstrapping-key generation within groups. For the first time, a multi-group ciphertext bootstrapping algorithm based on LWE and NTRU is proposed. Bootstrapping tasks are decomposed for parallel processing, and a hybrid product algorithm is designed to aggregate subtask outputs, raising multi-group bootstrapping speed to match that of single-key ciphertexts. Noise accumulation is analyzed, with 100-bit and 128-bit security parameter sets selected for validation. Experiments show that 30- and 50-party multi-group bootstrapping takes only 1.87 and 2.58 seconds, respectively.
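The additive secret sharing underlying the construction above can be sketched in a few lines: a secret is split into n random shares modulo q such that the shares sum back to the secret, and no proper subset reveals anything about it. This is an illustrative primitive only; the modulus and function names are assumptions, and the paper's conversion of NTRU ciphertext multiplication into share multiplication builds substantially more machinery on top of it.

```python
# Sketch of additive secret sharing mod Q: n-1 shares are uniformly
# random, and the last share is chosen so all shares sum to the secret.
import secrets

Q = 2**32  # illustrative modulus, not the paper's parameter set

def share(secret, n):
    """Additively share `secret` among n parties modulo Q."""
    shares = [secrets.randbelow(Q) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    """Recombine shares; only the full set recovers the secret."""
    return sum(shares) % Q
```

Because the sharing is linear, sums of shares can be computed locally by each party; defining a valid *multiplication* over such shares is precisely the nontrivial step the paper addresses for vector NTRU ciphertexts.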



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated