Engineering

Review
Engineering
Telecommunications

Panagiotis K. Gkonis, Anastasios Giannopoulos, Nikolaos Nomikos, Lambros Sarakis, Vasileios Nikolakakis, Gerasimos Patsourakis, Panagiotis Trakadas

Abstract: The goal of the study presented in this work is to analyze recent advances in the context of the computing continuum and meta-operating systems (meta-OSs). The term continuum encompasses a variety of diverse hardware and computing elements as well as network protocols, ranging from lightweight Internet of Things (IoT) components to more complex edge or cloud servers. The rapid penetration of IoT technology into modern networks, along with its associated applications, poses new challenges for efficient application deployment over heterogeneous network infrastructures. These challenges involve, among others, the interconnection of a vast number of IoT devices and protocols, proper resource management, as well as threat protection and privacy preservation. Hence, unified access mechanisms, data management policies, and security protocols are required across the continuum to support the vision of seamless connectivity and diverse device integration. This task becomes even more important as discussions on sixth-generation (6G) networks, which are envisaged to coexist with IoT applications, are already taking place. Therefore, in this work the most significant technological approaches to satisfying the aforementioned challenges and requirements are presented and analyzed. A proposed architectural approach is also presented and discussed, which takes into consideration all key players and components in the continuum. In the same context, indicative use cases and scenarios that leverage a meta-OS in the computing continuum are discussed, as are open issues and related challenges.
Article
Engineering
Telecommunications

Galia Marinova, Edmond Hajrizi, Besnik Qehaja, Vassil Guliashki

Abstract: This study presents the development of a smart microgrid control framework. The goal is to achieve optimal energy management and maximize photovoltaic (PV) generation utilization through a combination of optimization and reinforcement learning techniques. A detailed Simulink model is developed in MATLAB to represent the dynamic behavior of the microgrid, including load variability, temperature profiles, and solar radiation. Initially, a genetic algorithm (GA) is used to perform static optimization and parameter tuning, identifying optimal battery charging/discharging schedules and balancing power flow between buildings in the microgrid to minimize dependency on the main grid. A Soft Actor-Critic (SAC) reinforcement learning agent is then trained to perform real-time maximum power point tracking (MPPT) for the PV system under varying environmental (weather) and load conditions. The SAC agent learns from eight simulated PV generation scenarios and demand profiles, optimizing the duty cycle of the DC-DC converter to adaptively maintain maximum energy yield. The combined GA-SAC approach is validated on a university campus microgrid consisting of four interconnected buildings with heterogeneous loads, including computer labs that generate both active and reactive power demands. The results show improved efficiency, reduced power losses, and greater energy autonomy of the microgrid, illustrating the potential of AI-driven control strategies for sustainable smart energy systems.
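
For readers who want to experiment with the MPPT piece of this pipeline, the following minimal sketch sets up a toy gym-style environment in which an agent nudges a boost converter's duty cycle and is rewarded with the harvested PV power. The PV curve, the converter relation, and every parameter value are illustrative assumptions, not the authors' Simulink model; an off-the-shelf SAC implementation (e.g., Stable-Baselines3) would replace the hand-rolled probe at the bottom.

```python
# Toy PV + DC-DC environment for duty-cycle MPPT (illustrative only).
import numpy as np

def pv_power(voltage, irradiance, temperature):
    """Toy PV curve: power peaks at an MPP that shifts with sun and temp."""
    v_mpp = 30.0 * (1 - 0.004 * (temperature - 25.0))  # MPP voltage drifts with temp
    i_sc = 8.0 * irradiance / 1000.0                   # short-circuit current vs. sun
    current = i_sc * max(0.0, 1 - (voltage / (1.25 * v_mpp)) ** 8)
    return voltage * current

class MpptEnv:
    """Gym-style interface: action = duty-cycle change, reward = power."""
    def __init__(self, v_bus=48.0):
        self.v_bus, self.duty = v_bus, 0.5

    def step(self, action, irradiance=800.0, temperature=30.0):
        self.duty = float(np.clip(self.duty + action, 0.05, 0.95))
        v_pv = self.v_bus * (1 - self.duty)   # boost converter: Vin = Vout * (1 - D)
        power = pv_power(v_pv, irradiance, temperature)
        return np.array([v_pv, power]), power  # (observation, reward)

env = MpptEnv()
for a in (0.05, 0.05, -0.02):                 # hand-rolled perturb-and-observe probe
    obs, reward = env.step(a)
    print(f"duty={env.duty:.2f}  P={reward:.1f} W")
```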
Article
Engineering
Telecommunications

Anastasia Daraseliya, Eduard Sopin, Julia Kolcheva, Vyacheslav Begishev, Konstantin Samouylov

Abstract: Modern 5G+-grade low-power wide-area network (LPWAN) technologies such as Narrowband Internet-of-Things (NB-IoT) utilize a multi-channel slotted ALOHA algorithm during the random access phase. As a result, the random access phase in such systems is characterized by relatively low throughput and is highly sensitive to traffic fluctuations that can push the system outside its stable operational regime. Although theoretical results specifying the optimal transmission probability that maximizes the successful preamble transmission probability have long been known, the lack of knowledge about the current offered traffic load at the base station (BS) makes maintaining optimal throughput a challenging task. In this paper, we propose and analyze a new reactive access barring scheme for NB-IoT systems based on machine learning (ML) techniques. Specifically, we first demonstrate that knowing the number of user equipments (UEs) experiencing a collision at the BS is sufficient to draw conclusions about the current offered traffic load. Then, we show that, using ML-based techniques, one can reliably differentiate between events on the Physical Random Access Channel (PRACH) at the BS side based only on the signal-to-noise ratio (SNR). Finally, we mathematically characterize the delay experienced under the proposed reactive access barring technique. Our numerical results show that modern ML approaches, such as the XGBoost classifier, can differentiate between PRACH events with an accuracy reaching 0.98 and then associate them with the number of UEs competing in the random access phase. Our simulation results show that the proposed approach keeps the successful preamble transmission probability approximately constant at 0.3 in overloaded conditions, whereas for conventional NB-IoT access this value is less than 0.05. The proposed scheme achieves near-optimal throughput in multi-channel ALOHA by using dynamic traffic awareness to adjust the non-unit transmission probability. This proactive congestion control keeps the delay controlled and bounded, preventing latency from growing without bound as the offered load approaches the system's capacity.
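
A minimal sketch of the classification step, assuming the xgboost library's scikit-learn interface and a synthetic SNR model (the three-class structure and the SNR distributions below are illustrative assumptions, not the paper's dataset):

```python
# SNR-based PRACH event classifier on synthetic data:
# class 0 = idle slot, 1 = single UE preamble, 2 = collision.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
labels = rng.integers(0, 3, n)
# Toy model: received SNR grows with the number of superimposed preambles.
snr_db = np.where(labels == 0, rng.normal(-5, 2, n),   # noise only
         np.where(labels == 1, rng.normal(5, 2, n),    # one preamble
                               rng.normal(9, 2, n)))   # colliding preambles
X = snr_db.reshape(-1, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = XGBClassifier(n_estimators=200, max_depth=3)
clf.fit(X_tr, y_tr)
print("accuracy:", (clf.predict(X_te) == y_te).mean())
```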
Article
Engineering
Telecommunications

Alessandro Vizzarri

Abstract: The Satellite Internet of Things (IoT) sector is undergoing rapid transformation, driven by breakthroughs in satellite communications and the pressing need for seamless global coverage, especially in remote and poorly connected regions. In locations where terrestrial infrastructure is limited or non-existent, Low Earth Orbit (LEO) satellites are proving to be a game-changing solution, delivering low-latency and high-throughput links well-suited for IoT deployments. While North America currently dominates the market in terms of revenue, the Asia-Pacific region is projected to lead in growth rate. Nevertheless, the development of satellite IoT networks still faces hurdles, including spectrum regulation and international policy alignment. In this evolving landscape, the LoRa and LoRaWAN protocols have been enhanced to support direct communication with LEO satellites, typically operating at altitudes between 500 km and 2,000 km. This paper offers a comprehensive review of current research on LoRa/LoRaWAN technologies integrated with LEO satellite systems, also providing a performance assessment of this combined architecture in terms of theoretical achievable bitrate, Bit Error Rate (BER), and path loss.
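
As a quick orientation to the path-loss numbers at stake, the snippet below evaluates the standard free-space path loss formula over the LEO altitude range quoted above; the 868 MHz carrier is an illustrative assumption, and a real link budget would add antenna gains, atmospheric effects, and slant-range geometry.

```python
# Free-space path loss for a direct-to-satellite LoRa link.
import math

def fspl_db(distance_m, freq_hz):
    """FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / 3e8))

for altitude_km in (500, 1000, 2000):   # LEO range quoted in the abstract
    print(f"{altitude_km:5d} km -> {fspl_db(altitude_km * 1e3, 868e6):.1f} dB")
```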
Article
Engineering
Telecommunications

Alex T. de Cock Buning, Ivan Vidal, Francisco Valera

Abstract: Connecting distributed applications across multiple cloud-native domains is growing in complexity. Applications have become containerized and fragmented across heterogeneous infrastructures such as public clouds, edge nodes, and private data centers, including emerging IoT-driven environments. Existing networking solutions like CNI plugins and service meshes have proven insufficient for providing isolated, low-latency, and secure multi-cluster communication. By combining SDN control with Kubernetes abstractions, we present L2S-CES, a Kubernetes-native solution for multi-cluster layer-2 network slicing that offers flexible, isolated connectivity for microservices while maintaining performance and automation. In this work, we detail the design and implementation of L2S-CES, outlining its architecture and operational workflow. We experimentally validate it against state-of-the-art alternatives and show superior isolation, reduced setup time, native support for broadcast and multicast, and minimal performance overhead. By addressing the current lack of native link-layer networking capabilities across multiple Kubernetes domains, L2S-CES provides a unified and practical foundation for deploying scalable, multi-tenant, and latency-sensitive cloud-native applications.
Article
Engineering
Telecommunications

Charalampos Papapavlou, Konstantinos Paximadis, Dan M. Marom, Ioannis Tomkos

Abstract: Emerging services such as artificial intelligence (AI), 5G, the Internet of Things (IoT), cloud data services, and teleworking are growing exponentially, pushing bandwidth needs to the limit. Space Division Multiplexing (SDM) in the spatial domain, along with Ultra-Wide Band (UWB) transmission in the spectral domain, represent two degrees of freedom that will play a crucial role in the evolution of backbone optical networks. SDM and UWB technologies necessitate the replacement of conventional Wavelength-Selective Switch (WSS)-based architectures with innovative optical switching elements capable of handling both higher port counts and flexible switching across various granularities. In this work, we introduce a novel Photonic Integrated Circuit (PIC)-based switching element called the flex-Waveband Selective Switch (WBSS), designed to provide flexible band switching across the UWB spectrum (~21 THz). The proposed flex-WBSS supports a hierarchical three-layered Multi-Granular Optical Node (MG-ON) architecture incorporating optical switching across granularities ranging from entire fibers, through flexibly defined bands, down to individual wavelengths. To evaluate its performance, we develop a custom network simulator, enabling a thorough analysis of all the node's critical performance metrics. Simulations are conducted over an existing network topology, evaluating three traffic-oriented switching policies: Full Fiber Switching (FFS), Waveband Switching (WBS), and Wavelength Switching (WS). Simulation results reveal high Optical Signal-to-Noise Ratio (OSNR) and low Bit Error Rate (BER) values, particularly under the FFS policy. The WBS policy, in turn, bridges the gap between existing WSS-based and future FFS-based architectures, relaxing bandwidth demands and mitigating capacity bottlenecks to enable rapid, scalable upgrades of existing infrastructures. Additionally, we propose a probabilistic framework to evaluate the node's bandwidth utilization and scaling behavior, exploring trade-offs among scalability, component count, and complexity. The proposed framework can be easily adapted for the design of future optical transport networks. Finally, we perform a SWaP-C (Size, Weight, Power and Cost) analysis to underscore the advantages of the proposed architecture. Our results confirm that the proposed node, incorporating PIC-based flex-WBSSs, is a cost- and power-efficient solution compared to its counterparts, as it reduces excessive initial network deployment/upgrade costs while minimizing power requirements.
Article
Engineering
Telecommunications

Burke Geceyatmaz, Fatma Tansu Hocanın

Abstract: This research proposes a hybrid routing protocol for Vehicular Ad-hoc Networks (VANETs), designed to address the performance trade-offs inherent in the purely reactive Ad hoc On-Demand Distance Vector (AODV) and proactive Optimized Link State Routing (OLSR) paradigms. The purpose of the research is to seamlessly integrate the strengths of AODV and OLSR into a single, context-aware framework. A key contribution is the co-design of a dynamic transmission power control mechanism that works in concert with the routing logic to adapt to fluctuating vehicle densities in real time, effectively mitigating the intermittent connectivity and high latency characteristic of large-scale Intelligent Transportation Systems (ITS). Rigorous evaluation under high-fidelity mobility scenarios (using NS-3 and SUMO with real-world traffic patterns) confirms the protocol's efficacy. The findings demonstrate substantial performance enhancements over established baseline protocols, consistently achieving a Packet Delivery Ratio (PDR) exceeding 90%, maintaining an end-to-end delay below the critical 40 ms threshold, and realizing per-node energy savings of up to 60%. The conclusion is that this work provides a validated foundation for a highly reliable and efficient communication fabric, significantly enhancing the dependability of mission-critical ITS services and promoting the development of scalable, sustainable next-generation transportation networks.
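
A toy sketch of the density-adaptive power idea: keep full power in sparse traffic to avoid disconnection, and scale it down as the neighbour count grows to cut interference and energy use. The thresholds and scaling law here are illustrative guesses, not the protocol's actual control rule.

```python
# Illustrative density-adaptive transmission power (not the paper's rule).
def tx_power_dbm(neighbour_count, p_min=5.0, p_max=23.0, target_degree=12):
    """Full power up to target_degree neighbours, then scale down with density."""
    if neighbour_count <= target_degree:
        return p_max
    fraction = target_degree / neighbour_count      # < 1 when dense
    return round(p_min + (p_max - p_min) * fraction, 1)

for n in (3, 12, 25, 60):
    print(f"{n:3d} neighbours -> {tx_power_dbm(n):5.1f} dBm")
```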
Article
Engineering
Telecommunications

Dipon Saha, Illani Mohd Nawi

Abstract: The booming number of electric vehicles (EVs) and autonomous vehicles is driving demand for 5G and connected-vehicle technologies. However, the design of compact, multi-band vehicular antennas supporting multiple communication standards is complex. Traditional experience-based and parameter-sweeping approaches to antenna optimization are often inefficient and limited in scalability, while machine learning-based methods require extensive datasets and are computationally intensive. This study proposes a microstrip planar Yagi antenna optimized for Sub-6 GHz 5G and C-V2X communication. To approach antenna optimization with lower computing cost and less data, a hybrid optimization strategy is presented that combines parametric analysis with curve-fitting-based data visualization. The proposed antenna exhibits reflection coefficients of −31.68 dB and −29.36 dB, with 700 MHz and 900 MHz bandwidths, at 3.5 GHz and 5.9 GHz, respectively. Moreover, it achieves a peak gain of 7.55 dB with a size of 0.44 × 0.64 λ², along with a peak efficiency of 90.1%. The antenna has been integrated and simulated in a Mini Cooper model to test its effectiveness for vehicular communication.
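
The curve-fitting step can be illustrated in a few lines: fit a smooth model to a handful of parametric-sweep samples, then invert it for the target frequency. The sweep data and the 1/L model below are fabricated for illustration, not taken from the paper's simulations.

```python
# Fit a resonance-vs-length model to sweep samples and invert for 3.5 GHz.
import numpy as np
from scipy.optimize import curve_fit

length_mm = np.array([20.0, 22.0, 24.0, 26.0, 28.0])   # swept element length
f_res_ghz = np.array([4.10, 3.82, 3.55, 3.33, 3.14])   # simulated resonance

def model(L, a, b):
    return a / L + b     # resonance roughly inversely proportional to length

(a, b), _ = curve_fit(model, length_mm, f_res_ghz)
target = 3.5
L_opt = a / (target - b)   # invert the fitted model for the target frequency
print(f"fitted f(L) = {a:.1f}/L + {b:.2f};  L for {target} GHz ~ {L_opt:.2f} mm")
```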
Review
Engineering
Telecommunications

Mohammed Zaid, Rosdiadee Nordin, Ibraheem Shayea

Abstract: The rapid integration of unmanned aerial vehicles (UAVs) into next-generation wireless systems demands seamless and reliable handover (HO) mechanisms to ensure continuous connectivity. However, frequent topology changes, high mobility, and dynamic channel variations make traditional HO schemes inadequate for UAV-assisted 6G networks. This paper presents a comprehensive review of existing HO optimization studies, emphasizing artificial intelligence (AI) and machine learning (ML) approaches as enablers of intelligent mobility management. The surveyed works are categorized into three main scenarios: non-UAV HOs, UAVs acting as aerial base stations, and UAVs operating as user equipment, each examined under both traditional rule-based and AI/ML-based paradigms. Comparative insights reveal that while conventional methods remain effective for static or low-mobility environments, AI- and ML-driven approaches significantly enhance adaptability, prediction accuracy, and overall network robustness. Emerging techniques such as deep reinforcement learning and federated learning (FL) demonstrate strong potential for proactive, scalable, and energy-efficient HO decisions in future 6G ecosystems. The paper concludes by outlining key open issues and identifying future directions toward hybrid, distributed, and context-aware learning frameworks for resilient UAV-enabled HO management.
Review
Engineering
Telecommunications

Shujat Ali, Asma Abu-Samah, Mohammed H. Alsharif, Rosdiadee Nordin, Nauman Saqib, Mohammed Sani Adam, Umawathy Techanamurthy, Manzareen Mustafa, Nor Fadzilah Abdullah

Abstract: The next generation of wireless communication, 6G, promises a leap beyond the advances of 5G, aiming not only to increase speed but also to redefine how people, machines, and environments interact. This paper examines the evolution from 5G Advanced to 6G through a detailed review of 3GPP Releases 15–20, outlining the progression from enhanced mobile broadband to intelligent services supporting holographic communication, remote tactile interaction, and immersive XR applications. Three foundational service pillars are identified in this evolution: immersive communication, everything connected, and high-precision positioning. These advances enable transformative use cases such as virtual surgery, cooperative drone swarms, and AI-driven agriculture, demanding innovations in spectrum utilization (including sub-THz bands), AI-native network architectures, and energy-efficient device ecosystems. Future networks are expected to deliver peak data rates up to 1 Tbps, localization accuracy below 10 cm, and device densities reaching 10 million devices per km², while sustaining end-to-end latency under 1 ms. Across Releases 15–20, 3GPP has progressively standardized capabilities for XR, positioning, scheduling, and sustainability, while initiatives such as RedCap, Ambient IoT, and NTN extend connectivity toward global, low-power, and cost-effective coverage. Supported by programs like Hexa-X and the Next G Alliance, 6G is positioned as a fundamental redesign of wireless communication centered on intelligence, adaptability, inclusivity, and sustainability.
Article
Engineering
Telecommunications

Spyridon Louvros, AnupKumar Pandey, Brijesh Shah, Yashesh Buch

Abstract: This paper introduces a novel methodology that integrates 6G wireless Federated Edge Learning (FEEL) frameworks with optimization strategies spanning the MAC layer down to the physical layer. In the context of mobile edge computing, ensuring robust channel estimation within the 6G network slicing paradigm presents critical challenges, particularly in managing data retransmissions. Inaccurate updates from distributed 6G devices can undermine the reliability of Federated Learning (FL), affecting its overall performance. To address this, we propose an AI/ML-assisted algorithm for global optimization in FL-based 6G networks, in which the decision-making process leverages radial basis functions (RBFs) to assess options based on learned preferences rather than relying on direct evaluations of the objective function.
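
A minimal sketch of the RBF-surrogate idea, using SciPy's RBFInterpolator on a stand-in objective: candidate options are ranked with a radial-basis-function model fitted to previously observed outcomes instead of evaluating the true objective each time. The objective, sample points, and candidate set are illustrative; the paper's FEEL-specific optimization is not reproduced.

```python
# Rank candidates cheaply with an RBF surrogate of an "expensive" objective.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
observed_x = rng.uniform(-2, 2, size=(15, 2))                  # past evaluations
observed_f = np.sin(observed_x[:, 0]) + observed_x[:, 1] ** 2  # stand-in objective

surrogate = RBFInterpolator(observed_x, observed_f)

candidates = rng.uniform(-2, 2, size=(200, 2))   # options to score cheaply
scores = surrogate(candidates)
best = candidates[np.argmin(scores)]
print("surrogate's pick:", best, "predicted value:", scores.min())
```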
Review
Engineering
Telecommunications

Evelio Astaiza Hoyos, Héctor Fabio Bermúdez-Orozco, Jorge Alejandro Aldana-Gutierrez

Abstract: The advent of sixth-generation (6G) communications envisions a paradigm of ubiquitous intelligence and seamless physical–digital fusion, demanding unprecedented performance from the optical transport infrastructure. Achieving terabit-per-second capacities, microsecond latency, and nanosecond synchronisation precision requires a convergent, flexible, open, and AI-native x-Haul architecture that integrates communication with distributed edge computing. This study conducts a systematic literature review of recent advances, challenges, and enabling optical technologies for intelligent and autonomous 6G networks. Using the PRISMA methodology, it analyses sources from IEEE, ACM, and major international conferences, complemented by standards from ITU-T, 3GPP, and O-RAN. The review examines key optical domains including Coherent PON (CPON), Spatial Division Multiplexing (SDM), Hollow-Core Fibre (HCF), Free-Space Optics (FSO), Photonic Integrated Circuits (PICs), and reconfigurable optical switching, together with intelligent management driven by SDN, NFV, and AI/ML. The findings reveal that achieving 6G transport targets will require synergistic integration of multiple optical technologies, AI-based orchestration, and nanosecond-level synchronisation through Precision Time Protocol (PTP) over fibre. However, challenges persist regarding scalability, cost, energy efficiency, and global standardisation. Overcoming these barriers will demand strategic R&D investment, open and programmable architectures, early AI-native integration, and sustainability-oriented network design to make optical fibre a key enabler of the intelligent and autonomous 6G ecosystem.
Article
Engineering
Telecommunications

Steven O. Awino, Bakhe Nleya

Abstract: The indoor low-voltage power line network is characterized by highly irregular interference, where background noise coexists with bursty impulsive noise originating from household appliances and switching events. Traditional noise models, which are essentially monofractal, often fail to reproduce the clustering, intermittency, and long-range dependence seen in measurement data. In this paper, a Multifractal Random Walk (MRW) framework tailored to Power Line Communication (PLC) noise modelling is developed. The MRW is a continuous-time limit of discrete-time random walks with stochastic log-normal variance. The formulated MRW framework thus introduces a stochastic volatility component that modulates Gaussian increments, generating heavy-tailed statistics and multifractal scaling laws consistent with measured PLC noise data. Empirical validation is performed through structure-function analysis and the covariance of log-amplitudes, both of which reveal scaling characteristics that align well with MRW-simulated predictions. The proposed model captures both the bursty nature and the correlation structure of impulsive PLC noise more effectively than conventional monofractal approaches, thereby providing a mathematically grounded framework for accurate noise generation and robust system-level performance evaluation of PLC networks.
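
A minimal sketch of the MRW construction described above: Gaussian increments modulated by a correlated log-normal volatility, yielding heavy-tailed, bursty samples. The volatility correlation here is built with a simple moving-average kernel; the paper's exact log-covariance calibration to measured PLC noise is not reproduced.

```python
# Gaussian increments modulated by correlated log-normal volatility.
import numpy as np

rng = np.random.default_rng(7)
n, lam2, corr_len = 4096, 0.05, 64

gauss = rng.normal(size=n)                        # i.i.d. Gaussian increments
# Correlated log-volatility omega: white noise smoothed over corr_len samples,
# rescaled so Var(omega) = lam2 (the intermittency parameter).
kernel = np.ones(corr_len) / np.sqrt(corr_len)
omega = np.convolve(rng.normal(size=n + corr_len), kernel, mode="valid")[:n]
omega = np.sqrt(lam2) * (omega - omega.mean()) / omega.std()

mrw_noise = gauss * np.exp(omega)                 # stochastic-volatility modulation
kurt = ((mrw_noise - mrw_noise.mean()) ** 4).mean() / mrw_noise.var() ** 2
print("kurtosis:", kurt)                          # > 3 indicates heavy tails
```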
Article
Engineering
Telecommunications

Dabao Wang, Yanhong Xu, Zhangbo Gao, Hanqing Ding, Shitong Zhu, Zhao Li

Abstract: To address the challenges posed by the increased power consumption of traditional active relays and the difficulty of countering channel fading with Intelligent Reflecting Surfaces (IRS), we propose a dual-mode relay (DMR) capable of dynamically switching between two operational modes: active relaying and passive IRS reflection. The DMR allows its units (DMRUs) to select their operational modes dynamically based on channel conditions, enabling the transmission of composite-mode signals consisting of both active relaying components and IRS-reflected components, thereby enhancing adaptation to the wireless environment. Furthermore, under the constraint of limited transmit power, we introduce a DMR-based Adaptive Transmission (DMRAT) method. This approach explores all possible DMR operational modes and, in each mode, employs the Alternating Optimization (AO) algorithm to jointly optimize the beamforming matrices of both the transmitter and the DMR, along with the reflection coefficient matrix of the IRS, thereby maximizing the data transmission rate for the target communication pair. The optimal DMR mode can then be determined from the optimized data rates across the various operational modes. Simulation results demonstrate that the proposed method significantly enhances the data transmission rate for the target communication pair.
Article
Engineering
Telecommunications

Hyun Jun Kim, Meong Hun Lee

Abstract: Efficient, high-frequency collection of environmental data is crucial for real-time monitoring and automated management in smart livestock settings. However, the existing RS485-based polling method has limitations. Therefore, in this study, a parallel Message Queuing Telemetry Transport (MQTT)-based communication structure was designed for multi-sensor deployments. The proposed system enables distributed sensor nodes to publish messages to the broker in parallel and uses a topic-based hierarchical structure to ensure scalability and processing efficiency. The structure's performance was experimentally tested in three scenarios: varying the node count, varying the transmission interval, and comparing communication methods. The proposed MQTT structure outperformed RS485 in mean latency (98 ms vs. 178 ms), message reception success rate (99.2% vs. 94.5%), processing rate (1.33 msg/s vs. 0.67 msg/s), and communication error frequency (1.3 vs. 5.7 errors per 1000 trials). In addition, the topic-based message structure was shown to enable dynamic scalability with diverse environmental sensors. This study empirically demonstrated the utility and stability of an MQTT-based communication structure in a smart livestock setting that requires high-frequency data collection. These findings could serve as foundational technology for intelligent, automated livestock management systems.
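
A minimal sketch of such a topic-based hierarchy, assuming the paho-mqtt client library (v2 callback API); the barn/&lt;id&gt;/&lt;sensor&gt; topic scheme and broker address are illustrative assumptions, not the paper's deployment.

```python
# Topic-hierarchy MQTT consumer/producer sketch with paho-mqtt.
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # e.g. topic "barn/3/temperature", payload b"24.6"
    _, barn_id, sensor = msg.topic.split("/")
    print(f"barn {barn_id} {sensor}: {msg.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("broker.example.local", 1883)   # placeholder broker address

# One wildcard subscription covers every node under the hierarchy, so adding
# sensors or barns needs no change on the consumer side.
client.subscribe("barn/+/+", qos=1)
client.loop_start()

# A sensor node publishes on its own schedule, without being polled:
client.publish("barn/3/temperature", "24.6", qos=1)
```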
Review
Engineering
Telecommunications

Idris Balarabe Aliyu, Caroline Omonatse Alenoghena, Suleiman Zubair, Nathaniel Salawu

Abstract: The implementation of the Internet of Medical Things (IoMT) in telemedicine necessitates the deployment of various medical devices, including wearable and implantable sensors, in a network to collect diverse categories of medical data, resulting in the accumulation of large data volumes. The interoperability of these IoMT devices and networks determines the extent to which they can exchange and interpret medical data. The lack of standardised communication protocols for sharing medical data between individual devices and networks remains a serious challenge that must be addressed to unlock the potential of healthcare information sharing using IoMT. This article presents an exploration of practicable IoMT communication protocols to identify possible means of enabling cross-platform interoperability between devices and networks. A taxonomy of IoMT-based telemedicine communication protocols, with use-case examples drawn from a large body of existing literature, is examined. In the absence of standard communication protocols, the use of open APIs, gateways, and microservices for effective cross-platform medical data sharing is common. Standardisation of IoMT communication protocols is possible by adopting the most effective open-source communication protocol, such as the Message Queuing Telemetry Transport (MQTT) protocol.
Article
Engineering
Telecommunications

Md Nurul Absar Siddiky, Muhammad Enayetur Rahman

Abstract: This study addresses beam prediction in Terahertz (THz) Unmanned Aerial Vehicle (UAV) communication, a critical component of sixth-generation (6G) wireless networks. While the THz spectrum enables high-speed data transmission, UAV mobility introduces challenges such as signal impairments and blockages. To overcome these challenges, a deep learning model based on self-attention networks is proposed, integrating Reconfigurable Intelligent Surfaces (RIS) to enhance coverage and adapt to channel dynamics. The approach predicts optimal beams for UAVs based on sequential data, including beam trajectories, position, and Line of Sight (LoS) information. Compared to Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) baselines, the model demonstrates superior prediction accuracy, as validated by metrics such as Root-Mean-Square Error (RMSE) and Mean Absolute Error (MAE).
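
A minimal sketch of a self-attention sequence model for this task, written with PyTorch's built-in encoder layer; the feature set, dimensions, and beam codebook size are illustrative assumptions, not the paper's trained network.

```python
# Self-attention over a short UAV track, scoring the next beam index.
import torch
import torch.nn as nn

class BeamPredictor(nn.Module):
    def __init__(self, feat_dim=5, d_model=32, n_beams=64):
        super().__init__()
        self.embed = nn.Linear(feat_dim, d_model)
        self.attn = nn.TransformerEncoderLayer(d_model, nhead=4,
                                               batch_first=True)
        self.head = nn.Linear(d_model, n_beams)    # logits over candidate beams

    def forward(self, seq):                        # seq: (batch, steps, feat_dim)
        h = self.attn(self.embed(seq))             # attend over the history
        return self.head(h[:, -1])                 # predict from the last step

model = BeamPredictor()
history = torch.randn(8, 10, 5)   # batch of 10-step (position, beam, LoS) tracks
print(model(history).shape)       # torch.Size([8, 64])
```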
Article
Engineering
Telecommunications

Danaisy Prado-Alvarez, Daniel Calabuig, Saúl Inca, Jose F. Monserrat

Abstract: Integrated sensing and communication (ISAC) is envisioned as a foundational technology for future wireless networks, enabling simultaneous wireless communication and environmental sensing using shared resources. A key challenge in ISAC systems lies in managing the trade-off between communication data rate and sensing accuracy, especially in multi-user scenarios. In this work, we investigate the joint design of transmit signal covariance matrices to optimize the sum data rate while ensuring a given level of sensing performance. Specifically, we formulate a constrained optimization problem in which the transmit covariance matrix is designed to maximize the communication sum-rate under sensing-related constraints. These constraints condition the design of the transmit signal's covariance matrix, impacting both the sensing channel estimation error and the sum data rate. Our proposed method leverages convex optimization tools to achieve a principled balance between communication and sensing. Numerical results demonstrate that the proposed approach effectively manages the ISAC trade-off, achieving near-optimal communication performance while satisfying sensing requirements.
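
A toy instance of this kind of constrained covariance design, assuming cvxpy and random placeholder matrices (the channel, sensing constraint, and thresholds are illustrative, not the paper's formulation): maximize a log-det rate surrogate under a transmit-power budget and a linear sensing-energy constraint.

```python
# Transmit covariance design: rate max. under power and sensing constraints.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n_tx, P_max, gamma = 4, 1.0, 0.3
H = rng.normal(size=(2, n_tx))                  # comm channel (2 RX antennas)
a = rng.normal(size=(n_tx, 1))
A = a @ a.T                                     # sensing steering outer product

R = cp.Variable((n_tx, n_tx), PSD=True)         # transmit covariance
rate = cp.log_det(np.eye(2) + H @ R @ H.T)      # achievable-rate surrogate
prob = cp.Problem(cp.Maximize(rate),
                  [cp.trace(R) <= P_max,        # power budget
                   cp.trace(A @ R) >= gamma])   # minimum energy toward target
prob.solve()
print("rate:", prob.value, "power used:", np.trace(R.value))
```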
Article
Engineering
Telecommunications

Roman Kubacki, Rafał Przesmycki, Marek Bugaj, Dariusz Laskowski

Abstract: With the escalating electromagnetic environment, growing attention has been paid to negative electromagnetic effects on unmanned aerial vehicles (UAVs). This paper presents an investigation of the potential electromagnetic interference that UAVs may encounter during flight. Electromagnetic compatibility (EMC) defines the requirements necessary for the correct operation of UAV electronics. The immunity threshold of electric field strength according to the existing EMC standard for typical commercial drones is established as 10 V/m. On the other hand, European standards for the protection of people allow electromagnetic fields emitted by mobile base station antennas to reach levels of up to 61 V/m. The vulnerability of drones in this electromagnetic environment was investigated in an anechoic chamber, and it was determined that at electric field strengths of 30 V/m or higher, generally available drones suffer an uncontrolled fall. The UAV's cables are the main path coupling energy from the electromagnetic field: the induced additional current perturbs the current in the drone arms, and consequently the propellers cannot guarantee stable flight. Considering the maximum possible base station antenna effective radiated power of 40 dBW per sector, a safe distance of R = 20 m from the antenna was specified. As a recommendation, this safe distance around base station antennas should be respected by drone users.
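
The 20 m figure can be sanity-checked with the standard far-field estimate E = sqrt(30·P)/d, treating the 40 dBW effective radiated power as an isotropic EIRP for simplicity (a rough assumption, not the paper's full measurement setup).

```python
# Distance at which the field from a 40 dBW sector drops to 30 V/m.
import math

p_dbw = 40.0
p_watts = 10 ** (p_dbw / 10)            # 40 dBW = 10 kW
e_threshold = 30.0                      # V/m, observed uncontrolled-fall level

d_safe = math.sqrt(30 * p_watts) / e_threshold
print(f"distance where E drops to {e_threshold} V/m: {d_safe:.1f} m")
# ~18 m, consistent with the recommended R = 20 m safety margin.
```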
Article
Engineering
Telecommunications

Lucio Hernando-Cánovas, Alejandro S. Martínez-Sala, Juan C. Sánchez-Aarnoutse, Juan J. Alcaraz

Abstract: Accurately estimating indoor occupancy is essential for managing building spaces and infrastructure, with applications ranging from ensuring safe distancing and adequate ventilation during health crises to optimizing energy use and resource allocation. However, no existing technology simultaneously provides accurate, low-cost, and privacy-preserving indoor occupancy measurement. This work explores the use of existing WiFi infrastructure as a non-intrusive sensing system, where access points act as soft sensors by passively collecting anonymized connection metadata as proxies for human presence. We validated the approach in a university library over eight months, training supervised machine learning regression models on WiFi data and comparing predictions against computer-vision ground truth. The best-performing models (SVR, Ridge, and MLP) consistently achieved R² ≈ 0.95, with mean absolute errors of about 8 persons and relative errors (SMAPE) below 10% at medium-to-high occupancies. Tree-based ensembles, particularly XGBoost, exhibited weaker generalization at extreme capacity ranges, likely due to data sparsity and sensitivity to hyperparameters. Importantly, no temporal degradation was observed across the 8-month horizon, confirming the long-term stability of the method. Overall, the results demonstrate that WiFi-based occupancy estimation can provide a robust, low-cost, and privacy-preserving solution for real-world deployments.
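
A minimal sketch of the supervised pipeline, assuming scikit-learn and a synthetic stand-in for the WiFi metadata and camera ground truth (which are not public): regress occupancy counts from per-interval connection features.

```python
# Occupancy regression from WiFi connection-metadata features (synthetic).
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, r2_score

rng = np.random.default_rng(3)
n = 2000
assoc = rng.poisson(60, n)                      # associated devices per interval
probes = assoc * rng.uniform(1.5, 2.5, n)       # probe-request volume
occupancy = 0.7 * assoc + rng.normal(0, 5, n)   # ~0.7 persons per device (toy)

X = np.column_stack([assoc, probes])
X_tr, X_te, y_tr, y_te = train_test_split(X, occupancy, random_state=0)

for name, model in [("SVR", SVR(C=100)), ("Ridge", Ridge())]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: R2={r2_score(y_te, pred):.3f}  "
          f"MAE={mean_absolute_error(y_te, pred):.1f} persons")
```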
