
OTA Update Network Delay Modeling and Adaptive Compression Transmission Optimization

Submitted: 15 January 2026
Posted: 16 January 2026


Abstract
Over-the-air (OTA) updates often face unstable delay and limited bandwidth, which lower data transfer speed and reliability. This study developed an adaptive OTA transmission method that combines a Bayesian delay prediction model with Brotli–LZMA compression. The model estimates short-term delay changes and adjusts the compression level according to network conditions. Tests were conducted over simulated satellite and IoT links with bandwidths between 0.5 and 10 Mbps. The results showed that packet loss dropped by 41%, the transfer rate increased by 29%, and compression time accounted for 3.8% of the total process. The prediction model reached a root mean square error (RMSE) of 18 ms, indicating good accuracy in delay estimation. These results show that combining delay prediction with adaptive compression can make OTA transmission faster and more stable in low-bandwidth networks. The method can be applied in satellite, IoT, and remote monitoring systems that require reliable OTA data delivery.

1. Introduction

Over-the-air (OTA) updates deployed over hybrid and low-bandwidth networks face increasing challenges as update frequency and data volume continue to grow [1,2]. In scenarios such as satellite communication, remote IoT infrastructure and mobile or vehicular systems, OTA transmissions must operate under constrained bandwidth, variable latency and non-negligible packet loss [3]. Recent architectural studies have emphasized that next-generation OTA systems should support cross-domain deployment and adaptive transmission behavior to remain effective under heterogeneous and unstable network conditions [4]. However, achieving efficient data transfer in such environments remains difficult when network dynamics cannot be accurately anticipated.
Two factors have a particularly strong impact on OTA transmission performance in constrained networks: the accuracy of network delay prediction and the efficiency of data compression strategies. Many recent studies on OTA delivery for vehicles, satellites, and IoT devices have focused primarily on reliability, fault tolerance, and data security [5,6]. While these aspects are essential, comparatively less attention has been paid to latency-aware transmission control and adaptive compression under fluctuating bandwidth conditions. Industry and academic reports indicate that fixed-rate compression schemes and static scheduling policies often lead to inefficient bandwidth utilization and prolonged transfer times when network conditions change rapidly [7]. Accurate prediction of network delay is a prerequisite for adaptive OTA transmission. Bayesian network models have gained attention for delay prediction because they can explicitly represent uncertainty and conditional dependencies among network variables [8,9]. Such models are well suited for environments with nonlinear behavior, incomplete observations, or noisy measurements [10,11]. Nevertheless, existing research has largely focused on general-purpose network monitoring or traffic analysis. The application of Bayesian inference to packet-level delay prediction tailored to OTA transmission over hybrid or satellite links remains limited, and its integration with transmission control mechanisms has not been sufficiently explored [13].
In parallel, lossless data compression plays a critical role in improving OTA efficiency under limited bandwidth. Modern compression algorithms such as Brotli, LZMA, and Zstd have demonstrated superior compression ratios and stable performance compared with traditional DEFLATE-based methods [14]. However, most prior studies evaluate these algorithms using fixed compression parameters and offline benchmarks. In real-world narrow-band OTA scenarios, static compression settings may either impose excessive computational overhead or fail to fully utilize available bandwidth. Although some comparative analyses suggest that adaptive selection of compression levels could improve throughput [15], these approaches are rarely evaluated together with real-time delay prediction. Another limitation of existing OTA transmission studies lies in experimental scope and evaluation metrics. Many experiments are conducted in single-cloud environments or controlled laboratory networks, limiting their applicability to hybrid or satellite-based deployments [16]. Moreover, most evaluations report aggregate throughput or average latency, without examining the prediction error between estimated and actual delay values. Such error analysis is critical for closed-loop adaptive systems, where inaccurate delay prediction can lead to suboptimal compression decisions and reduced performance. As a result, there remains a lack of integrated frameworks that jointly address delay forecasting accuracy and adaptive compression for OTA transmission in unstable, low-bandwidth networks [17,18].
In this study, we propose an OTA transmission framework that combines probabilistic delay prediction with adaptive lossless compression to improve performance in hybrid and low-bandwidth networks. A Bayesian-based delay prediction model is developed to estimate short-term latency variations, and its output is used to dynamically adjust compression strategies based on network conditions and available processing capacity. The proposed framework integrates Brotli and LZMA compression and selects compression levels in response to predicted delay trends. Experimental evaluations are conducted over bandwidths ranging from 0.5 to 10 Mbps under simulated packet loss and variable latency. The results show that the proposed method reduces average packet loss by 41%, increases effective transmission rate by 29%, and limits compression overhead to within 3.8% of total transfer time. The root mean square error of delay prediction is 18 ms, indicating high prediction accuracy. These findings demonstrate that coupling probabilistic delay prediction with adaptive compression provides an effective and scalable approach to enhancing OTA transmission efficiency in satellite and hybrid communication networks.

2. Materials and Methods

2.1. Sample and Study Area Description

This study used 50 satellite-ground links and 30 relay nodes in a simulated low-bandwidth network. The tested bandwidth ranged from 0.5 Mbps to 10 Mbps, and the round-trip delay varied from 80 ms to 600 ms, representing typical satellite and remote IoT conditions. OTA payloads were binary update files between 5 MB and 120 MB, matching common firmware sizes. All tests were conducted in a controlled laboratory at 23 ± 2 °C and 45 ± 5% relative humidity to maintain device stability. Each node was synchronized using GPS timing to remove differences in recorded delay.
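For reproducibility, the simulated link population can be summarized as a small configuration sketch. The snippet below is a minimal illustration under the stated ranges; the uniform sampling and variable names are assumptions, not part of the original test harness.

```python
import random

# Hypothetical generator for the simulated link population described above:
# 50 satellite-ground links, 0.5-10 Mbps bandwidth, 80-600 ms round-trip
# delay, and 5-120 MB firmware images.
def make_links(n_links=50, seed=1):
    random.seed(seed)
    return [
        {
            "bandwidth_mbps": round(random.uniform(0.5, 10.0), 2),
            "rtt_ms": round(random.uniform(80, 600), 1),
            "firmware_mb": round(random.uniform(5, 120), 1),
        }
        for _ in range(n_links)
    ]

links = make_links()
```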

2.2. Experimental Design and Control Setup

Two groups were used in this study. The test group used the proposed Bayesian delay prediction model with the Brotli–LZMA adaptive compression method, while the control group used fixed gzip compression without delay prediction. Each group ran 300 OTA transfers under bandwidth levels of 0.5, 2, 5, and 10 Mbps. Both groups used the same data size and duration for each test to keep the results comparable. The experiment was designed to evaluate improvements in packet loss, transfer rate, and compression time. Brotli and LZMA were selected because earlier work showed that these algorithms balance compression ratio and speed under low-bandwidth networks.
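The resulting test matrix can be outlined as follows. This is a hypothetical harness sketch: the run_transfer callable and the even split of the 300 transfers across bandwidth levels are assumptions made for illustration.

```python
# Hypothetical outline of the experimental matrix: two groups, four bandwidth
# levels, and 300 OTA transfers per group (split evenly across levels here).
BANDWIDTHS_MBPS = [0.5, 2, 5, 10]
GROUPS = ["adaptive_bayesian_brotli_lzma", "fixed_gzip_control"]
TRANSFERS_PER_GROUP = 300

def run_experiment(run_transfer):
    """run_transfer(group, bandwidth_mbps, trial) -> dict of metrics (placeholder)."""
    trials_per_bw = TRANSFERS_PER_GROUP // len(BANDWIDTHS_MBPS)
    results = []
    for group in GROUPS:
        for bw in BANDWIDTHS_MBPS:
            for trial in range(trials_per_bw):
                metrics = run_transfer(group, bw, trial)
                results.append({"group": group, "bandwidth_mbps": bw,
                                "trial": trial, **metrics})
    return results
```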

2.3. Measurement Methods and Quality Control

Transmission delay and packet loss were measured using Wireshark and iPerf. For each transfer, average delay, jitter, packet loss rate, and compression time were recorded. Each test was repeated three times, and results that varied more than ±5 % from the mean were removed. The root mean square error (RMSE) between the predicted and measured delay was used to test the accuracy of the model. CPU usage during compression was also recorded to ensure it stayed below 10 % of total processing time. All devices were synchronized using Network Time Protocol (NTP) with accuracy better than 1 ms to reduce timing errors.
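The ±5% repeatability filter can be expressed as a short post-processing step; the NumPy-based sketch below is illustrative and the data layout is assumed.

```python
import numpy as np

def filter_repeats(values, tolerance=0.05):
    """Keep repeated measurements that lie within +/-5% of their mean (Sec. 2.3)."""
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    return values[np.abs(values - mean) <= tolerance * abs(mean)]

# Example: three repeats of one delay measurement; the third deviates by more
# than 5% from the mean and is discarded.
print(filter_repeats([212.0, 215.0, 242.0]))   # -> [212. 215.]
```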

2.4. Data Processing and Model Formulas

The experimental data were analyzed using Python 3.11 and MATLAB 2023b. A regression model was applied to examine the relationship between predicted delay and actual transmission time. Two error indicators, mean absolute error (MAE) and RMSE, were used to assess accuracy. The RMSE was calculated as [19]:
$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(D_{p,i} - D_{m,i}\right)^{2}}$$
where $D_{p,i}$ is the predicted delay, $D_{m,i}$ is the measured delay for sample $i$, and $n$ is the number of samples. Compression efficiency $E_c$ was defined as [20]:
$$E_c = \frac{T_u - T_c}{T_u} \times 100\%$$
where $T_u$ is the uncompressed transfer time and $T_c$ is the compressed transfer time. Group differences were tested with a two-tailed t-test at a 95% confidence level.
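As a minimal illustration, the sketch below mirrors these definitions and the two-tailed t-test; the sample arrays are placeholders, not measured data.

```python
import numpy as np
from scipy import stats

def rmse(d_pred, d_meas):
    d_pred, d_meas = np.asarray(d_pred, float), np.asarray(d_meas, float)
    return float(np.sqrt(np.mean((d_pred - d_meas) ** 2)))

def mae(d_pred, d_meas):
    d_pred, d_meas = np.asarray(d_pred, float), np.asarray(d_meas, float)
    return float(np.mean(np.abs(d_pred - d_meas)))

def compression_efficiency(t_uncompressed, t_compressed):
    """E_c = (T_u - T_c) / T_u * 100%."""
    return (t_uncompressed - t_compressed) / t_uncompressed * 100.0

# Two-tailed independent t-test at the 95% confidence level (placeholder samples).
group_adaptive = [4.1, 4.3, 4.0, 4.4]   # e.g., effective transfer rate, Mbps
group_gzip = [3.1, 3.4, 3.2, 3.0]
t_stat, p_value = stats.ttest_ind(group_adaptive, group_gzip)
significant = p_value < 0.05
```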

2.5. Model Implementation and System Configuration

The Bayesian network model was built using PyMC v5.10, with prior probabilities taken from delay data in the first 20 trials. The adaptive compression module was coded in C++ and used the Brotli v1.1.0 and LZMA (XZ Utils v5.4) libraries. The system selected the codec based on the predicted delay: Brotli was used when the predicted latency was above 300 ms, and LZMA when it was below 300 ms. A Linux router handled data transfer between AWS and Azure virtual machines used as sender and receiver. During all tests, end-to-end delay stayed below 600 ms, and no transmission exceeded three retransmission attempts.
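A minimal sketch of the two components is given below. The paper's full Bayesian network structure and the production C++ codec module are not reproduced; the PyMC model assumes a simple Gaussian delay prior seeded from the first trials, and the codec rule follows the 300 ms threshold stated above. The library bindings and the quality/preset values are illustrative assumptions.

```python
import brotli   # Brotli bindings; the codec module in the paper is C++, Python is used here for illustration
import lzma
import numpy as np
import pymc as pm

def fit_delay_model(observed_delays_ms):
    """Posterior mean of short-term delay, assuming a simple Gaussian model
    (the paper's exact Bayesian network structure is not specified here)."""
    with pm.Model():
        mu = pm.Normal("mu", mu=float(np.mean(observed_delays_ms)), sigma=100.0)
        sigma = pm.HalfNormal("sigma", sigma=50.0)
        pm.Normal("delay", mu=mu, sigma=sigma, observed=observed_delays_ms)
        idata = pm.sample(1000, tune=1000, progressbar=False)
    return float(idata.posterior["mu"].mean())

def compress_update(payload: bytes, predicted_delay_ms: float) -> bytes:
    """Codec selection per Sec. 2.5: Brotli above 300 ms, LZMA below.
    The quality/preset levels are assumptions."""
    if predicted_delay_ms > 300.0:
        return brotli.compress(payload, quality=9)
    return lzma.compress(payload, preset=6)

# Example: seed priors from the first 20 trials, then compress one firmware image.
predicted = fit_delay_model(np.random.uniform(80, 600, size=20))
blob = compress_update(b"\x00" * 1024, predicted)
```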

3. Results and Discussion

3.1. Transmission Rate and Packet Loss Under Different Bandwidths

Across the tested range of 0.5–10 Mbps, the proposed adaptive compression and delay prediction method improved the average transfer rate by 29% and reduced packet loss by 41% compared with the fixed-gzip control. The largest improvement appeared at 1–2 Mbps, where unstable delay had the strongest effect on retransmissions. At higher bandwidths, the improvement was smaller because link speed became the main limiting factor. The average compression time accounted for 3.8% of total transmission time, showing that CPU load remained low. These findings are consistent with earlier studies reporting that modern compression tools can improve transfer efficiency when tuned to link speed and file type [21]. Fig. 1 shows a comparison of compression performance among different algorithms.
Figure 1. Comparison of Brotli, LZMA, and other compression methods at different bandwidth levels.

3.2. Adaptive Codec Performance

The adaptive selection between Brotli and LZMA maintained a balance between compression ratio and processing time. Brotli worked better at low bandwidth (below 2 Mbps) because of its higher compression ratio, while LZMA was more efficient at higher bandwidth levels due to shorter CPU time. The model switched codecs according to the predicted delay threshold, preventing over-compression and reducing queue build-up. Similar results were reported in tests comparing modern codecs under low-rate network conditions, where adaptive selection improved average throughput and reduced latency [22].

3.3. Delay Prediction Accuracy and Stability

The Bayesian delay prediction model reached a root mean square error (RMSE) of 18 ms, showing stable and accurate latency forecasts. During high-delay periods, prediction-based compression adjustment reduced queue overflow and retransmissions. When the prediction error rose above 25 ms, the system used conservative compression settings to maintain steady performance. These results agree with earlier studies showing that Bayesian prediction can improve control accuracy in variable networks by learning from historical data [23]. Fig. 2 presents an example of predicted versus actual latency values based on Bayesian modeling.
Figure 2. Predicted and measured delay values obtained from the Bayesian delay model.
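The conservative fallback described above can be expressed as a small guard around the codec rule; the specific fallback settings below are assumptions made for illustration.

```python
def select_codec(predicted_delay_ms, recent_prediction_error_ms):
    """Fallback per Sec. 3.3: when the recent prediction error exceeds 25 ms,
    revert to conservative settings (the values below are assumed)."""
    if recent_prediction_error_ms > 25.0:
        return ("lzma", {"preset": 3})      # lighter, more predictable CPU cost
    if predicted_delay_ms > 300.0:
        return ("brotli", {"quality": 9})
    return ("lzma", {"preset": 6})
```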

3.4. Comparison with Existing Research and Study Limitations

Compared with static compression and fixed scheduling, the adaptive Bayesian + Brotli–LZMA approach achieved higher reliability under unstable bandwidths. The results confirm that combining delay prediction with compression control can reduce loss and improve transfer speed while keeping CPU cost within 4% of total runtime. Prior research mostly focused on security and protocol layers but lacked end-to-end analysis of data compression and delay interaction [24]. This study fills that gap by linking statistical delay modeling with practical transmission tests. However, the current work was limited to laboratory conditions and two network types. Future research should test the model on real satellite channels, include more data types, and combine it with transport-layer tuning for better adaptability in wide-area OTA systems.

4. Conclusions

This study proposed an adaptive OTA data transfer method that uses a Bayesian delay prediction model together with Brotli–LZMA compression to improve performance under low and changing bandwidth. The experiments showed that the method reduced packet loss by 41%, increased transfer rate by 29%, and kept compression time within 3.8% of the total process. The prediction model reached a root mean square error of 18 ms, giving stable and accurate delay estimates for real-time adjustment. The main contribution of this work is combining delay prediction with adaptive compression in a simple and practical way that improves transmission stability without adding heavy computation. The method can be used for OTA updates in satellite, IoT, and remote communication systems where network speed is limited. However, the tests were carried out in a controlled environment and with a small range of bandwidths. Future work should test the approach on real satellite networks, use more data types, and include transport-layer tuning to verify wider applicability.

References

  1. Arakadakis, K., Charalampidis, P., Makrogiannakis, A., & Fragkiadakis, A. (2021). Firmware over-the-air programming techniques for iot networks-a survey. ACM Computing Surveys (CSUR), 54(9), 1-36.
  2. Gui, H., Fu, Y., Wang, B., & Lu, Y. (2025). Optimized Design of Medical Welded Structures for Life Enhancement.
  3. Giambene, G., Addo, E. O., Chen, Q., & Kota, S. (2024). Design and Analysis of Low-Power IoT in Remote Areas With NTN Opportunistic Connectivity. IEEE Transactions on Aerospace and Electronic Systems. 2024.
  4. Hu, W. (2025, September). Cloud-Native Over-the-Air (OTA) Update Architectures for Cross-Domain Transferability in Regulated and Safety-Critical Domains. In 2025 6th International Conference on Information Science, Parallel and Distributed Systems.
  5. Aravind, R., Surabhi, S. N. D., & Shah, C. V. (2023). Remote Vehicle Access: Leveraging Cloud Infrastructure for Secure and Efficient OTA Updates with Advanced AI.
  6. Wu, Q.; Shao, Y.; Wang, J.; Sun, X. Learning Optimal Multimodal Information Bottleneck Representations. arXiv 2025, arXiv:2505.19996.
  7. Gizelis, C. A.; Vergados, D. D. A survey of pricing schemes in wireless networks. IEEE Communications Surveys & Tutorials 2010, 13(1), 126–145.
  8. Du, Y. Research on Deep Learning Models for Forecasting Cross-Border Trade Demand Driven by Multi-Source Time-Series Data. Journal of Science, Innovation & Social Impact 2025, 1(2), 63–70.
  9. Schaberreiter, T. (2013). A bayesian network based on-line risk prediction framework for interdependent critical infrastructures.
  10. Chen, F., Liang, H., Yue, L., Xu, P., & Li, S. (2025). Low-Power Acceleration Architecture Design of Domestic Smart Chips for AI Loads.
  11. Bennett, N. D.; Croke, B. F.; Guariso, G.; Guillaume, J. H.; Hamilton, S. H.; Jakeman, A. J.; Andreassian, V. Characterising performance of environmental models. Environmental Modelling & Software 2013, 40, 1–20.
  12. Chen, H., Ma, X., Mao, Y., & Ning, P. (2025). Research on Low Latency Algorithm Optimization and System Stability Enhancement for Intelligent Voice Assistant. Available at SSRN 5321721.
  13. Esmat, H. H.; Lorenzo, B.; Shi, W. Toward resilient network slicing for satellite–terrestrial edge computing IoT. IEEE Internet of Things Journal 2023, 10(16), 14621–14645.
  14. Cai, B., Bai, W., Lu, Y., & Lu, K. (2024, June). Fuzz like a Pro: Using Auditor Knowledge to Detect Financial Vulnerabilities in Smart Contracts. In 2024 International Conference on Meta Computing (ICMC) (pp. 230-240). IEEE.
  15. Fleischer, M., Das, D., Bose, P., Bai, W., Lu, K., Payer, M., ... & Vigna, G. (2023). {ACTOR}:{Action-Guided} Kernel Fuzzing. In 32nd USENIX Security Symposium (USENIX Security 23) (pp. 5003-5020).
  16. Tan, L., Peng, Z., Liu, X., Wu, W., Liu, D., Zhao, R., & Jiang, H. (2025, February). Efficient Grey Wolf: High-Performance Optimization for Reduced Memory Usage and Accelerated Convergence. In 2025 5th International Conference on Consumer Electronics and Computer Engineering (ICCECE) (pp. 300-305). IEEE.
  17. Mastromauro, L., Andrade, D. S., Ozmen, M. O., & Kinsy, M. (2025). Survey of Attacks and Defenses on Consensus Algorithms for Data Replication in Distributed Systems. IEEE Access.
  18. Wu, C., Zhang, F., Chen, H., & Zhu, J. (2025). Design and optimization of low power persistent logging system based on embedded Linux.
  19. Gu, J., Narayanan, V., Wang, G., Luo, D., Jain, H., Lu, K., ... & Yao, L. (2020, November). Inverse design tool for asymmetrical self-rising surfaces with color texture. In Proceedings of the 5th Annual ACM Symposium on Computational Fabrication (pp. 1-12).
  20. Gui, H., Fu, Y., Wang, B., & Lu, Y. (2025). Optimized Design of Medical Welded Structures for Life Enhancement.
  21. Sathish, V., Schulte, M. J., & Kim, N. S. (2012, September). Lossless and lossy memory I/O link compression for improving performance of GPGPU workloads. In Proceedings of the 21st international conference on Parallel architectures and compilation techniques (pp. 325-334).
  22. Hu, Z., Hu, Y., & Li, H. (2025). Multi-Task Temporal Fusion Transformer for Joint Sales and Inventory Forecasting in Amazon E-Commerce Supply Chain. arXiv preprint arXiv:2512.00370.
  23. Arora, P.; Boyne, D.; Slater, J. J.; Gupta, A.; Brenner, D. R.; Druzdzel, M. J. Bayesian networks for risk prediction using real-world data: a tool for precision medicine. Value in Health 2019, 22(4), 439–445.
  24. Yang, M., Cao, Q., Tong, L., & Shi, J. (2025, April). Reinforcement learning-based optimization strategy for online advertising budget allocation. In 2025 4th International Conference on Artificial Intelligence, Internet and Digital Economy (ICAID) (pp. 115-118). IEEE.