Preprint
Essay

This version is not peer-reviewed.

ARFU-IDS: Robust Federated Unlearning for Transformer-Based Intrusion Detection in IoT and Sensor Networks

Submitted:

14 May 2026

Posted:

14 May 2026


Abstract
As IoT and wireless sensor networks (WSNs) increasingly rely on federated intrusion detection, the ability to remove a client’s contribution from a trained model without full retraining has become an important requirement. However, existing federated unlearning methods are not well suited to transformer-based intrusion detection systems, particularly when the unlearning trajectory may be manipulated and multiple removal requests must be processed under severe class imbalance. We present ARFU-IDS, a transformer-oriented and adversary-aware federated unlearning framework. ARFU-IDS combines attention- head attribution, dual-path layer criticality probing, trajectory verification, and conflict- aware scheduling. Specifically, the proposed Attention-Head Attribution Graph localizes removal-sensitive heads in transformer layers, Dual-Path Layer Criticality Probing sepa- rates task-critical layers from adversary-influenced layers, Manipulation-Resistant Iterative Verification with Audit validates whether the unlearning trajectory follows the expected optimization path, and a conflict-graph scheduler supports concurrent client removal while preserving rare-category performance. Experiments on UNSW-NB15, CICIoT2023, and IoTID20 show that ARFU-IDS achieves 87.1% Macro-F1 and 77.6% rare-category recall on UNSW-NB15, reduces the attack success rate to 8.2% at f = 0.1 and 9.7% at f = 0.2, and shortens concurrent unlearning latency by 43.4% compared with sequential FU-IDS. These findings suggest that ARFU-IDS offers a practical framework to robust federated unlearning in transformer-based IDSs for IoT and sensor-network environments.
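The abstract describes a conflict-graph scheduler that batches concurrent client-removal requests so that non-conflicting clients are unlearned in parallel. The following is a minimal, hypothetical sketch of that idea using greedy graph coloring; the notion of "conflict" (e.g., clients whose updates touch overlapping parameters) and all function names are assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch: conflict-aware scheduling of concurrent unlearning
# requests via greedy graph coloring. Clients joined by a conflict edge
# must not be unlearned in the same batch; each color class becomes one
# batch of requests that can run concurrently.

def schedule_unlearning(clients, conflicts):
    """Greedily color the conflict graph; each color is one concurrent batch.

    clients   -- list of client ids, in request-arrival order
    conflicts -- set of frozenset pairs {a, b} that must not share a batch
    """
    color = {}
    for c in clients:
        # Colors already taken by neighbors of c in the conflict graph.
        used = {color[n] for n in clients
                if n in color and frozenset((c, n)) in conflicts}
        color[c] = next(k for k in range(len(clients)) if k not in used)
    batches = {}
    for c, k in color.items():
        batches.setdefault(k, []).append(c)
    return [batches[k] for k in sorted(batches)]

# Four removal requests; c2 conflicts with both c1 and c3.
requests = ["c1", "c2", "c3", "c4"]
edges = {frozenset(("c1", "c2")), frozenset(("c2", "c3"))}
print(schedule_unlearning(requests, edges))
# → [['c1', 'c3', 'c4'], ['c2']]
```

Compared with sequential unlearning (one request per round), this grouping is what makes a latency reduction like the reported 43.4% plausible: the number of rounds drops from the number of requests to the number of color classes.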
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
