Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment

Version 1 : Received: 2 May 2023 / Approved: 3 May 2023 / Online: 3 May 2023 (05:56:00 CEST)
Version 2 : Received: 30 May 2023 / Approved: 1 June 2023 / Online: 1 June 2023 (04:39:38 CEST)

A peer-reviewed version of this preprint also exists.

Zhou, B.; Zhao, J.; Yan, C.; Zhang, X.; Gu, J. Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment. Appl. Sci. 2023, 13, 7016.

Abstract

With the increasing use of intelligent mobile devices for the online inspection of electrical equipment in smart grids, the limited computing power and storage of these devices make it difficult to deploy large models, and large numbers of electrical-equipment images are hard to obtain publicly. In this paper, we propose a novel distillation method that compresses the knowledge of a teacher network into a small few-shot classification network using a global and local knowledge distillation strategy. Central to our method is exploiting the global and local relationships between the features extracted by the backbones of the teacher and student networks. We compare our method with recent state-of-the-art methods on three public datasets and achieve the best performance. We also contribute a new dataset, EEI-100, specifically designed for the classification of electrical equipment, and demonstrate that our method achieves a prediction accuracy of 94.12% with only five images per class (5-shot).
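The abstract describes the method only at a high level, so the following is a minimal sketch of what a combined global and local feature-relation distillation loss could look like. It assumes PyTorch, matching spatial sizes between the teacher and student feature maps, and hypothetical names (`global_local_kd_loss`, `alpha`, `beta`); it illustrates the general technique, not the paper's exact formulation.

```python
# Minimal sketch (not the paper's exact loss): distill both a global,
# batch-level feature relation and a local, spatial relation from a
# teacher backbone to a student backbone. All names are hypothetical.
import torch
import torch.nn.functional as F

def global_local_kd_loss(f_t: torch.Tensor, f_s: torch.Tensor,
                         alpha: float = 1.0, beta: float = 1.0) -> torch.Tensor:
    """f_t, f_s: feature maps (B, C_t, H, W) and (B, C_s, H, W) from the
    teacher and student backbones; spatial sizes are assumed to match."""
    # Global relation: cosine similarities between pooled embeddings of
    # every pair of samples in the batch, matched across the two networks.
    g_t = F.normalize(f_t.mean(dim=(2, 3)), dim=1)            # (B, C_t)
    g_s = F.normalize(f_s.mean(dim=(2, 3)), dim=1)            # (B, C_s)
    loss_global = F.mse_loss(g_s @ g_s.t(), g_t @ g_t.t())    # (B, B) matrices

    # Local relation: similarities between spatial positions within each
    # feature map, so the student mimics the teacher's spatial structure.
    l_t = F.normalize(f_t.flatten(2), dim=1)                  # (B, C_t, H*W)
    l_s = F.normalize(f_s.flatten(2), dim=1)                  # (B, C_s, H*W)
    loss_local = F.mse_loss(l_s.transpose(1, 2) @ l_s,        # (B, HW, HW)
                            l_t.transpose(1, 2) @ l_t)

    return alpha * loss_global + beta * loss_local
```

Because both terms compare relation matrices rather than raw features, the teacher and student need not share an embedding dimension, which suits compressing a large teacher into a small student network.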

Keywords

few-shot learning; classification; electrical equipment; knowledge distillation

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
