Preprint Article · Version 2 · Preserved in Portico · This version is not peer-reviewed

Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment

Version 1 : Received: 2 May 2023 / Approved: 3 May 2023 / Online: 3 May 2023 (05:56:00 CEST)
Version 2 : Received: 30 May 2023 / Approved: 1 June 2023 / Online: 1 June 2023 (04:39:38 CEST)

A peer-reviewed article of this Preprint also exists.

Zhou, B.; Zhao, J.; Yan, C.; Zhang, X.; Gu, J. Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment. Appl. Sci. 2023, 13, 7016.

Abstract

With the increasing use of intelligent mobile devices for online inspection of electrical equipment in smart grids, the limited computing power and storage capacity of these devices make it challenging to deploy large models, and it is difficult to obtain a substantial number of publicly available images of electrical equipment. In this paper, we propose a novel distillation method that compresses the knowledge of teacher networks into a compact few-shot classification network, employing a global and local knowledge distillation strategy. Central to our method is exploiting the global and local relationships between the features extracted by the backbones of the teacher and student networks. We compare our method with recent state-of-the-art (SOTA) methods on three public datasets and achieve superior performance. Additionally, we contribute a new dataset, EEI-100, which is specifically designed for the classification of electrical equipment. We validate our method on this dataset and demonstrate a prediction accuracy of 94.12% when using only 5-shot images.
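To make the global-and-local distillation idea concrete, the sketch below shows one plausible way such a loss could be written in PyTorch. It is only an illustration under assumptions: the function name, the use of MSE on globally pooled features (global term), and the matching of pairwise spatial similarity maps (local term) are hypothetical choices, not the authors' exact formulation from the paper.

```python
# Hypothetical sketch of a combined global + local feature distillation loss.
# The specific loss forms (MSE on pooled features, MSE on spatial similarity
# maps) are assumptions for illustration, not the paper's exact method.
import torch
import torch.nn.functional as F


def global_local_distill_loss(f_teacher, f_student, alpha=0.5):
    """f_teacher, f_student: backbone feature maps of shape (B, C, H, W).

    Assumes the teacher and student feature maps share the same shape; in
    practice a 1x1 projection layer would be needed if the channels differ.
    """
    # Global term: match globally average-pooled feature vectors.
    g_t = F.adaptive_avg_pool2d(f_teacher, 1).flatten(1)  # (B, C)
    g_s = F.adaptive_avg_pool2d(f_student, 1).flatten(1)  # (B, C)
    loss_global = F.mse_loss(g_s, g_t)

    # Local term: match pairwise similarities between spatial positions,
    # so the student mimics the teacher's local feature relationships.
    t = F.normalize(f_teacher.flatten(2), dim=1)           # (B, C, H*W)
    s = F.normalize(f_student.flatten(2), dim=1)           # (B, C, H*W)
    sim_t = torch.bmm(t.transpose(1, 2), t)                # (B, HW, HW)
    sim_s = torch.bmm(s.transpose(1, 2), s)                # (B, HW, HW)
    loss_local = F.mse_loss(sim_s, sim_t)

    return alpha * loss_global + (1 - alpha) * loss_local
```

In a distillation setup of this kind, the loss above would be added to the student's standard few-shot classification loss while the teacher's weights are kept frozen.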

Keywords

few-shot classification; electrical equipment images; knowledge distillation

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

Comments (1)

Comment 1
Received: 1 June 2023
Commenter: Bojun Zhou
Commenter's Conflict of Interests: Author
Comment:
1. The keywords are revised as follows: few-shot classification; electrical equipment images; knowledge distillation.
2. We have provided a more detailed description to elaborate on the content of Figure 1.
3. Following the reviewers' recommendation, we have restructured the manuscript. Section 4 has been renumbered as Section 3, Section 4.1 has become Section 3.2, and Section 5 has become Section 4.
4. Following the reviewers' recommendation, we have revised and rephrased the concluding section to provide a more concise and impactful summary of our findings.
5. We have revised the manuscript to improve the English language throughout the entire paper.