Version 2: Received: 1 January 2022 / Approved: 4 January 2022 / Online: 4 January 2022 (11:12:43 CET)
How to cite:
Yin, Z.; Liang, Y.; Ren, J.; An, J. LETR: An End-to-End Detector of Reconstruction Area in Blade’s Adaptive Machining with Transformer. Preprints 2021, 2021090332
Abstract
In the adaptive machining of near-net-shaped blades, a small portion of the theoretical leading/trailing edge is retained by manual work to secure aerodynamic performance. However, this procedure is time-consuming and depends on human experience. In this paper, we define the retained theoretical leading/trailing edge as the reconstruction area. To accelerate the reconstruction process, we propose an anchor-free, Transformer-based neural network model named LETR (Leading/trailing Edge Transformer). LETR extracts image features in mixed frequency and channel domains. We also integrate LETR with the recently proposed meta-ACON activation function. We tested our model on the self-made dataset LDEG2021 on a single GPU and achieved an mAP of 91.9\%, surpassing our baseline model, Deformable DETR, by 1.1\%. Furthermore, we modified LETR’s convolution layers and named the new model GLETR (Ghost Leading/trailing Edge Transformer), a lightweight model for real-time detection. Test results show that GLETR has fewer weight parameters and converges faster than LETR, with an acceptable decrease in mAP (0.1\%).
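For context on the meta-ACON activation mentioned in the abstract, below is a minimal PyTorch sketch following the published formulation of meta-ACON (Ma et al., "Activate or Not: Learning Customized Activation"), in which a per-channel switching factor beta is predicted from globally pooled features. The module name MetaACON, the reduction ratio r, and the parameter initialization are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class MetaACON(nn.Module):
    """Sketch of the meta-ACON activation: (p1 - p2) * x * sigmoid(beta * (p1 - p2) * x) + p2 * x,
    with beta generated per channel by a small bottleneck on globally pooled features."""

    def __init__(self, channels: int, r: int = 16):
        super().__init__()
        hidden = max(channels // r, 4)
        # Learnable per-channel parameters p1 and p2 (ACON-C form).
        self.p1 = nn.Parameter(torch.randn(1, channels, 1, 1))
        self.p2 = nn.Parameter(torch.randn(1, channels, 1, 1))
        # Bottleneck (two 1x1 convolutions) that predicts the switching factor beta.
        self.fc1 = nn.Conv2d(channels, hidden, kernel_size=1)
        self.fc2 = nn.Conv2d(hidden, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Global average pooling over H and W, then the bottleneck, gives beta of shape (N, C, 1, 1).
        beta = torch.sigmoid(self.fc2(self.fc1(x.mean(dim=(2, 3), keepdim=True))))
        dp = (self.p1 - self.p2) * x
        return dp * torch.sigmoid(beta * dp) + self.p2 * x

# Example usage on a feature map with 256 channels:
# act = MetaACON(channels=256)
# y = act(torch.randn(2, 256, 32, 32))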
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.