Ma, Y.; Liu, K.; Guan, Z.; Xu, X.; Qian, X.; Bao, H. Background Augmentation Generative Adversarial Networks (BAGANs): Effective Data Generation Based on GAN-Augmented 3D Synthesizing. Symmetry 2018, 10, 734.
Abstract
Augmented reality (AR) is crucial for immersive human-computer interaction (HCI) and for computer vision in artificial intelligence (AI). Object recognition in AR is driven by labeled data. However, manually annotating data is expensive and labor-intensive, and the scarcity of labeled data limits the application of AR. To address the problem of insufficient training data for AR object recognition, this paper proposes an automated visual data synthesis method, BAGAN, based on 3D modeling and generative adversarial networks (GANs). Our approach is validated to outperform other methods on an image recognition task on the natural image database ObjectNet3D. This study can shorten AR algorithm development time and expand the scope of AR applications, which is of great significance for immersive interactive systems.
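The abstract only names the pipeline, so as a minimal sketch of the background-augmentation idea it gestures at, the snippet below composites an alpha-masked render from a 3D modeling stage onto a varied background to produce one synthetic training image. The file names and the composite helper are hypothetical, not from the paper, and the GAN-based refinement step is omitted.

import numpy as np
from PIL import Image

def composite(render_rgba_path: str, background_path: str, out_path: str) -> None:
    """Paste a rendered object (RGBA, alpha = object mask) onto a background."""
    fg = Image.open(render_rgba_path).convert("RGBA")
    bg = Image.open(background_path).convert("RGBA").resize(fg.size)
    # Alpha-blend: opaque pixels come from the rendered object, transparent
    # pixels from the background, yielding one synthetic training image.
    out = Image.alpha_composite(bg, fg).convert("RGB")
    out.save(out_path)

if __name__ == "__main__":
    # Hypothetical inputs: an RGBA render and a background photograph.
    composite("render.png", "background.jpg", "synthetic_sample.jpg")

Repeating this over many backgrounds and viewpoints is one plausible way to grow a labeled dataset from a small set of 3D models, with the labels inherited from the rendered objects.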
Keywords
object recognition; image data synthesizing; human-computer interaction; data synthesizing for immersive HCI; generative adversarial nets; BAGAN
Subject
Computer Science and Mathematics, Computer Vision and Graphics
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.