Islam, S.S.; Dey, E.K.; Tawhid, M.N.A.; Hossain, B.M.M. Garments Texture Design Class Identification Using Deep Convolutional Neural Network. Preprints 2016, 2016070085. https://doi.org/10.20944/preprints201607.0085.v1
Automatic identification of garment design classes for recommending fashion trends has become important with the rapid growth of online shopping. By learning image features efficiently, a machine can achieve better classification accuracy. Several methods based on hand-engineered feature coding exist for identifying garment design classes, but most of the time they fail to achieve good results. Recently, deep Convolutional Neural Networks (CNNs) have shown better performance on various object recognition tasks. A deep CNN uses multiple levels of representation and abstraction, which helps a machine understand different types of data (images, sound, and text) more accurately. In this paper, we apply deep CNNs to identify garment design classes. To evaluate performance, we use two well-known CNN models, AlexNet and VGGNet, on two different datasets. We also propose a new CNN model based on AlexNet and find that it outperforms the existing state of the art by a significant margin.
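The building blocks that AlexNet- and VGGNet-style networks stack to learn texture features are convolution, a ReLU non-linearity, and max-pooling. The following is a minimal pure-Python sketch of those three operations, not the paper's implementation; the image and the edge-detecting kernel are arbitrary values chosen for illustration.

```python
# Minimal sketch of the core CNN building blocks (convolution, ReLU,
# max-pooling) that deep CNNs such as AlexNet/VGGNet stack repeatedly.
# Pure-Python illustration on nested lists; real models use tensor
# libraries and learn the kernel weights from data.

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation, as is conventional in CNNs)."""
    h, w = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    """Element-wise rectified linear unit: keep positive responses only."""
    return [[max(0.0, v) for v in row] for row in fmap]

def max_pool(fmap, size=2):
    """Non-overlapping max-pooling: downsample by keeping the strongest response."""
    return [[max(fmap[i + a][j + b] for a in range(size) for b in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

# A hypothetical 4x4 image containing a vertical edge, and a hand-set
# vertical-edge kernel (in a trained CNN these weights would be learned).
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]

# One conv -> ReLU -> pool stage; deep CNNs chain many such stages.
feature = max_pool(relu(conv2d(image, kernel)))
```

Stacking several such stages, each with many learned kernels, followed by fully connected layers and a softmax, yields the kind of classifier the paper evaluates on garment texture classes.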
CNN; Deep Learning; AlexNet; VGGNet; Texture Descriptor; Garment Categories; Garment Trend Identification; Design Classification for Garments
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.