Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Garments Texture Design Class Identification Using Deep Convolutional Neural Network

Version 1 : Received: 27 July 2016 / Approved: 27 July 2016 / Online: 27 July 2016 (15:39:53 CEST)

How to cite: Islam, S.S.; Dey, E.K.; Tawhid, M.N.A.; Hossain, B.M.M. Garments Texture Design Class Identification Using Deep Convolutional Neural Network. Preprints 2016, 2016070085. https://doi.org/10.20944/preprints201607.0085.v1

Abstract

Automatic identification of garments design classes for recommending fashion trends has become important with the rapid growth of online shopping. A machine that learns image properties efficiently can classify garments more accurately. Several methods based on hand-engineered feature coding exist for identifying garments design classes, but they often fail to achieve good results. Recently, deep Convolutional Neural Networks (CNNs) have shown better performance across a range of object-recognition tasks. A deep CNN uses multiple levels of representation and abstraction, which helps a machine understand data such as images, sound, and text more accurately. In this paper, we apply deep CNNs to identify garments design classes. To evaluate performance, we use two well-known CNN models, AlexNet and VGGNet, on two different datasets. We also propose a new CNN model based on AlexNet and find that it outperforms the existing state of the art by a significant margin.
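The "multiple levels of representation" the abstract refers to are built from stacked convolution, nonlinearity, and pooling stages. As a minimal illustrative sketch (plain numpy, not the authors' actual AlexNet- or VGGNet-based models, and the edge-detecting kernel below is a hypothetical hand-picked example), one such stage applied to a toy striped "texture" patch looks like this:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise rectified linear unit, the nonlinearity used in AlexNet/VGGNet."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, which downsamples the feature map."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

# Toy 8x8 vertically striped patch and a vertical-edge kernel (hypothetical example)
patch = np.tile([0.0, 0.0, 1.0, 1.0], (8, 2))     # shape (8, 8)
kernel = np.array([[1.0, -1.0], [1.0, -1.0]])

# One conv -> ReLU -> pool stage: responses fire only at the stripe edges
feature_map = max_pool(relu(conv2d(patch, kernel)))
print(feature_map.shape)  # (3, 3)
```

In a real deep CNN, many such kernels are learned from data and the stages are stacked, so later layers respond to increasingly abstract texture patterns rather than raw edges.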

Keywords

CNN; Deep Learning; AlexNet; VGGNet; Texture Descriptor; Garment Categories; Garment Trend Identification; Design Classification for Garments

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
