Preprint Article, Version 1 (not peer-reviewed). Preserved in Portico.

A General Machine Learning Model for Assessing Fruit Quality Using Deep Image Features

Version 1 : Received: 20 July 2023 / Approved: 21 July 2023 / Online: 24 July 2023 (03:09:22 CEST)

How to cite: Apostolopoulos, I.D.; Tzani, M.; Aznaouridis, S. A General Machine Learning Model for Assessing Fruit Quality Using Deep Image Features. Preprints 2023, 2023071552. https://doi.org/10.20944/preprints202307.1552.v1

Abstract

Fruit quality is a critical factor in the produce industry, affecting producers, distributors, consumers, and the economy. High-quality fruits are more appealing, nutritious, and safe, boosting consumer satisfaction and revenue for producers. Artificial Intelligence can aid in assessing fruit quality from images. This paper presents a general machine-learning model for assessing fruit quality using deep image features. The model leverages the learning capabilities of Vision Transformers (ViT), a recent and successful class of networks for image classification. The ViT model is built and trained on a combination of fruit datasets and learns to distinguish between images of good and rotten fruit. The general model achieved high accuracy across a variety of fruits: Apples (99.50%), Cucumbers (99%), Grapes (100%), Kakis (99.50%), Oranges (99.50%), Papayas (98%), Peaches (98%), Tomatoes (99.50%), and Watermelons (98%). It showed slightly lower performance on Guavas (97%), Lemons (97%), Limes (97.50%), Mangoes (97.50%), Pears (97%), and Pomegranates (97%).
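For readers unfamiliar with the workflow outlined above, the snippet below is a minimal sketch of fine-tuning an ImageNet-pretrained Vision Transformer for binary good-versus-rotten fruit classification. It is not the authors' implementation; it assumes a PyTorch/torchvision setup and a hypothetical ImageFolder directory (data/fruit_quality/train) with one subfolder per class.

```python
# Sketch only: fine-tune a pretrained ViT-B/16 as a 2-class (good vs. rotten)
# fruit-quality classifier. Dataset path and hyperparameters are illustrative,
# not taken from the paper.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),  # ViT-B/16 expects 224x224 inputs
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical combined fruit dataset: data/fruit_quality/train/{good,rotten}/
train_set = datasets.ImageFolder("data/fruit_quality/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Load an ImageNet-pretrained ViT and replace the classification head
# with a two-way output (good vs. rotten).
model = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
model.heads.head = nn.Linear(model.heads.head.in_features, 2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The deep image features mentioned in the title correspond to the ViT backbone's learned representations; in this sketch they are fine-tuned end-to-end, although a frozen backbone feeding a lightweight classifier is an equally plausible reading of the abstract.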

Keywords

Fruit Quality; Machine Learning; Deep Learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
