Preprint Article / Version 1 / Preserved in Portico / This version is not peer-reviewed

A Comprehensive Comparative Analysis of Deep Learning based Feature Representations for Molecular Taste Prediction

Version 1: Received: 28 August 2023 / Approved: 28 August 2023 / Online: 29 August 2023 (03:06:30 CEST)

A peer-reviewed article of this Preprint also exists.

Song, Y.; Chang, S.; Tian, J.; Pan, W.; Feng, L.; Ji, H. A Comprehensive Comparative Analysis of Deep Learning Based Feature Representations for Molecular Taste Prediction. Foods 2023, 12, 3386.

Abstract

Taste determination of small molecules is critical in food chemistry, but traditional experimental methods can be time-consuming. Consequently, computational techniques have emerged as valuable tools for this task. In this study, we explore taste prediction using various molecular feature representations and assess the performance of different machine learning algorithms on a dataset comprising 2,601 molecules. The results reveal that GNN-based models outperform other approaches in taste prediction. Moreover, consensus models that combine diverse molecular representations demonstrate improved performance. Among these, the molecular fingerprints + GNN consensus model emerges as the top performer, highlighting the complementary strengths of GNNs and molecular fingerprints. These findings have significant implications for food chemistry research and related fields. By leveraging these computational approaches, taste prediction can be expedited, leading to advancements in understanding the relationship between molecular structure and taste perception in various food components and related compounds.
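
To make the consensus idea concrete, below is a minimal sketch (not the authors' implementation) of combining a fingerprint-based classifier's prediction with a GNN prediction by simple probability averaging. The Morgan fingerprint settings, the random forest classifier, the toy SMILES strings, and the stand-in GNN probability are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a fingerprint + GNN "consensus" prediction by averaging.
# Assumptions (not from the paper): RDKit Morgan fingerprints, a random forest
# as the fingerprint-based model, and a placeholder GNN probability.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def morgan_fingerprint(smiles: str, radius: int = 2, n_bits: int = 2048) -> np.ndarray:
    """Convert a SMILES string to a binary Morgan fingerprint vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
    return np.array(fp, dtype=np.float32)

# Toy training data: two molecules with hypothetical taste labels (1 = sweet, 0 = bitter).
train_smiles = ["C(C1C(C(C(C(O1)O)O)O)O)O",          # glucose
                "CN1C=NC2=C1C(=O)N(C(=O)N2C)C"]       # caffeine
train_labels = [1, 0]

X_train = np.stack([morgan_fingerprint(s) for s in train_smiles])
fp_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, train_labels)

# Fingerprint-based probability of the "sweet" class for a query molecule.
query = "C(C(C(C(C(CO)O)O)O)O)O"  # sorbitol
p_fp = fp_model.predict_proba(morgan_fingerprint(query).reshape(1, -1))[0, 1]

# Stand-in for the GNN model's predicted probability (in the full pipeline this
# would come from a trained graph neural network operating on the molecular graph).
p_gnn = 0.80

# Consensus by averaging the two models' probabilities.
p_consensus = (p_fp + p_gnn) / 2.0
print(f"fingerprint: {p_fp:.2f}, GNN (stand-in): {p_gnn:.2f}, consensus: {p_consensus:.2f}")
```

Averaging predicted probabilities is only one way to build a consensus; weighted averaging or stacking over the individual models' outputs are common alternatives.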

Keywords

Cheminformatics; Taste prediction; Machine learning; Deep learning; Molecular feature representation

Subject

Chemistry and Materials Science, Food Chemistry
