Preprint Article, Version 1 (not peer-reviewed), preserved in Portico

Leveraging the Sensitivity of Plants with Deep Learning to Recognize Human Emotions

Version 1: Received: 19 February 2024 / Approved: 19 February 2024 / Online: 20 February 2024 (06:19:31 CET)

A peer-reviewed article of this Preprint also exists.

Kruse, J.A.; Ciechanowski, L.; Dupuis, A.; Vazquez, I.; Gloor, P.A. Leveraging the Sensitivity of Plants with Deep Learning to Recognize Human Emotions. Sensors 2024, 24, 1917.

Abstract

Recent advances in artificial intelligence combined with behavioral sciences have led to the development of cutting-edge tools for recognizing human emotions based on text, video, audio, and physiological data. However, these data sources are expensive, intrusive, and regulated, unlike plants, which have been shown to be sensitive to human steps and sounds. A methodology is proposed to use plants as human emotion detectors. Electrical signals from plants are tracked and labeled based on video data, and the labeled data are then used for classification. MLP, biLSTM, MFCC-CNN, and MFCC-ResNet models, and additionally Random Forest, 1-dimensional CNN, and biLSTM-without-windowing models, are tuned using a grid search algorithm with cross-validation. Finally, the best-parameterized models are trained and evaluated on the test set. The performance of this methodology is measured in a case study with 54 participants who watched an emotionally charged video; as ground truth, their facial emotions were simultaneously measured using face emotion analysis. The Random Forest model shows the best performance, particularly in recognizing high-arousal emotions, achieving an overall weighted accuracy of 55.2% and high weighted recall for emotions such as fear (61.0%) and happiness (60.4%). The MFCC-ResNet model offers reasonably balanced results, with Accuracy(MFCC-ResNet) = 0.318 and Recall(MFCC-ResNet) = 0.324; with this model, fear and anger are recognized with 75% and 50% recall, respectively. Thus, using plants as an emotion recognition tool seems worth investigating, as it addresses both cost and privacy concerns.
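
To make the classification step of the abstract more concrete, the sketch below illustrates one plausible version of the windowed Random Forest pipeline: a plant electrical signal is split into fixed-length windows, each window is summarized by MFCC features, and a Random Forest is tuned with a grid search and cross-validation. The sampling rate, window length, label set, parameter grid, and synthetic data are illustrative assumptions, not the settings or data used in the study.

```python
# Hypothetical sketch of the windowing + MFCC + Random Forest pipeline.
# All constants and data below are assumptions for illustration only.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import balanced_accuracy_score, recall_score

SAMPLE_RATE = 10_000      # assumed sampling rate of the plant signal (Hz)
WINDOW_SECONDS = 2.0      # assumed window length
EMOTIONS = ["anger", "fear", "happiness", "neutral"]  # illustrative label set

def window_signal(signal, sample_rate=SAMPLE_RATE, window_seconds=WINDOW_SECONDS):
    """Split a 1-D signal into non-overlapping fixed-length windows."""
    size = int(sample_rate * window_seconds)
    n_windows = len(signal) // size
    return signal[: n_windows * size].reshape(n_windows, size)

def mfcc_features(windows, sample_rate=SAMPLE_RATE, n_mfcc=13):
    """Summarize each window by the mean of its MFCCs over time."""
    feats = []
    for w in windows:
        mfcc = librosa.feature.mfcc(y=w.astype(np.float32), sr=sample_rate, n_mfcc=n_mfcc)
        feats.append(mfcc.mean(axis=1))
    return np.vstack(feats)

# Synthetic stand-in data: one long "plant signal" with per-window labels
# that, in the real study, would come from face-emotion analysis of the video.
rng = np.random.default_rng(0)
signal = rng.normal(size=SAMPLE_RATE * 240)   # 4 minutes of placeholder signal
X = mfcc_features(window_signal(signal))
y = rng.choice(EMOTIONS, size=len(X))         # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Grid search with cross-validation over an illustrative parameter grid.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=5,
    scoring="balanced_accuracy",
)
grid.fit(X_train, y_train)

# Evaluate the best-parameterized model on the held-out test set.
pred = grid.best_estimator_.predict(X_test)
print("weighted recall:", recall_score(y_test, pred, average="weighted"))
print("balanced accuracy:", balanced_accuracy_score(y_test, pred))
```

The same windowed features could equally be fed to the other models named in the abstract (MLP, biLSTM, MFCC-CNN, MFCC-ResNet, 1-dimensional CNN); only the estimator and parameter grid inside the grid search would change.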

Keywords

Emotion recognition; Artificial intelligence; Deep Learning; Plant sensor; Classification; Emotion models

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
