Review
Preserved in Portico. This version is not peer-reviewed.
Depression detection model using multimodal deep learning
Version 1: Received: 8 May 2023 / Approved: 9 May 2023 / Online: 9 May 2023 (13:20:58 CEST)
How to cite: Yoo, H.; Oh, H. Depression detection model using multimodal deep learning. Preprints 2023, 2023050663. https://doi.org/10.20944/preprints202305.0663.v1
Abstract
This study compares the performance of existing multimodal emotion-recognition approaches and proposes a model that fuses two modalities, the speaker's text and voice signals, as input to detect depression. Based on the DAIC-WOZ dataset, voice features are extracted with a CNN, text features are extracted with a Transformer, and the two modalities are fused through a tensor fusion network. An LSTM in the final layer then classifies whether the speaker is depressed. This study suggests the possibility of increasing access to mental-illness diagnosis by enabling patients to screen themselves for depression during daily conversations. If the proposed model is further developed and connected to a voice conversation system, patients who cannot visit a hospital regularly, or who are reluctant to do so, could more easily check their condition and seek treatment. Furthermore, the model can be extended to multi-label classification of various mental disorders and used as a simple self-diagnosis tool.
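The pipeline described above (CNN audio encoder, Transformer text encoder, tensor fusion of the two representations, LSTM classification head) can be sketched as a minimal PyTorch module. This is a hypothetical illustration, not the authors' implementation: all layer sizes, the mel-spectrogram input, and the outer-product form of tensor fusion (each vector augmented with a constant 1, following the common tensor-fusion-network formulation) are assumptions.

```python
# Hypothetical sketch of the described architecture; dimensions are assumed.
import torch
import torch.nn as nn

class DepressionDetector(nn.Module):
    def __init__(self, n_mels=64, vocab=5000, d_text=32, d_audio=32):
        super().__init__()
        # Audio branch: 1-D CNN over mel-spectrogram frames, pooled to a vector
        self.audio_cnn = nn.Sequential(
            nn.Conv1d(n_mels, d_audio, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Text branch: token embedding + Transformer encoder, mean-pooled
        self.embed = nn.Embedding(vocab, d_text)
        layer = nn.TransformerEncoderLayer(d_model=d_text, nhead=4, batch_first=True)
        self.text_enc = nn.TransformerEncoder(layer, num_layers=2)
        # Tensor fusion output size: (d_audio + 1) * (d_text + 1)
        fused = (d_audio + 1) * (d_text + 1)
        self.lstm = nn.LSTM(fused, 64, batch_first=True)
        self.head = nn.Linear(64, 1)  # binary depressed / not-depressed logit

    def forward(self, mel, tokens):
        # mel: (B, n_mels, T_audio); tokens: (B, T_text)
        a = self.audio_cnn(mel).squeeze(-1)            # (B, d_audio)
        t = self.text_enc(self.embed(tokens)).mean(1)  # (B, d_text)
        one = torch.ones(a.size(0), 1)
        a1, t1 = torch.cat([a, one], 1), torch.cat([t, one], 1)
        # Tensor fusion: outer product of the two augmented vectors, flattened
        z = torch.bmm(a1.unsqueeze(2), t1.unsqueeze(1)).flatten(1)
        out, _ = self.lstm(z.unsqueeze(1))             # fused vector as one-step sequence
        return self.head(out[:, -1])                   # (B, 1) logit

model = DepressionDetector()
logit = model(torch.randn(2, 64, 100), torch.randint(0, 5000, (2, 12)))
print(logit.shape)  # torch.Size([2, 1])
```

Appending a constant 1 to each modality vector before the outer product lets the fused tensor retain the unimodal features alongside the bimodal interactions, which is the usual motivation for tensor fusion over simple concatenation.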
Keywords
depression detection; fusion; feature extraction; deep learning
Subject
Computer Science and Mathematics, Mathematical and Computational Biology
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.