Version 1: Received: 28 April 2023 / Approved: 28 April 2023 / Online: 28 April 2023 (08:30:10 CEST)
How to cite:
Kono, M.; Isoyama, N.; Uchiyama, H.; Sakata, N.; Takamatsu, J.; Kiyokawa, K. A Study on Animacy and Emotion Perception from Vertical Undulatory Motion of Curved Surfaces. Preprints 2023, 2023041146. https://doi.org/10.20944/preprints202304.1146.v1
APA Style
Kono, M., Isoyama, N., Uchiyama, H., Sakata, N., Takamatsu, J., & Kiyokawa, K. (2023). A Study on Animacy and Emotion Perception from Vertical Undulatory Motion of Curved Surfaces. Preprints. https://doi.org/10.20944/preprints202304.1146.v1
Chicago/Turabian Style
Kono, M., N. Isoyama, H. Uchiyama, N. Sakata, Jun Takamatsu, and Kiyoshi Kiyokawa. 2023. "A Study on Animacy and Emotion Perception from Vertical Undulatory Motion of Curved Surfaces." Preprints. https://doi.org/10.20944/preprints202304.1146.v1
Abstract
People are known to perceive animacy in inanimate objects. However, many studies on animacy and emotional expression are limited in that the investigated motions were created by the experimenters themselves, leaving their objective validity unclear. Moreover, because few investigations have examined animacy and emotional expression together, it remains unclear what types of movement can express emotions while also conveying animacy. In this study, we therefore investigated the motion elements underlying both animacy perception and emotional expression using simple objects that lack features of specific living things, such as eyes, ears, tails, and voices. First, we investigated these motion elements using a robot simulator that enabled participants to create undulatory motions by tuning parameters for speed, height, and randomness. In total, 64 participants created motions in Normal (neutral), Joy, Sad, Relaxed, and Angry conditions. The results showed that the median speed and height in the Normal condition, which relates only to animacy, were 0.5569 Hz and 3.050 cm at the edges / 4.575 cm at the center. The differences from Normal were 0.4028 Hz and 3.348 cm/5.022 cm in Joy; −0.1652 Hz and −0.9982 cm/−1.497 cm in Sad; −0.1979 Hz and −0.4902 cm/−0.7353 cm in Relaxed; and 0.5212 Hz and 4.688 cm/7.032 cm in Angry. Second, we investigated whether the motion elements identified in the first experiment were sufficient to express emotions with animacy, using a robot simulator that incorporated these results. In total, 44 participants observed the simulator online. The results showed that participants could recognize emotional arousal levels together with animacy, but could not fully recognize emotional valence. Our findings provide design guidelines for robots that exhibit emotional expressions and closely interact with humans.
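The abstract reports each emotion condition as a difference from the Normal-condition medians. As an illustrative sketch (not code from the paper), the absolute parameter values per condition can be recovered by adding each reported difference to the Normal baseline:

```python
# Reported medians for the Normal (neutral) condition.
NORMAL = {"speed_hz": 0.5569, "edge_cm": 3.050, "center_cm": 4.575}

# Reported differences from Normal for each emotion condition.
DIFFS = {
    "Joy":     {"speed_hz": 0.4028,  "edge_cm": 3.348,   "center_cm": 5.022},
    "Sad":     {"speed_hz": -0.1652, "edge_cm": -0.9982, "center_cm": -1.497},
    "Relaxed": {"speed_hz": -0.1979, "edge_cm": -0.4902, "center_cm": -0.7353},
    "Angry":   {"speed_hz": 0.5212,  "edge_cm": 4.688,   "center_cm": 7.032},
}

def absolute_params(emotion):
    """Absolute parameter values: Normal baseline plus the condition's difference."""
    return {k: round(NORMAL[k] + DIFFS[emotion][k], 4) for k in NORMAL}

for emotion in DIFFS:
    print(emotion, absolute_params(emotion))
```

For example, the Joy condition works out to roughly 0.9597 Hz with amplitudes of 6.398 cm at the edges and 9.597 cm at the center, consistent with higher-arousal emotions (Joy, Angry) being faster and larger than lower-arousal ones (Sad, Relaxed).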
Keywords
Human-Robot Interaction; Cognition; Emotion; Animacy; Affective Engineering
Subject
Computer Science and Mathematics, Other
Copyright:
This is an open access article distributed under the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.