Previous studies on recognizing negative emotions (e.g., disgust, fear, sadness) for mental health care have relied on bulky equipment that attaches electroencephalogram (EEG) electrodes directly to the head, making it difficult to use in daily life, and have proposed only binary classification methods that determine whether an emotion is negative or not. To tackle these problems, we propose a negative emotion recognition system that collects multimodal biosignal data, such as five EEG signals from an EEG headset and heart rate, galvanic skin response, and skin temperature from a smart band, to classify multiple negative emotions. The system consists of an Android Internet of Things (IoT) application, a oneM2M-compliant IoT server, and a machine learning server. The Android IoT application uploads the biosignal data to the IoT server. Using the biosignal data stored in the IoT server, the machine learning server recognizes the negative emotions of disgust, fear, and sadness with a multi-class support vector machine (SVM) model with a radial basis function (RBF) kernel. The experimental results showed that the multi-class SVM model achieved 93% accuracy when all the multimodal biosignal data were considered. Moreover, when only the smart band data were considered, it achieved 98% accuracy by optimizing the hyper-parameters of the RBF kernel.
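The classification step described above can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: it assumes a standard scikit-learn multi-class SVM with an RBF kernel, with a grid search over the kernel hyper-parameters C and gamma, and uses synthetic stand-ins for the smart-band features (heart rate, galvanic skin response, skin temperature) and the three negative-emotion labels.

```python
# Hypothetical sketch of the machine learning server's classifier:
# a multi-class RBF-kernel SVM with hyper-parameter optimization.
# Data below are synthetic placeholders, not the study's biosignals.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
# Three synthetic classes (disgust, fear, sadness), three features each,
# standing in for heart rate, GSR, and skin temperature.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(100, 3)) for m in (0.0, 2.0, 4.0)])
y = np.repeat(["disgust", "fear", "sadness"], 100)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# SVC handles multi-class classification internally (one-vs-one);
# the grid search tunes the RBF hyper-parameters C and gamma.
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.1, 1.0]},
    cv=5,
)
grid.fit(X_tr, y_tr)
acc = grid.score(X_te, y_te)
print(f"best params: {grid.best_params_}, test accuracy: {acc:.2f}")
```

Feature scaling before the SVM matters here because the RBF kernel is distance-based; in practice the smart-band channels would have very different units and ranges.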