Social psychological research indicates that bodily expressions convey important affective information, although this modality is relatively neglected in the literature compared to facial expressions and speech. In this paper we propose a real-time system that continuously recognizes emotions from body movement data streams. Low-level 3D postural features and high-level kinematic and geometrical features are fed, through summarization (statistical values) or aggregation (feature patches), to a random forests classifier. In a first stage, the MoCap UCLIC affective gesture database has been used for training the classifier, which led to an overall recognition rate of 78% using a 10-fold cross-validation (leave-one-out). Subsequently, the trained classifier was tested with different subjects using continuous Kinect data. A performance of 72% was reached in real time, demonstrating the efficiency and effectiveness of the proposed system.
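The pipeline described in the abstract (per-frame features summarized into statistics, then classified with random forests) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, feature count, synthetic data, and the use of scikit-learn's `RandomForestClassifier` are all assumptions.

```python
# Hedged sketch of the abstract's pipeline: summarize per-frame movement
# features into statistics, then classify with a random forest.
# Synthetic data stands in for MoCap/Kinect features; nothing here is
# taken from the paper's actual code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def summarize(window):
    """Collapse a (frames, features) window into per-feature statistics
    (mean, std, min, max), analogous to the paper's summarization step."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=200)          # 4 hypothetical emotion classes
# 200 movement windows of 60 frames x 15 features; a per-class offset
# makes the synthetic classes separable.
X = np.stack([summarize(rng.normal(size=(60, 15)) + y) for y in labels])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
print(clf.score(X, labels))
```

In a streaming setting, `summarize` would be applied to a sliding window over incoming Kinect frames and the trained classifier queried on each window, which is what makes continuous real-time recognition feasible.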
Wang, W, Enescu, V & Sahli, H 2013, Towards Real-Time Continuous Emotion Recognition from Body Movements. in AA Salah, H Hung, O Aran & H Gunes (eds), Lecture Notes in Computer Science. vol. 8212, 4th International Workshop on Human Behavior Understanding (HBU2013), Springer International Publishing, pp. 235-245, 4th International Workshop on Human Behavior Understanding, Barcelona, Spain, 22/10/13.
Wang, W., Enescu, V., & Sahli, H. (2013). Towards Real-Time Continuous Emotion Recognition from Body Movements. In A. A. Salah, H. Hung, O. Aran, & H. Gunes (Eds.), Lecture Notes in Computer Science (Vol. 8212, pp. 235-245). (4th International Workshop on Human Behavior Understanding (HBU2013)). Springer International Publishing.
@inproceedings{cffe528a56eb424e82644b619ece571b,
title = "Towards Real-Time Continuous Emotion Recognition from Body Movements",
abstract = "Social psychological research indicates that bodily expressions convey important affective information, although this modality is relatively neglected in the literature compared to facial expressions and speech. In this paper we propose a real-time system that continuously recognizes emotions from body movement data streams. Low-level 3D postural features and high-level kinematic and geometrical features are fed, through summarization (statistical values) or aggregation (feature patches), to a random forests classifier. In a first stage, the MoCap UCLIC affective gesture database has been used for training the classifier, which led to an overall recognition rate of 78% using a 10-fold cross-validation (leave-one-out). Subsequently, the trained classifier was tested with different subjects using continuous Kinect data. A performance of 72% was reached in real time, demonstrating the efficiency and effectiveness of the proposed system.",
keywords = "emotion recognition, real-time, bodily expressions",
author = "Weiyi Wang and Valentin Enescu and Hichem Sahli",
note = "Albert Ali Salah, Hayley Hung, Oya Aran, Hatice Gunes; 4th International Workshop on Human Behavior Understanding, HBU 2013 ; Conference date: 22-10-2013 Through 22-10-2013",
year = "2013",
month = oct,
day = "22",
language = "English",
isbn = "978-3-319-02713-5",
volume = "8212",
series = "4th International Workshop on Human Behavior Understanding (HBU2013)",
publisher = "Springer International Publishing",
pages = "235--245",
editor = "Salah, {Albert Ali} and Hayley Hung and Oya Aran and Hatice Gunes",
booktitle = "Lecture Notes in Computer Science",
}