Preprint Article, Version 2 (this version is not peer-reviewed)

Bringing Depth Data Alive: Conceive Human Intention through Web Visualisations of Head Pose and Emotion Changes

This paper is an extended version of our paper published in 8th Computer Science and Electronic Engineering (CEEC) [1].
Version 1 : Received: 31 May 2017 / Approved: 1 June 2017 / Online: 1 June 2017 (06:14:53 CEST)
Version 2 : Received: 7 July 2017 / Approved: 10 July 2017 / Online: 10 July 2017 (08:26:31 CEST)

A peer-reviewed article of this Preprint also exists.

Kalliatakis, G.; Stergiou, A.; Vidakis, N. Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions. Computers 2017, 6, 25.

Journal reference: Computers 2017, 6, 25
DOI: 10.3390/computers6030025

Abstract

Affective computing in general, and human activity and intention analysis in particular, are rapidly growing fields of research. Head pose and emotion changes present serious challenges when applied to players' training and ludology experience in serious games, analysis of customer satisfaction with broadcast and web services, or monitoring of a driver's attention. Given the increasing prominence and utility of depth sensors, it is now feasible to perform large-scale collection of three-dimensional (3D) data for subsequent analysis. Discriminative random regression forests were selected to rapidly and accurately estimate head pose changes in unconstrained environments. For the secondary process of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format (JavaScript Object Notation, JSON) is then employed to manipulate the data extracted from these two settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, utilising an affordable 3D sensing technology (the Microsoft Kinect sensor).
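The JSON exchange step described above can be sketched as follows. The field names and value shapes here are illustrative assumptions only, since the abstract does not specify the record format the system uses:

```python
import json

# Hypothetical per-frame records; field names are assumptions, not the
# paper's actual schema. Head pose is given as Euler angles in degrees.
head_pose_frame = {"timestamp": 0.033, "pitch": -4.2, "yaw": 12.7, "roll": 1.3}
emotion_frame = {"timestamp": 0.033, "emotion": "happiness", "confidence": 0.87}

# Bundle both streams into a single lightweight JSON payload, ready to be
# passed to a web visualisation front end.
payload = json.dumps({
    "head_pose": [head_pose_frame],
    "emotions": [emotion_frame],
})
```

A browser-side visualisation could then parse this payload with `JSON.parse` and plot the angle and emotion streams over time.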

Subject Areas

human activity analysis; human intention understanding; affective computing; data visualisation; depth data; head pose estimation; emotion recognition
