Affect Recognition of Human Players in VR Games

Recent research has shown that people can perceive another person's affective state from body movements alone, without seeing facial expressions. Building on this, machine learning algorithms have been trained to automatically recognize users' affect from their body movements for use in human-computer interaction. This project adapts and deploys the same type of algorithms in a VR environment, where only partial movement information is available (the positions of the user's head and hands). A collaboration between the Metacreation Lab for Creative AI and the Vancouver-based VR studio Inscape, the project's deliverable is an AI system that predicts the affective state of the VR user in real time.
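As a rough illustration of this kind of real-time pipeline (not the project's actual model), the sketch below classifies short windows of head and hand tracker positions into affect categories. The window length, feature set, classifier, and label names are all assumptions made for the example.

```python
# Minimal sketch: affect classification from VR tracker data (head + two hands).
# All specifics (labels, 1 s / 90-frame windows, random-forest model) are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

AFFECT_LABELS = ["relaxed", "excited", "frustrated"]  # illustrative label set

def window_features(window: np.ndarray) -> np.ndarray:
    """window: (frames, 9) array of x/y/z positions for head, left hand, right hand."""
    velocity = np.diff(window, axis=0)                 # frame-to-frame displacement
    return np.concatenate([
        window.mean(axis=0), window.std(axis=0),       # posture statistics
        np.abs(velocity).mean(axis=0),                 # average movement speed
        np.abs(velocity).max(axis=0),                  # movement peaks
    ])

# Train on labelled movement windows (synthetic placeholders stand in for real data).
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.normal(size=(90, 9))) for _ in range(300)])
y = rng.integers(0, len(AFFECT_LABELS), size=300)
clf = RandomForestClassifier(n_estimators=100).fit(X, y)

def predict_affect(recent_frames: np.ndarray) -> str:
    """Call with the most recent window of tracker frames for a real-time estimate."""
    return AFFECT_LABELS[clf.predict(window_features(recent_frames)[None, :])[0]]
```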

Intern: Mirjana Prpa
Faculty Supervisor: Philippe Pasquier
Province: British Columbia
Partner: Inscape
Partner University: 
Program: