Affect Recognition of Human Players in VR Games

Recent research has shown that people can perceive a person's affective state simply by looking at their body movements, without facial expressions. Building on this, machine learning algorithms have been trained to automatically recognize users' affect from their body movements for use in human-computer interaction. This project aims to adapt and deploy the same type of algorithms in a VR environment, where only partial movement information is available: the positions of the user's head and hands.
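As a rough illustration of the approach, partial tracking data can be summarized into movement features and fed to a standard classifier. Everything below is a minimal sketch under assumptions: the window length, the velocity-statistics features, the binary labels, and the random-forest model are illustrative choices, not the project's actual pipeline, and the data is synthetic.

```python
# Hypothetical sketch: classifying affect from partial VR tracking
# (head + two hand positions, i.e. 9 coordinates per frame).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def movement_features(window):
    """Summarize a (frames, 9) window of head/hand xyz positions
    into simple statistics: per-coordinate velocity mean and std."""
    velocity = np.diff(window, axis=0)      # frame-to-frame displacement
    return np.concatenate([velocity.mean(axis=0), velocity.std(axis=0)])

# Synthetic stand-in data: 100 windows of 60 frames, 9 tracked coordinates.
windows = rng.normal(size=(100, 60, 9))
labels = rng.integers(0, 2, size=100)       # e.g. 0 = calm, 1 = excited (assumed)

X = np.array([movement_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
pred = clf.predict(X[:5])
```

In practice the features, labels, and model would come from the body-movement affect-recognition literature the project adapts; the point here is only that head and hand trajectories reduce to fixed-length feature vectors a classifier can consume.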

Proximity Music System

Video games are highly interactive experiences. With modern approaches to game design, it becomes impossible to anticipate every situation a player can end up in. Building on research into generative music systems at the Metacreation Lab for Creative AI, PhD student Cale Plutt will research, design and implement a generative music system tailored to the Inscape VR game. In particular, the music generation will respond to the affective state of the game in terms of its valence, arousal and tension.
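One simple way to picture such a system is a mapping from an affective state to coarse musical controls. The sketch below is purely illustrative: the value ranges, the tempo formula, and the mode/dissonance rules are assumptions for demonstration, not the system the project will build.

```python
# Illustrative mapping from affect to musical parameters (assumed ranges:
# valence in [-1, 1], arousal and tension in [0, 1]).
def music_parameters(valence, arousal, tension):
    """Map an affective state to tempo, mode, and a dissonance control
    that a music generator could consume."""
    tempo_bpm = 60 + 80 * arousal            # higher arousal -> faster music
    mode = "major" if valence >= 0 else "minor"
    dissonance = min(1.0, tension)           # clamp the dissonance control
    return {"tempo_bpm": tempo_bpm, "mode": mode, "dissonance": dissonance}

params = music_parameters(valence=0.5, arousal=0.75, tension=0.2)
```

A real generative system would operate on far richer musical material, but the core idea of conditioning generation on valence, arousal and tension can be reduced to mappings of this shape.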

Machine-Learning-Based Artistic Photo Manipulation and Stylization on Mobile Devices

Recent advances in using machine learning for object recognition and image manipulation have resulted in a new and emerging market for mobile applications that use machine learning for creating a variety of new artistic expressions. This research will develop a framework for performing machine-learning-based photo and video manipulation on mobile devices with the goal of integrating it with the Generate Toolkit. This proposal follows previous MITACS internships between the same partners and further extends our objectives.

Understanding Real-time Particle Systems for Health, Entertainment and VR

The proposed research is a collaboration between Persistant Studios’ PopcornFX and SFU’s iVizLab to understand the processes involved in content creation with a real-time particle system. The iVizLab’s research focuses on real-time visuals driven by users’ biodata as one of the main interfaces for affective systems that can intelligently interact with users. To create visuals for the iVizLab, it is important to be able to produce content that can be modified in real time by the incoming data.

Molecular-Based Analytics for Prediction and Optimization of Performance

Recent advancements in the study of large biological datasets, coupled with powerful machine learning tools and new analytical technologies, have made it possible to predict health risks and potentially intervene before disease has manifested. Molecular You has developed a web-based interface that integrates these technologies into an easy-to-understand visual health report for consumers. However, many consumers do not need to worry about health risks, as they are relatively healthy and have no health risks identified by our platform.

MovingStories Interdisciplinary Research Cohort

MovingStories Research: Science and Art Collaboration

The MovingStories research partnership explores the design of digital tools for movement, meaning and interaction. Human movement is ubiquitous but is also complex – it is simultaneously expressive, communicative, functional and intelligent. MovingStories is an interdisciplinary research project that brings together scientists and artists in computing science, dance, music, theatre, machine learning, cognitive science, psychology, health, game design and virtual reality.

DeepCity

DeepCity is a strategy game initially for iOS, Android and desktop that is set in a strange, fascinating near future of mega-storms and scarcity. The goal: Survive. Defend. Regenerate. DeepCity advances concepts of resilience, urban ecology, sustainable futures, and systems-thinking within urban environments for players. The DeepCity prototype research project will provide the student with an opportunity to be creatively engaged in researching and making a new game for change, benefitting from mentorship and collaboration with accomplished industry professionals and academic researchers.

Music As a Fundamental Element of Gameplay Interaction

Music is a fundamental part of human culture and day-to-day life, yet it largely exists only as an accent in modern games. While music can arguably deepen and intensify in-game experiences, it is rarely an integrated part of the mechanics available in game worlds. Unlike any MMORPG or music-based game currently available, the game explored in this project will leverage the social and communicative elements of music as both a social network and a primary game mechanic.

The Social and Technical Effects of Device and Display Configurations on Telemedicine

Telemedicine systems can connect health practitioners and patients with specialists in geographically distant regions. Yet we still do not have a strong understanding of how such systems are used and how they support or impede the workflow of health professionals. This research explores how nurses, general practitioners, and specialists make use of telemedicine systems and how the portability of these systems affects usage.

Virtual Tour of an Interactive 3D Model of the Monastère des Ursulines de Québec

The project consists of creating an interactive 3D model of the Monastère des Ursulines de Québec for Web distribution, in order to showcase the heritage of the nuns and their monastery (recognized as a national heritage property). Our project is forward-looking in that it offers online visitors free navigation within the model. Indeed, the reconstruction is built on a gaming engine, allowing greater fluidity and interactivity while respecting the authenticity of the real site.