Expert Gestures for Multi-touch Interaction

Modern smartphones, tablets, and many notebook computers rely on multi-touch interaction to augment keyboard and mouse input. Multi-touch gestures typically consist of taps and swipes – simple gestures that don't exploit the full range of technical and human capabilities. In earlier work, we determined that users are willing to learn expert-level gestures, but often find them difficult to discover and challenging to learn without formal training. However, when an expert version of a standard gesture is demonstrated, users are willing to adopt and use it.

Our proposed research explores two questions:
1. How can we make these gestures more discoverable, so that users are aware of them?
2. Once users are aware of them, are expert gestures more effective for performing common tasks?

We will implement three expert gestures that improve on standard pinch-to-zoom. We will then run a controlled study in which users (a) encounter discoverability and notification mechanisms designed to help them find these gestures, and (b) use the gestures to perform common tasks (e.g. map navigation or drawing), allowing us to assess their effectiveness.
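For context, the sketch below shows a bare-bones version of the standard pinch-to-zoom baseline that the proposed expert gestures would build on, written against the browser Pointer Events API. The target element, the direct scale transform, and all identifiers are illustrative assumptions, not part of the proposed system.

```typescript
// Minimal sketch of baseline pinch-to-zoom using Pointer Events.
// "canvas" and the scale handling are hypothetical placeholders.
const active = new Map<number, { x: number; y: number }>();
let startDistance = 0;

function currentDistance(): number {
  const [a, b] = Array.from(active.values());
  return Math.hypot(a.x - b.x, a.y - b.y);
}

const target = document.getElementById("canvas")!; // assumed target element

target.addEventListener("pointerdown", (e: PointerEvent) => {
  active.set(e.pointerId, { x: e.clientX, y: e.clientY });
  if (active.size === 2) {
    startDistance = currentDistance(); // two contacts: record initial spread
  }
});

target.addEventListener("pointermove", (e: PointerEvent) => {
  if (!active.has(e.pointerId)) return;
  active.set(e.pointerId, { x: e.clientX, y: e.clientY });
  if (active.size === 2 && startDistance > 0) {
    const scale = currentDistance() / startDistance; // zoom relative to initial pinch
    target.style.transform = `scale(${scale})`;
  }
});

target.addEventListener("pointerup", (e: PointerEvent) => {
  active.delete(e.pointerId);
  if (active.size < 2) startDistance = 0; // pinch ends when a finger lifts
});
```

The expert variants proposed in the project would augment or replace this baseline mapping; their specific designs are part of the research itself.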

Faculty Supervisor:

Edward Lank

Student:

Jeff Avery

Discipline:

Computer science

University:

University of Waterloo

Program:

Globalink
