Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study in which we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone-to-public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited to gesture control. © 2010 ACM.
Author(s): Kray C, Nesbitt D, Dawson J, Rohs M
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: 12th International Conference on Human-Computer Interaction with Mobile Devices and Services
Year of Conference: 2010
Pages: 239-248
Publisher: ACM Press
URL: http://dx.doi.org/10.1145/1851600.1851640
DOI: 10.1145/1851600.1851640
ISBN: 9781605588353