Monday, October 24, 2011

Paper Reading #23: User-defined motion gestures for mobile interaction

User-Defined Motion Gestures for Mobile Interaction



Authors - Jaime Ruiz, Yang Li, and Edward Lank


Author Bios - Jaime Ruiz is a PhD student at the University of Waterloo and has been a visiting researcher at the Palo Alto Research Center.
Yang Li is a researcher at Google and earned his PhD from the Chinese Academy of Sciences.
Edward Lank is an Assistant Professor at the University of Waterloo and holds a PhD from Queen's University.


Venue - This paper was presented at CHI 2011, the ACM Conference on Human Factors in Computing Systems, and appears in the CHI '11 proceedings.


Summary


Hypothesis - The researchers state that too little is currently known about motion gestures for designers to create intuitive input methods, so they conducted a study to discover which gestures people find natural for specific tasks. The hypothesis is that such an end-user study will yield a standard set of motion gestures that is useful to mobile developers and hardware makers, and ultimately to users, who would finally be able to control their phones with intuitive motion gestures.


Methods - To develop a set of motion gestures that feel intuitive for specific tasks, the researchers conducted a study with 20 participants, each of whom was asked to propose a reasonable gesture for each task. Tasks were divided into action and navigation categories, and further into system and application subcategories within each. Participants were recruited from people who listed a smartphone as their primary mobile device. The phone's screen was locked and ran special software that recorded the device's movements but gave no feedback that could bias the participants. Participants were briefed on the procedure, and each session concluded with a short survey about their gestures and an interview.
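
The paper does not include the recording software itself, but the idea is simple: silently log the phone's raw motion data while the participant gestures. Below is a minimal Kotlin sketch of such a feedback-free logger; the sample format, CSV output, and class names are my own assumptions, not the researchers' implementation.

import java.io.File

// One raw accelerometer reading: a timestamp in milliseconds plus
// acceleration along the device's x, y, and z axes in m/s^2.
data class AccelSample(val timeMs: Long, val x: Float, val y: Float, val z: Float)

// Appends each sample to a CSV file and deliberately produces no
// user-visible output, mirroring the study's "no feedback" constraint.
class SilentGestureLogger(private val logFile: File) {
    fun record(sample: AccelSample) {
        logFile.appendText("${sample.timeMs},${sample.x},${sample.y},${sample.z}\n")
    }
}

fun main() {
    val logger = SilentGestureLogger(File("gesture_trial.csv"))
    // In the real study these samples would stream from the phone's sensors.
    logger.record(AccelSample(0L, 0.1f, 0.0f, 9.8f))
    logger.record(AccelSample(20L, 0.3f, -0.2f, 9.7f))
}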


Results - Four common themes emerged from the study:

  • Mimic normal use - Most participants preferred gestures drawn from how the phone is already handled for a task (e.g., raising the phone to the ear to answer a call).
  • Real-world metaphors - Treating the phone as another real-world object and acting on it accordingly (e.g., hanging up by turning the phone face down, like setting a handset back in its cradle).
  • Natural and consistent mappings - Matching user expectations, so opposite actions use opposite motions (e.g., flicking right for one action and left for its opposite).
  • Providing feedback - Participants wanted confirmation both while a gesture is being performed and when the action completes.
The researchers classified the 380 collected gestures by their logical mappings and physical characteristics. Finally, they combined all of the results into a consensus set: a representative mapping of motion gestures to the actions and functions they should trigger (sketched in code below).
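
The paper presents this consensus set as a diagram; purely as an illustration, the agreed mapping could be encoded as a lookup table like the Kotlin sketch below. The task names and gesture descriptions paraphrase the themes above and are not the paper's exact set.

// Tasks drawn from the study's action/navigation categories (illustrative subset).
enum class Task { ANSWER_CALL, IGNORE_CALL, NEXT_ITEM, PREVIOUS_ITEM }

// A consensus mapping from task to motion gesture, echoing the themes:
// mimicking normal use, real-world metaphors, and consistent opposites.
val consensusGestures: Map<Task, String> = mapOf(
    Task.ANSWER_CALL to "raise the phone to the ear",    // mimic normal use
    Task.IGNORE_CALL to "turn the phone face down",      // real-world metaphor
    Task.NEXT_ITEM to "flick the phone to the right",    // consistent mapping...
    Task.PREVIOUS_ITEM to "flick the phone to the left"  // ...with its opposite
)

fun main() {
    consensusGestures.forEach { (task, gesture) -> println("$task -> $gesture") }
}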





Content - This user-defined gesture set has several implications, which the researchers discuss. The first is supporting a standard set of gestures, like the one derived in this study, across all platforms in order to establish consistency. Improving how mobile operating systems recognize motion gestures would also help accomplish this, because the gestures must be recognized reliably every time for users to trust them; a minimal sketch of one such recognizer follows below.
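
The paper argues for OS-level support rather than prescribing a recognition algorithm, so the Kotlin sketch below is only my illustration of how one such gesture, turning the phone face down, might be detected from accelerometer readings. The threshold and hold time are assumed values, not from the paper.

// Detects the "turn phone face down" gesture (e.g., to end or silence a call)
// from timestamped accelerometer readings. A phone lying face down reads
// roughly -9.8 m/s^2 of gravity on its z-axis.
class FaceDownDetector(
    private val zThreshold: Float = -8.0f, // assumed threshold, in m/s^2
    private val holdMs: Long = 500L        // must stay face down this long
) {
    private var faceDownSince: Long? = null

    // Feed readings in time order; returns true once the phone has been
    // face down long enough, which filters out brief accidental tilts.
    fun onReading(timeMs: Long, z: Float): Boolean {
        if (z >= zThreshold) {
            faceDownSince = null
            return false
        }
        val start = faceDownSince ?: timeMs.also { faceDownSince = it }
        return timeMs - start >= holdMs
    }
}

fun main() {
    val detector = FaceDownDetector()
    // Simulated readings: the phone is flipped face down at t = 100 ms.
    val readings = listOf(0L to 9.8f, 100L to -9.5f, 400L to -9.6f, 700L to -9.7f)
    for ((t, z) in readings) {
        if (detector.onReading(t, z)) println("Face-down gesture recognized at $t ms")
    }
}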


Conclusion - The researchers conclude by stating that follow-up research is still needed to establish whether these gestures are easily understood across cultural groups and age ranges. This research shows that there already exists broad agreement on which gestures should perform the actions proposed here, suggesting that such a gesture set would be adopted quickly and used for greater efficiency.


Discussion


I think the researchers achieved their goal of developing gestures for common tasks that many people can agree on, and they showed where the work fits in the bigger picture, for instance how system and application design could be modified to support these gestures. I found this study interesting because it points out a gaping hole in current mobile software: there should be a standard for motion gestures by now.
