Authors - Andrew Bragdon, Eugene Nelson, Yang Li, and Ken Hinckley
Author Bios - Andrew Bragdon is a second-year PhD student at Brown University focusing on human-computer interaction.
Eugene Nelson is a PhD student at Brown University.
Yang Li is a researcher at Google and earned his PhD from the Chinese Academy of Sciences.
Ken Hinckley is a Principal Researcher at Microsoft Research and has a PhD from the University of Virginia.
Venue - This paper was presented at CHI '11, the 2011 annual conference on Human Factors in Computing Systems.
Summary
Hypothesis - In this paper, the researchers question the effectiveness of soft buttons on touch screens and propose bezel-initiated gestures as a viable alternative for developers and designers. The hypothesis is that, in non-ideal settings where distractions are present, soft buttons will result in more errors than gestures and will reduce users' awareness of their surroundings.
Content - Previous studies have shown that users prefer to use one hand to interact with a phone, so all gestures tested are easy to perform with one hand. The 4 factors examined were:
- Moding Technique - This includes Hard Buttons (easy to locate by feel due to the material used and providing tactile feedback), Bezel Gestures (initiated by beginning the gesture very close to the edge of the screen, after which the stroke is recognized as input; see the sketch after this list), and Soft Buttons (standard-sized buttons with a single random black character in the middle)
- Gesture Type - Divided into Mark-Based (straight strokes aligned with various axes) and Free-Form (paths such as circles or letters)
- User's Motor Activity - Either sitting/standing (no difference was found between the 2) or walking (using a treadmill)
- Distraction Level of Environment - The 3 levels of distraction are no distraction (the user does not need to look anywhere else), a moderate situational-awareness task (the user can only glance at the phone in between tasks), and an attention-saturating task (the user could not look at the phone at all during a continuous task).
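To make the bezel-gesture idea concrete, here is a minimal sketch (not the authors' actual implementation) of how an Android touch handler could treat a stroke as bezel-initiated and then classify it as a straight, axis-aligned mark. The BEZEL_MARGIN_PX threshold, the 4-direction classifier, and the onMark callback are my own assumptions for illustration.

```java
import android.view.MotionEvent;
import android.view.View;

// Sketch of bezel-initiated, mark-based gesture handling on Android.
// Thresholds and the mark classifier are illustrative assumptions.
public class BezelMarkDetector implements View.OnTouchListener {

    // Hypothetical margin: a touch starting within this many pixels
    // of any screen edge is treated as bezel-initiated.
    private static final float BEZEL_MARGIN_PX = 20f;

    private boolean bezelGestureActive = false;
    private float startX, startY;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                startX = event.getX();
                startY = event.getY();
                // Only a stroke that begins very close to the edge of the
                // screen enters gesture (command) mode.
                bezelGestureActive =
                        startX < BEZEL_MARGIN_PX
                        || startY < BEZEL_MARGIN_PX
                        || startX > v.getWidth() - BEZEL_MARGIN_PX
                        || startY > v.getHeight() - BEZEL_MARGIN_PX;
                return bezelGestureActive;

            case MotionEvent.ACTION_UP:
                if (bezelGestureActive) {
                    onMark(classifyMark(event.getX() - startX,
                                        event.getY() - startY));
                    bezelGestureActive = false;
                    return true;
                }
                return false;
        }
        return bezelGestureActive;
    }

    // Classifies a straight stroke into one of four axis-aligned marks
    // by looking at the dominant displacement axis.
    private String classifyMark(float dx, float dy) {
        if (Math.abs(dx) > Math.abs(dy)) {
            return dx > 0 ? "RIGHT" : "LEFT";
        } else {
            return dy > 0 ? "DOWN" : "UP";
        }
    }

    // Placeholder dispatch; a real app would map each mark to a command.
    private void onMark(String mark) {
    }
}
```

The point of the sketch is that the only extra requirement compared to a normal stroke is the start-near-the-edge check, which is why these gestures can be performed without looking at the screen.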
Methods - 15 participants were recruited for the study. 2 Android phones were used, one modified with a hard button and one standard. For each environment (Sitting/No Distraction, Sitting/Moderate Situational-Awareness Task, Walking/Moderate Situational-Awareness Task, and Sitting/Attention-Saturating Task), users performed 2 training blocks and 6 regular blocks of 12 commands, and they completed a questionnaire at the end of the study.
Results - The following was found:
- There was a significant difference in completion time across the different environments
- Bezel Gestures were found to be quicker than the other 2 methods of input
- Bezel Gestures' performance gains were more noticeable in the environments where distractions were present
- Bezel Gestures were unaffected by changes in environment, but soft buttons saw significant degradation in performance when the user could not look at the screen constantly
- Accuracy was similar across all modes but free-form paths showed significantly more errors than marks
- Gesture marks were preferred in every environment except sitting with no distractions, where soft buttons were the favorite
Conclusion - The researchers conclude that bezel-initiated, mark-based general shortcuts should be provided where soft buttons are traditionally used, citing the similar performance of the 2 in a still, distraction-free environment and the tremendous advantage of bezel gestures when full attention cannot be given to the phone.
Discussion
I think the researchers achieved their goal and showed that bezel gestures are a viable alternative to soft buttons. I think anything that can remove the need for soft buttons is a good thing, as screen space is saved and constant visual attention is not required. It was interesting to see just how similar the 2 modes were in the still environment, while the gestures were staggeringly better when full attention was not given to the phone.