A Framework for Robust and Flexible Handling of Inputs with Uncertainty
Julia Schwarz, Scott Hudson, Jennifer Mankoff, Andrew D. Wilson
Julia Schwarz is a PhD student at Carnegie Mellon University studying human-computer interaction.
Scott Hudson is a professor at Carnegie Mellon University and earned a PhD from the University of Colorado.
Jennifer Mankoff is an associate professor at Carnegie Mellon University and has a PhD from the Georgia Institute of Technology.
Andrew D. Wilson is a researcher at Microsoft Research and has a PhD from MIT Media Laboratory.
This paper was presented at UIST '10, the 23rd annual ACM Symposium on User Interface Software and Technology.
Summary
In this paper, the researchers describe how simple GUI concepts are not good enough for dealing with touch input, especially when attempting to predict what action the user actually intends. They hypothesize that a framework can be built that handles input flexibly, carrying uncertainty along as part of the input itself, and thereby improves the match between what a user is trying to do and what action is actually performed.
The framework developed by the researchers is primarily concerned with event dispatch, that is, deciding which item an event should be sent to and what action the event requests, as it relates to uncertain input. Dispatch decisions are made by a mediator, which can execute the most probable action, take no action, or ask the user for clarification.
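To make the mediator idea concrete, here is a minimal Python sketch of mediated dispatch for a single uncertain touch event. This is my own illustration, not the authors' implementation; all names, thresholds, and probabilities are hypothetical.

    # Sketch of mediated dispatch for an uncertain touch event.
    # All names and threshold values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        interactor: str      # UI element that might be the intended target
        action: str          # action the event would request on that element
        probability: float   # likelihood this interpretation is what the user meant

    def mediate(candidates, act_threshold=0.8, ignore_threshold=0.2):
        """Pick one of three outcomes: execute, ignore, or ask for clarification."""
        best = max(candidates, key=lambda c: c.probability)
        if best.probability >= act_threshold:
            return ("execute", best)       # confident: run the most probable action
        if best.probability <= ignore_threshold:
            return ("ignore", None)        # too uncertain: take no action
        return ("clarify", candidates)     # in between: ask the user to disambiguate

    # Example: a fat-finger touch that overlaps two buttons.
    candidates = [
        Candidate("save_button", "click", 0.55),
        Candidate("delete_button", "click", 0.45),
    ]
    decision, payload = mediate(candidates)
    print(decision)  # -> "clarify", so the UI might show a small confirmation prompt

In the paper the mediation step is what lets the system defer a final decision instead of committing to a possibly wrong interpretation the instant the event arrives.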
The researchers then present six example uses to validate that the framework works as desired. The examples show how their toolkit can be applied, with little effort, to common ambiguous inputs that touch interfaces face, such as uncertain item selection and uncertain action requests.
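As a rough illustration of what uncertain item selection involves, the sketch below scores which of two small, closely packed targets an imprecise touch most likely meant. Again, this is my own example rather than code from the paper, and the distance-based model and pixel values are assumptions.

    # Illustrative scoring of an imprecise touch over small, overlapping targets.
    import math

    def selection_scores(touch_xy, targets, sigma=20.0):
        """Return a normalized probability per target based on distance from the touch.
        `targets` maps a target name to its (x, y) center; sigma models finger
        imprecision in pixels. Specific values are assumptions for illustration."""
        tx, ty = touch_xy
        weights = {
            name: math.exp(-((tx - x) ** 2 + (ty - y) ** 2) / (2 * sigma ** 2))
            for name, (x, y) in targets.items()
        }
        total = sum(weights.values())
        return {name: w / total for name, w in weights.items()}

    # A touch lands between two small list items; neither is a clear winner.
    print(selection_scores((100, 105), {"item_a": (100, 90), "item_b": (100, 120)}))

With scores like these attached to the event, the mediator has something principled to act on instead of a single hard-coded hit target.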
The researchers conclude that their toolkit can easily be applied to touch interfaces and will become more useful as the industry continues to add support for mobile devices, which often have small screens and small on-screen elements.
Discussion
I think the researchers achieve their goal of supplying a better way to handle uncertain touch gestures, but they do not present any data on how big a problem uncertain input really is, nor do they conduct a user study to see what users would think of such improvements.
This paper is interesting because it attempts to solve a problem that has only recently become an issue with the rise of mobile platforms. I am less sure about future work in this area, because careful interface design can fix many of these problems within individual apps. The research may have more impact in a richer environment such as a desktop OS, where many applications commonly run at the same time in close proximity to one another, but touch input is not yet a significant part of that market.
