Gesture Search: A Tool for Fast Mobile Data Access
Yang Li
Yang Li is a researcher at Google and earned his PhD from the Chinese Academy of Sciences.
Summary
In this paper, the researcher presents Gesture Search, a tool that recognizes handwritten gestures and uses them to search data on a mobile device. The hypothesis is that such a system can shorten access time and make navigating increasingly complex mobile interfaces quicker and easier.
Next, the researcher explains the basics of how Gesture Search works. A user draws a letter on the screen, and search results appear ranked with the most commonly accessed items first. The user can then draw more letters to refine the search, or select the desired item and act on it, for example calling a contact or opening an application.
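The refine-as-you-draw loop described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation; the function name, item names, and access counts are all invented for the example:

```python
# Sketch of Gesture Search's core interaction: each recognized letter
# narrows the candidate list, and matches are ranked by how often the
# user has accessed them before (hypothetical data, not from the paper).

def search(items, query, access_counts):
    """Return items whose names start with `query`, most-accessed first."""
    q = query.lower()
    matches = [name for name in items if name.lower().startswith(q)]
    return sorted(matches, key=lambda name: -access_counts.get(name, 0))

items = ["Anna", "Andrew", "Alarm", "Maps"]
counts = {"Andrew": 9, "Anna": 3}  # illustrative access history

print(search(items, "a", counts))   # → ['Andrew', 'Anna', 'Alarm']
print(search(items, "an", counts))  # refined after a second gesture → ['Andrew', 'Anna']
```

Each additional gesture simply re-runs the query with a longer prefix, which is why one to three letters are usually enough to surface the target item.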
The researcher then describes many of the interactions with the device, such as distinguishing GUI input (scrolling, tap selection) from gesture input, searching from a gesture query, and using access history to improve results. Telling GUI input apart from gesture input was difficult at first, but a user study helped develop a model that quickly distinguishes scrolling from gesture strokes. Searching with a gesture query behaves much like any other mobile search: results update in real time as new input is entered. Frequency-based ranking stores access counts in a local store and applies a ranking algorithm that weighs this information when ordering results.
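The paper's scroll-versus-gesture model is learned from user data; a crude hand-written heuristic in the same spirit, purely for illustration, might classify a touch trajectory by its straightness and direction:

```python
import math

def classify_stroke(points):
    """Crudely classify a touch trajectory as 'scroll' or 'gesture'.

    `points` is a list of (x, y) samples. This simple heuristic (not the
    paper's learned model) treats a nearly straight, mostly vertical
    stroke as scrolling and anything else as a letter gesture.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    net = math.hypot(x1 - x0, y1 - y0)        # straight-line displacement
    path = sum(math.hypot(bx - ax, by - ay)   # total distance traveled
               for (ax, ay), (bx, by) in zip(points, points[1:]))
    straightness = net / path if path else 1.0
    vertical = abs(y1 - y0) > abs(x1 - x0)
    return "scroll" if straightness > 0.95 and vertical else "gesture"

print(classify_stroke([(100, 10), (101, 60), (100, 120)]))        # → scroll
print(classify_stroke([(10, 10), (60, 60), (10, 60), (60, 10)]))  # → gesture
```

A learned model, as in the paper, can make this decision incrementally from the first few samples of a stroke, which is what allows the app to react before the stroke is even finished.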
Lastly, the researcher conducted a study in which employees at one company used the app in their daily lives. The study found that the majority of data accesses were to contacts and applications, and that most queries needed only one to three characters to reach the desired item. A survey at the end of the study showed that participants enjoyed the product and found it useful.
Discussion
I think the researcher achieved his goal of providing a quicker way to access data using gestures. Letter recognition may not be the best approach, though: in my experience, typing the first three letters of an application's name can be as fast as, if not faster than, gesture recognition that imposes a long wait between letters. The aspect of this product that interested me most was user-defined gestures, which were barely mentioned; those could enable consistent one-gesture shortcuts.