As UX practitioners, we use different methods to evaluate and measure the user experience. Eye tracking, for one, can help us understand how people consume visual information, which is a key aspect of interacting with most interfaces. Based on information obtained, we can detect usability problems or quantify the effectiveness of designs.
In addition to being used for research and diagnostic purposes, eye trackers can facilitate human-computer interaction, allowing users to control an interface with their eyes. Gaze-controlled systems have been assisting people with disabilities for several years, but with the technology becoming more affordable and portable, gaze control seems to be ready for a wider audience.
This year at the International Consumer Electronics Show (CES 2013), Tobii, a leading maker of eye tracking systems, unveiled the Tobii REX, the world’s first eye tracking peripheral for the general consumer market. The device, along with the Tobii Gaze software, met with a very positive response at CES and beyond.
At first I was skeptical about eye-controlled interactions. I expected my eyes to do things my computer should not react to. In the eye tracking world, this is known as the “Midas touch problem”: everywhere the user looks, a new function is activated, even when the eye movement was involuntary or meant only to gather information.
The Tobii REX demo at CES overcame this dilemma by combining gaze focus with a manual input (e.g., a button press or a mouse click) to indicate selection. The outcome seemed very accurate. After a quick calibration, the system tracked my eyes and shifted the focus on screen accordingly. It was as if my eyes placed the mouse cursor over items on the screen: I just had to look, then click or scroll. I selected Windows 8 menu items, shot asteroids in a game, zoomed in on Chicago on a map, and did not experience a single incorrect selection.
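The interaction style the demo used can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the Tobii API: a stream of (x, y) gaze samples only moves a focus point, and nothing is activated until an explicit manual confirmation arrives, which is what sidesteps the Midas touch problem.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A clickable element on screen, with a circular hit area."""
    name: str
    x: int
    y: int
    radius: int

    def contains(self, gx: int, gy: int) -> bool:
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.radius ** 2

class GazePointer:
    """Gaze moves the focus; only a manual input activates anything."""

    def __init__(self, targets):
        self.targets = targets
        self.focused = None  # target currently under the gaze, if any

    def on_gaze_sample(self, gx: int, gy: int) -> None:
        # Looking only shifts focus -- it never triggers an action,
        # so involuntary or information-gathering glances are harmless.
        self.focused = next(
            (t for t in self.targets if t.contains(gx, gy)), None
        )

    def on_click(self):
        # The manual input (button press / mouse click) confirms selection.
        return self.focused.name if self.focused else None

# Usage: glance at a target, then click to select it.
pointer = GazePointer([Target("asteroid", 100, 100, 20)])
pointer.on_gaze_sample(105, 98)   # gaze lands on the asteroid: focus only
print(pointer.on_click())         # explicit click selects it
```

The key design point is the separation of concerns: the eyes do what they do naturally, point, while the deliberate motor action carries the intent.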
After my positive experience at CES, I wanted to use the Tobii REX with the mouse-heavy design software I rely on daily. With my background in ergonomics, I see potential for this system to mitigate repetitive motion disorders, such as carpal tunnel syndrome. Specialized workers, such as healthcare practitioners in sterile environments or factory workers wearing bulky gloves, may also benefit from gaze input.
Where this technology goes now depends largely on what interface designers and developers do when integrating it into their applications. Tobii REX’s success will likely be determined by its accuracy, usefulness, and ease of use. Since accuracy and usefulness both seem promising, the usability of the end products should be where we focus our attention next.
Melinda Jamil is a Senior User Experience Specialist at GfK User Centric. She has led a range of projects from international healthcare-related usability studies to user-centered interface design for handheld devices and kiosk screens. To reach Melinda, please email email@example.com.
(Originally posted on the GfK User Centric “UX Nuggets” blog.)