Since the 1990s, gaze technology - or eye tracking - has helped people with conditions such as motor neurone disease (MND), cerebral palsy and other 'locked-in' syndromes to control 2D desktop environments and communicate using visual keyboards.
Users typically guide a cursor with their eyes, staring at objects for a set time to emulate a mouse click. But that is too laborious to let users match the speed and accuracy demanded by real-time 3D games, say researchers working on the project in Leicester, UK. The team are developing the software as part of the EU-funded Communication by Gaze Interaction (COGAIN) project.
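The dwell-to-click emulation described above can be sketched roughly as follows. This is a hypothetical illustration, not COGAIN's actual code: the class name, the one-second dwell threshold and the 30-pixel tolerance radius are all assumptions.

```python
import time

# Hypothetical sketch of dwell-to-click emulation. The threshold and
# radius values are illustrative assumptions, not COGAIN's settings.
DWELL_TIME = 1.0    # seconds the gaze must rest to count as a click
DWELL_RADIUS = 30   # pixels the gaze may wander and still be "resting"

class DwellClicker:
    def __init__(self, dwell_time=DWELL_TIME, radius=DWELL_RADIUS):
        self.dwell_time = dwell_time
        self.radius = radius
        self._anchor = None   # (x, y) where the current dwell started
        self._start = None    # timestamp when it started

    def update(self, x, y, now=None):
        """Feed each gaze sample; returns (x, y) when a dwell-click fires."""
        now = time.monotonic() if now is None else now
        moved = self._anchor is not None and \
            (x - self._anchor[0]) ** 2 + (y - self._anchor[1]) ** 2 > self.radius ** 2
        if self._anchor is None or moved:
            self._anchor, self._start = (x, y), now   # gaze moved: restart timer
            return None
        if now - self._start >= self.dwell_time:
            self._start = now                         # fire, then re-arm
            return self._anchor
        return None
```

Feeding the clicker a stream of gaze samples that rest near one point for longer than the threshold would emit a click at that point; any larger jump restarts the timer, which is also why the technique is slow compared with a physical mouse button.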
Even though a user in, say, Second Life might look able-bodied, if they cannot operate and communicate as fast as everyone else they could still be perceived as having a disability - a privacy concern for players who would prefer not to reveal their disability in the virtual world.
In virtual worlds, gamers need to perform a whole suite of commands including moving their character or avatar, altering their viewpoint on the scene, manipulating objects and communicating with other players.
Eye-gaze systems bounce infrared light from LEDs at the bottom of a computer monitor and track a person's eye movements using stereo infrared cameras. This setup can calculate where on a screen the user is looking with an accuracy of about 5 mm.
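Turning tracked eye movements into an on-screen point relies on a calibration step: the user looks at known targets and the system fits a mapping from raw eye-camera coordinates to screen coordinates. The sketch below is a generic illustration of that idea, assuming a least-squares quadratic fit; the function names and the polynomial model are not taken from the article.

```python
import numpy as np

# Hypothetical calibration sketch: fit a mapping from raw eye-feature
# coordinates to screen coordinates using least squares. The quadratic
# feature map is a common choice but an assumption here.
def fit_gaze_mapping(eye_xy, screen_xy):
    """Fit screen = features(eye) @ coeffs from calibration samples."""
    ex, ey = np.asarray(eye_xy, dtype=float).T
    F = np.column_stack([np.ones_like(ex), ex, ey, ex * ey, ex**2, ey**2])
    coeffs, *_ = np.linalg.lstsq(F, np.asarray(screen_xy, dtype=float), rcond=None)
    return coeffs

def predict_gaze(coeffs, ex, ey):
    """Map one raw eye sample to an (x, y) screen estimate."""
    f = np.array([1.0, ex, ey, ex * ey, ex**2, ey**2])
    return f @ coeffs
```

In a real tracker the raw inputs would come from the stereo infrared cameras (for example, pupil-to-corneal-glint vectors), and the residual error of such a fit is what limits the system to roughly the 5 mm accuracy mentioned above.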
A ‘gaze gesture’ is also built in to temporarily turn off the eye-gaze functions altogether, to avoid unintentionally selecting an item while looking around the screen.
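A gaze-gesture toggle of this kind might be implemented as a small state machine that watches for a deliberate, unlikely-by-accident eye movement. The sketch below assumes the gesture is a quick left-right-left glance across screen regions; the article does not say which gesture COGAIN actually uses, so the sequence and names here are illustrative.

```python
# Hypothetical 'gaze gesture' toggle. The left-right-left sequence is an
# assumption for illustration; the actual COGAIN gesture is not described.
class GestureToggle:
    SEQUENCE = ["left", "right", "left"]   # assumed toggle gesture

    def __init__(self):
        self.enabled = True    # gaze-controlled selection active?
        self._progress = 0     # how much of the gesture has been seen

    def feed(self, region):
        """Feed the screen region the gaze entered; toggle on a full match."""
        if region == self.SEQUENCE[self._progress]:
            self._progress += 1
            if self._progress == len(self.SEQUENCE):
                self.enabled = not self.enabled   # gesture complete: toggle
                self._progress = 0
        elif region == self.SEQUENCE[0]:
            self._progress = 1   # gesture may be restarting
        else:
            self._progress = 0   # ordinary looking around: reset
        return self.enabled
```

Because ordinary browsing of the screen rarely produces the full sequence, the user can look around freely without triggering selections, then repeat the gesture to switch gaze control back on.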
"Clearly our eyes are perceptual organs, not designed for pointing and selecting. You can't turn them off, like you can lift your hand off the mouse."
Enabling someone to express themselves and engage with people in ways that they can't do in real life - because they are restricted to a wheelchair or a bed - can have a really positive effect on their self-esteem and motivation.