Loughborough University Institutional Repository

Please use this identifier to cite or link to this item: https://dspace.lboro.ac.uk/2134/3016

Title: Environmental control by remote eye tracking
Authors: Shi, Fangmin
Gale, Alastair G.
Keywords: eye-tracking
environmental control
object recognition
disabled people
Issue Date: 2007
Publisher: © COGAIN
Citation: SHI, F. and GALE, A. G. (2007). Environmental control by remote eye tracking. IN: Proceedings of COGAIN 2007: gaze-based creativity, interacting with games and on-line communities, 3rd-4th September, Leicester, pp. 49-52
Abstract: Eye movement interfacing can be found in some specially designed environmental control systems (ECSs) for people with severe disability. Typically the user sits in front of a computer monitor and their eye gaze direction is detected, which controls the cursor position on the screen. The ECS screen usually consists of a number of icons representing different controllable devices, and an eye fixation landing within a pre-defined icon area activates a selection for control. Such systems are widely used in homes, offices, schools, hospitals, and long-term care facilities. Wellings and Unsworth (1997) demonstrated that user-friendly interface design is the weak link in ECS technology, in particular for severely disabled people. Disabled individuals need straightforward control of their immediate surroundings, so making a detailed menu selection by techniques such as eye-screen interaction can be a difficult and tedious process for some individuals. This situation can be exacerbated by real-world issues such as eye tracking systems that do not tolerate the user's head movement. This paper presents a different approach to environmental control using eye gaze selection, in which the control options applicable to a given device are automatically pre-selected by means of the user directly looking at the device in their environment. This intuitive method therefore minimises the amount of navigation that the user must perform. To date, two main methods have been employed to achieve this direct eye-device control. The initial development using a head-mounted eye tracker was previously reported (Shi et al., 2006). This paper describes subsequent development of the system (Shi et al., 2007) using a remote eye tracker, which is simply situated in front of the user with no need for any attachment to them.
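
The fixation-within-icon selection step described in the abstract can be illustrated with a minimal dwell-selection sketch. This is not the authors' implementation: the icon layout, the 50 Hz gaze sample rate and the 800 ms dwell threshold are assumptions chosen purely for illustration.

# Minimal sketch of dwell-based gaze selection over device icons (illustrative only;
# region layout, sample rate and dwell threshold are assumptions, not from the paper).
from dataclasses import dataclass

@dataclass
class IconRegion:
    name: str      # controllable device represented by the icon
    x: int         # top-left corner, screen pixels
    y: int
    width: int
    height: int

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point falls inside this icon's rectangle
        return self.x <= gx < self.x + self.width and self.y <= gy < self.y + self.height

class DwellSelector:
    """Fires a selection once gaze has stayed inside one icon for dwell_ms."""

    def __init__(self, regions, dwell_ms=800):
        self.regions = regions
        self.dwell_ms = dwell_ms
        self._current = None   # icon currently fixated, if any
        self._elapsed = 0.0    # time accumulated inside that icon (ms)

    def update(self, gx, gy, dt_ms):
        """Feed one gaze sample; returns the selected icon name or None."""
        hit = next((r for r in self.regions if r.contains(gx, gy)), None)
        if hit is not self._current:
            # Gaze moved to a different region (or off all icons): restart the dwell timer
            self._current, self._elapsed = hit, 0.0
            return None
        if hit is None:
            return None
        self._elapsed += dt_ms
        if self._elapsed >= self.dwell_ms:
            self._elapsed = 0.0   # reset so the icon can be selected again later
            return hit.name
        return None

if __name__ == "__main__":
    icons = [IconRegion("lamp", 50, 50, 120, 120),
             IconRegion("television", 250, 50, 120, 120)]
    selector = DwellSelector(icons, dwell_ms=800)
    # Simulated gaze samples at ~50 Hz (20 ms apart), all landing on the lamp icon
    for _ in range(50):
        choice = selector.update(100, 100, dt_ms=20)
        if choice:
            print("Selected:", choice)   # prints "Selected: lamp" after 800 ms of dwell
            break

The same hit-test-plus-dwell logic applies whether the targets are on-screen icons (the conventional ECS interface) or, as in the paper's approach, real devices located by a remote eye tracker; only the source of the region coordinates changes.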
Description: This is a conference paper. The original version is available online at: http://www.cogain.org/cogain2007/COGAIN2007Proceedings.pdf
URI: https://dspace.lboro.ac.uk/2134/3016
Appears in Collections: Conference Papers (Computer Science)

Files associated with this item:

File: PUB510 Environmental control by remote eye tracking.pdf
Size: 133.76 kB
Format: Adobe PDF

 


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.