Exploring Eye-Gaze Wheelchair Control
In ACM Symposium on Eye Tracking Research and Applications, 2020
Abstract
Eye gaze may be used for steering wheelchairs or robots and thereby support independence in choosing where to move. This paper investigates the feasibility of gaze-controlled interfaces. We present a wheelchair-control experiment in a simulated virtual reality (VR) driving environment and a field study with five people using wheelchairs. In the VR experiment, three control interfaces were tested by 18 able-bodied subjects: (i) dwell buttons for direction commands on an overlay display, (ii) steering by continuous gaze-point assessment on the ground plane in front of the driver, and (iii) waypoint navigation to targets placed on the ground plane. Results indicate that the waypoint method had superior performance, and it was also the most preferred by the users, closely followed by the continuous-control interface. However, the field study revealed that our wheelchair users felt uncomfortable and excluded when they had to look down at the floor to steer a vehicle. Hence, our VR testing had a simplified representation of the steering task and ignored an important part of the use context. In the discussion, we suggest potential improvements to simulation-based design of wheelchair gaze-control interfaces.