Even as the sophistication and power of computer-based vision systems are growing, the human visual system remains unsurpassed in many visual tasks. Vision delivers a rich representation of the environment without conscious effort, but the perception of a high-resolution, wide field-of-view scene is largely an illusion made possible by the concentration of visual acuity near the center of gaze, coupled with a large, low-acuity periphery. Human observers are typically unaware of this extreme anisotropy because the visual system is equipped with a sophisticated oculomotor system that rapidly moves the eyes to sample the retinal image several times every second. These eye movements are programmed and executed at a level below conscious awareness, so self-report is an unreliable way to learn how trained observers perform complex visual tasks. Eye movements have been studied extensively under controlled laboratory conditions, but their use as a metric of visual performance in complex, real-world tasks remains a powerful, under-utilized tool for the study of high-level visual processes. Recorded gaze patterns provide externally visible markers of the spatial and temporal deployment of attention to objects and actions. To study vision in the real world, we have developed a self-contained, wearable eyetracker for monitoring complex tasks. The eyetracker can be worn for an extended period of time, does not restrict natural movements or behavior, and preserves peripheral vision. The wearable eyetracker can be used to study performance in a range of visual tasks, from situational awareness to directed visual search.

Date of creation, presentation, or exhibit

2003


Proceedings of MSS-CCD (2003). This article may also be accessed on the College of Imaging Science's website. Note: imported from RIT’s Digital Media Library running on DSpace to RIT Scholar Works in February 2014.

Document Type

Conference Proceeding

Department, Program, or Center

Chester F. Carlson Center for Imaging Science (COS)


RIT – Main Campus