Eye tracking technology is advancing swiftly, and many areas of research have begun to take advantage of it. Existing eye trackers project gaze onto a 2D plane, whether the display of a head-mounted virtual reality (VR) helmet or an image of the real-world scene the user is in. This makes it easy to analyze what a viewer is looking at, but limits the range of gaze behaviors that can be classified from such a signal. In contrast, a system that represents head movement in the same space as gaze velocity allows researchers to classify more advanced gaze behaviors, such as smooth pursuits and fixations that result from the vestibulo-ocular reflex. For this work, head and gaze movements were recorded in real-world environments across a variety of tasks. The resulting data was used to construct a distribution of naturally occurring gaze behaviors, which in turn drove a VR data collection experiment designed to elicit specific gaze behaviors, such as fixations and saccades with particular velocities and directions. A dataset of 12 subjects was collected while they played a shooting game in the virtual world. The data was analyzed to determine whether the intended eye movements were produced and to compare the eye movements that occur under fast versus slow presentation of targets.
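To make the core idea concrete, the following is a minimal Python sketch, not the thesis's actual pipeline, of velocity-threshold gaze-event classification once eye and head signals share a common space. The thresholds, function names, and the assumption that gaze and head directions arrive as unit vectors in world coordinates are all illustrative, not taken from the work itself.

    import numpy as np

    # Illustrative thresholds (deg/s), not values from the thesis.
    SACCADE_MIN = 70.0    # gaze-in-world speed above this -> saccade
    FIXATION_MAX = 5.0    # gaze-in-world speed below this -> fixation
    HEAD_MOVING = 10.0    # head speed above this during a fixation -> VOR

    def angular_speed(directions, timestamps):
        """Angular speed (deg/s) between consecutive unit direction vectors."""
        dots = np.clip(np.sum(directions[:-1] * directions[1:], axis=1), -1.0, 1.0)
        dtheta = np.degrees(np.arccos(dots))
        return dtheta / np.diff(timestamps)

    def classify(gaze_world_dirs, head_forward_dirs, timestamps):
        """Label each inter-sample interval using gaze-in-world and head speed.

        Because gaze is already expressed in world coordinates, a stable gaze
        direction during a moving head is distinguishable as a VOR fixation,
        which a 2D screen-plane signal alone cannot reveal.
        """
        gaze_speed = angular_speed(gaze_world_dirs, timestamps)
        head_speed = angular_speed(head_forward_dirs, timestamps)
        labels = []
        for g, h in zip(gaze_speed, head_speed):
            if g >= SACCADE_MIN:
                labels.append("saccade")
            elif g <= FIXATION_MAX:
                labels.append("VOR fixation" if h >= HEAD_MOVING else "fixation")
            else:
                labels.append("smooth pursuit")
        return labels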

Publication Date


Document Type


Student Type


Degree Name

Computer Science (MS)

Department, Program, or Center

Computer Science (GCCIS)


Advisor

Reynold Bailey

Advisor/Committee Member

Gabriel Diaz

Advisor/Committee Member

Joe Geigel


Campus

RIT – Main Campus