Abstract

The study of human vision can include our interaction with objects. Such studies span behavior modeling, understanding visual attention and motor guidance, and enhancing user experiences, but they share a common bottleneck: to analyze the data in detail, researchers typically must review video data frame by frame. Real-world interaction data often comprises recordings from both the eye and the hand, and analyzing it frame by frame is tedious and time-consuming. A calibrated scene video from an eye tracker captured at 120 Hz for 3 minutes contains over 21,000 frames to analyze.
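
The frame count follows directly from the capture rate; a quick back-of-the-envelope calculation (a minimal Python sketch with illustrative variable names) makes the scale of the manual task concrete:

    frame_rate_hz = 120            # scene-camera capture rate from the abstract
    duration_s = 3 * 60            # 3-minute recording
    total_frames = frame_rate_hz * duration_s
    print(total_frames)            # 21600 frames to review by hand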

Automating this process is crucial for interaction research to proceed at scale. Advances in object recognition over the last decade now allow eye-movement data to be analyzed automatically to determine what a subject is looking at and for how long. I describe my research, in which I developed a pipeline to help researchers analyze interaction data from both eye and hand. Inspired by a semi-automated pipeline for analyzing eye-tracking data, I created a pipeline for analyzing hand grasp along with gaze. Together, the two pipelines help researchers analyze complete interaction data.

The hand-grasp pipeline detects skin to locate the hands, then determines what object (if any) the hand is over and where the thumb and fingers occlude that object. I also compare identification with recognition throughout the pipeline. The current pipeline operates on independent frames; future work will extend it to take advantage of the dynamics of natural interactions.
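
As a rough illustration of the first stage, skin detection can be implemented as a color-space threshold followed by blob extraction. The sketch below is a minimal Python/OpenCV (4.x) example assuming a commonly cited YCrCb skin-color range; the actual detector, thresholds, and blob-size cutoff used in the thesis may differ.

    import cv2
    import numpy as np

    def detect_skin(frame_bgr):
        """Locate candidate hand regions via a YCrCb skin-color threshold.

        Returns a binary skin mask and bounding boxes of large skin blobs.
        """
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        # Commonly cited YCrCb skin range; real thresholds would be tuned per dataset.
        lower = np.array([0, 133, 77], dtype=np.uint8)
        upper = np.array([255, 173, 127], dtype=np.uint8)
        mask = cv2.inRange(ycrcb, lower, upper)
        # Morphological cleanup: remove speckle, then close small holes.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Keep only blobs large enough to plausibly be a hand (threshold assumed).
        boxes = [cv2.boundingRect(c) for c in contours
                 if cv2.contourArea(c) > 2000]
        return mask, boxes

The resulting boxes would then feed the later stages described above: deciding which object, if any, each hand overlaps, and where the digits occlude it.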

Library of Congress Subject Headings

Eye tracking--Data processing; Vision--Research

Publication Date

8-2-2019

Document Type

Thesis

Student Type

Graduate

Degree Name

Imaging Science (MS)

Department, Program, or Center

Chester F. Carlson Center for Imaging Science (COS)

Advisor

Jeff Pelz

Advisor/Committee Member

Gabriel Diaz

Advisor/Committee Member

Guoyu Lu

Campus

RIT – Main Campus

Plan Codes

IMGS-MS
