To compare methods of displaying speech-recognition confidence in automatic captions, we analyzed eye-tracking and response data from deaf and hard-of-hearing participants who viewed captioned videos.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Department, Program, or Center: School of Information (GCCIS)
Rathbun, Kevin; Berke, Larwan; Caulfield, Christopher; Stinson, Michael; and Huenerfauth, Matt, "Eye Movements of Deaf and Hard of Hearing Viewers of Automatic Captions" (2017). Journal on Technology and Persons with Disabilities, 5, 130-140.
RIT – Main Campus