iCatcher: A neural network approach for automated coding of young children's eye movements

Yotam Erel, Christine E. Potter, Sagi Jaffe-Dax, Casey Lew-Williams, Amit H. Bermano

Research output: Contribution to journal › Article › peer-review

Abstract

Infants' looking behaviors are often used to measure attention, real-time processing, and learning, frequently from low-resolution videos. Despite the ubiquity of gaze-related methods in developmental science, current analysis techniques usually involve laborious post hoc coding, imprecise real-time coding, or expensive eye trackers that may increase data loss and require a calibration phase. As an alternative, we propose using computer vision methods to perform automatic gaze estimation from low-resolution videos. At the core of our approach is a neural network that classifies gaze directions in real time. We compared our method, called iCatcher, to manually annotated videos from a prior study in which infants looked at one of two pictures on a screen. We demonstrated that the accuracy of iCatcher approximates that of human annotators and that it replicates the prior study's results. Our method is publicly available as an open-source repository at https://github.com/yoterel/iCatcher.
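The abstract describes a per-frame pipeline: a neural network receives video frames of the infant's face and classifies the gaze direction in real time. The following is a minimal, hypothetical sketch of that idea, not the actual iCatcher model; the class labels, the linear classifier standing in for the network, and all parameter shapes are illustrative assumptions for a two-picture (left/right) paradigm with an "away" class.

```python
import numpy as np

# Assumed gaze classes for a two-picture looking paradigm (illustrative only).
GAZE_CLASSES = ["left", "right", "away"]

def softmax(z):
    """Numerically stable softmax over class scores."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def classify_frame(face_crop, weights, bias):
    """Linear stand-in for the neural network: flatten a grayscale
    face crop, score each gaze class, and return the argmax label
    with the class probabilities."""
    x = face_crop.astype(np.float64).ravel() / 255.0  # normalize pixels to [0, 1]
    probs = softmax(weights @ x + bias)
    return GAZE_CLASSES[int(np.argmax(probs))], probs

# Toy usage: random weights applied to a fake 32x32 face crop.
rng = np.random.default_rng(0)
weights = rng.normal(size=(len(GAZE_CLASSES), 32 * 32))
bias = np.zeros(len(GAZE_CLASSES))
frame = rng.integers(0, 256, size=(32, 32))
label, probs = classify_frame(frame, weights, bias)
```

In the real system a trained convolutional network would replace the linear scorer, but the input/output contract (face crop in, per-class gaze probabilities out, one prediction per frame) is what enables frame-by-frame coding of low-resolution videos.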

Original language: English (US)
Journal: Infancy
State: Accepted/In press - 2022

All Science Journal Classification (ASJC) codes

  • Pediatrics, Perinatology, and Child Health
  • Developmental and Educational Psychology
