GazeR: A Package for Processing Gaze Position and Pupil Size Data

Jason Geller, Matthew B. Winn, Tristian Mahr, Daniel Mirman

Research output: Contribution to journal › Article › peer-review

50 Scopus citations


Eye-tracking is widely used throughout the scientific community, from vision science and psycholinguistics to marketing and human-computer interaction. Surprisingly, there is little consistency and transparency in preprocessing steps, making replicability and reproducibility difficult. To increase replicability, reproducibility, and transparency, a package in R (a free and widely used statistical programming environment) called gazeR was created to read and preprocess two types of data: gaze position and pupil size. For gaze position data, gazeR has functions for reading in raw eye-tracking data, formatting it for analysis, converting from gaze coordinates to areas of interest, and binning and aggregating data. For data from pupillometry studies, the gazeR package has functions for reading in and merging multiple raw pupil data files, removing observations with too much missing data, eliminating artifacts, identifying and interpolating blinks, performing subtractive baseline correction, and binning and aggregating data. The package is open-source and freely available for download and installation. We provide step-by-step analyses of data from two tasks exemplifying the package's capabilities.
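To illustrate two of the preprocessing steps the abstract names, the sketch below shows subtractive baseline correction and time-binning of a pupil trace in base R. This is a minimal illustration of the general technique, not gazeR's actual API; the simulated data and all variable names are hypothetical.

```r
# Illustrative sketch (not gazeR's API): subtractive baseline correction
# and time-binning of a single simulated pupil trace, in base R.

set.seed(1)
# Hypothetical trial: 1000 ms sampled every 10 ms; baseline window = 0-200 ms
time  <- seq(0, 990, by = 10)
pupil <- 3000 + cumsum(rnorm(length(time), sd = 5))  # arbitrary-unit pupil size

# Subtractive baseline correction: subtract the mean pupil size in the
# pre-stimulus baseline window from every sample in the trial
baseline <- mean(pupil[time < 200])
pupil_bc <- pupil - baseline

# Binning: average the corrected trace within consecutive 100-ms bins
bin    <- floor(time / 100) * 100
binned <- tapply(pupil_bc, bin, mean)
```

After correction, the trace is expressed as change from baseline (so the baseline window averages to zero), and binning reduces the 100 samples to ten 100-ms means, which is a common preparation for growth-curve or time-course analyses.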

Original language: English (US)
Pages (from-to): 2232-2255
Number of pages: 24
Journal: Behavior Research Methods
Issue number: 5
State: Published - Oct 1 2020
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • General Psychology

Keywords

  • R
  • eye-tracking
  • open science
  • pupillometry
  • visual world paradigm

