This preconference workshop has two parts: a tutorial covering best practices in collecting and analyzing head-mounted eye tracking data with infant participants, and three talks that showcase the use of head-mounted eye tracking in infant research (see the schedule below).
Time | Speaker | Title |
---|---|---|
09:00-09:10 | Karen Adolph (New York University) | Introduction |
09:10-10:00 | John Franchak (UC Riverside) | Data Collection |
10:00-10:10 | Sara Schroer (UT Austin) | Data Quality |
10:10-10:30 | David Crandall | Analyzing egocentric video and gaze data using computer vision |
10:30-10:50 | Brianna Kaplan, Kelsey West, Justine Hoch, Karen Adolph | Head-mounted eye tracking reveals structure in children’s learning environments |
10:50-11:10 | Irina Castellanos and Derek Houston | Effects of auditory experience on parent-child interactions |
11:10-11:20 | Break | |
11:20-12:00 | Chen Yu (UT Austin) | Data Analysis |
The full video recording can be viewed here.
- Eye Tracking Accuracy Calculator - Uses point-of-gaze images to estimate the spatial offset, in degrees, of a participant's calibration
- ROI Coder - Matlab application for drawing regions of interest on video frames
- ET Tools - Matlab scripts for importing data and making heatmaps from Positive Science and Pupil Labs files
- timevp - Time series visualization and processing toolkit
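The accuracy calculator above estimates calibration error as an angular offset. As a rough illustration of the underlying idea (not the calculator's actual code), the pixel distance between the estimated gaze point and a known calibration target can be converted to degrees using the scene camera's field of view; the frame size and FOV values below are assumptions for the example:

```python
import math

def angular_offset_deg(gaze_px, target_px, fov_h_deg, fov_v_deg, frame_w, frame_h):
    """Approximate the angular offset between an estimated gaze point and a
    known calibration target, assuming a linear pixels-to-degrees mapping
    across the scene camera image (reasonable for small offsets)."""
    dx_deg = (gaze_px[0] - target_px[0]) * fov_h_deg / frame_w
    dy_deg = (gaze_px[1] - target_px[1]) * fov_v_deg / frame_h
    return math.hypot(dx_deg, dy_deg)

# Hypothetical example: 640x480 scene camera with an assumed
# 54.4 x 42.2 degree field of view; gaze lands 10 px right and
# 10 px below the calibration target.
offset = angular_offset_deg((330, 250), (320, 240), 54.4, 42.2, 640, 480)
print(f"Calibration offset: {offset:.2f} degrees")
```

Averaging this offset over several calibration points gives a simple summary of spatial accuracy for a session.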
- Franchak & Yu, 2022 - Chapter with best practices for collecting head-mounted eye tracking data with infants
- Franchak, 2020 - Chapter reviewing work using head-mounted eye tracking in infants and adults in naturalistic tasks
- Slone et al., 2018 - Video article showing step-by-step procedures for gathering head-mounted eye tracking data
- Franchak, 2017 - Primer on head-mounted eye tracking
- Franchak, Kretch, Soska, & Adolph, 2011 - Introduced the head-mounted eye tracking method with infants