Behav Res Methods. 2026 Apr 21;58(5):131. doi: 10.3758/s13428-026-02972-8.
ABSTRACT
Eye-tracking with head-mounted systems in natural, dynamic settings presents unique challenges for accurately detecting gaze shifts (saccades). Unlike controlled, screen-based scenarios, mobile eye-tracking involves constant shifts of the visual scene with respect to the head due to head and body movements, complicating gaze event classification. This paper systematically evaluates several threshold- and machine-learning-based gaze-shift detection algorithms on a manually labeled dataset collected from participants walking freely outdoors. Our findings indicate that conventional threshold-based methods, despite their sensitivity to parameter settings, outperform contemporary pre-trained machine-learning methods when applied to real-world dynamic conditions without retraining. However, we also demonstrate that although machine-learning-based methods perform poorly on unseen dynamic data, their performance improves substantially when they are retrained on data that closely matches the testing conditions. Moreover, we introduce a novel probabilistic approach, the Ranking method, which integrates both eye movement and visual scene information and achieves performance comparable to inter-annotator agreement, outperforming previous methods. We also report gaze behaviors observed during manual annotation that do not fit the classical gaze event categories, highlighting why classifying gaze events in unconstrained natural scenarios is more complex than in screen-based tasks. Our work offers insights into how gaze event classification in real-world environments could be improved in the future.
PMID:42014634 | PMC:PMC13099851 | DOI:10.3758/s13428-026-02972-8
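For readers unfamiliar with the threshold-based family of detectors the abstract refers to, the sketch below shows a standard velocity-threshold (I-VT) saccade detector of the kind commonly used as a baseline. It is a minimal illustration of the general technique, not the paper's implementation; the function name, the 30 deg/s default threshold, and the small-angle velocity computation are all assumptions chosen for clarity.

```python
import numpy as np

def detect_saccades_ivt(gaze_deg, timestamps, velocity_threshold=30.0):
    """Label each gaze sample as saccade (True) or non-saccade (False)
    using a simple velocity threshold (I-VT).

    gaze_deg           : (N, 2) array of gaze angles in degrees
                         (e.g., azimuth, elevation)
    timestamps         : (N,) array of sample times in seconds
    velocity_threshold : angular speed in deg/s above which a sample is
                         labeled as saccadic (30 deg/s is a common
                         textbook default; real thresholds are tuned)
    """
    # Time between consecutive samples.
    dt = np.diff(timestamps)
    # Angular displacement between consecutive samples
    # (small-angle approximation of the true angular distance).
    disp = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    speed = disp / dt  # deg/s
    labels = np.zeros(len(gaze_deg), dtype=bool)
    # The first sample has no preceding sample, so it stays non-saccadic.
    labels[1:] = speed > velocity_threshold
    return labels
```

Such detectors hinge on the threshold choice, which is the parameter sensitivity noted above; in head-free recordings, head motion adds to the measured gaze velocity, which is one reason purely eye-based thresholds degrade in unconstrained outdoor conditions.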