Title: Inferring target locations from gaze data: A smartphone study
Authors: Mueller, Stefanie
Issue Date: 2019
Abstract: Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that successively appeared at 30 positions spaced 10.0 × 9.0 mm apart. A "hand-held" condition (phone in the subject's hand) and a "mounted" condition (phone on a surface) were conducted. Linear mixed models were fitted to examine whether gaze differed between targets, and t-tests on root-mean-squared errors were calculated to evaluate the deviation between gaze and target positions. To decode target positions from gaze data, we trained a classifier and assessed its performance for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true target positions. Classification accuracy for the 30 locations varied considerably between subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
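As a rough illustration of the decoding approach described in the abstract (this is a hypothetical sketch, not the study's code or data), assigning a gaze sample to one of 30 grid targets can be framed as nearest-centroid classification, with RMSE measuring the deviation between gaze and target positions. All numbers below (noise level, samples per target) are invented for the example; only the 30-position grid and its 10.0 × 9.0 mm spacing come from the abstract.

```python
# Hypothetical sketch: nearest-centroid decoding of 30 target positions
# from simulated noisy gaze samples, plus RMSE of gaze vs. target.
import math
import random

random.seed(0)

# 30 targets on a 5 x 6 grid, spaced 10.0 mm (x) by 9.0 mm (y), as in the study.
targets = [(col * 10.0, row * 9.0) for row in range(6) for col in range(5)]

def simulate_gaze(target, n=20, noise_mm=2.0):
    """Simulate n gaze samples around a target with Gaussian noise (assumed level)."""
    return [(random.gauss(target[0], noise_mm), random.gauss(target[1], noise_mm))
            for _ in range(n)]

# "Training": the mean gaze position (centroid) per target.
train = {i: simulate_gaze(t) for i, t in enumerate(targets)}
centroids = {i: (sum(x for x, _ in s) / len(s), sum(y for _, y in s) / len(s))
             for i, s in train.items()}

def classify(gaze):
    """Assign a gaze sample to the target with the nearest centroid."""
    return min(centroids, key=lambda i: math.dist(gaze, centroids[i]))

# Evaluate on fresh simulated samples: accuracy and gaze-to-target RMSE.
correct, sq_err, n_total = 0, 0.0, 0
for i, t in enumerate(targets):
    for g in simulate_gaze(t, n=10):
        correct += classify(g) == i
        sq_err += (g[0] - t[0]) ** 2 + (g[1] - t[1]) ** 2
        n_total += 1

accuracy = correct / n_total
rmse = math.sqrt(sq_err / n_total)
print(f"accuracy={accuracy:.2f}, rmse={rmse:.2f} mm")
```

With the assumed 2 mm gaze noise the grid spacing dominates and accuracy is high; the wide per-subject range reported in the study (8 to 100%) would correspond to much larger or subject-specific noise and calibration error than this toy simulation assumes.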
Citation: Mueller, S. (2019). Inferring target locations from gaze data: A smartphone study. Leibniz Institut für Psychologische Information und Dokumentation (ZPID).
Appears in Collections:Preprint

Files in This Item:
ETRA_Mueller_2019.pdf (555.52 kB, Adobe PDF)

This item is licensed under a Creative Commons License.