Preprint

Inferring target locations from gaze data: A smartphone study

This article is a preprint and has not been certified by peer review.

Author(s) / Creator(s)

Mueller, Stefanie

Abstract / Description

Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that successively appeared at 30 positions spaced 10.0 × 9.0 mm apart. Two conditions were conducted: "hand-held" (phone in the subject's hand) and "mounted" (phone on a surface). Linear mixed models were fitted to examine whether gaze differed between targets, and t-tests on root-mean-squared errors were calculated to evaluate the deviation between gaze and targets. To decode target positions from the gaze data, we trained a classifier and assessed its performance for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true positions. The classifier's accuracy for the 30 locations varied considerably between subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
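The two analysis steps named in the abstract can be illustrated with a minimal sketch: computing the root-mean-squared error between gaze samples and a target position, and decoding the target from a gaze point. The paper does not specify the classifier type, so a simple nearest-centroid rule is used here as a stand-in; all coordinates below are invented for illustration and are not the study's data.

```python
import math

# Hypothetical data: true target positions (x, y) in mm, and recorded
# gaze samples per target. Values are illustrative only.
targets = {
    "t1": (0.0, 0.0),
    "t2": (10.0, 0.0),
    "t3": (0.0, 9.0),
}
gaze_samples = {
    "t1": [(0.5, -0.3), (0.2, 0.4)],
    "t2": [(9.6, 0.5), (10.3, -0.2)],
    "t3": [(0.4, 8.7), (-0.3, 9.2)],
}

def rmse(samples, target):
    """Root-mean-squared Euclidean error between gaze samples and a target."""
    sq = [(x - target[0]) ** 2 + (y - target[1]) ** 2 for x, y in samples]
    return math.sqrt(sum(sq) / len(sq))

def decode(point, centroids):
    """Nearest-centroid decoding: assign a gaze point to the closest target."""
    return min(centroids, key=lambda k: (point[0] - centroids[k][0]) ** 2
                                        + (point[1] - centroids[k][1]) ** 2)

for name, tgt in targets.items():
    print(name, "RMSE =", round(rmse(gaze_samples[name], tgt), 3))

print(decode((9.8, 0.1), targets))  # decodes to "t2"
```

In the study itself the classifier was trained and evaluated per subject and per condition, which is why accuracy could be reported as a range across subjects.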

Keyword(s)

fixations; mobile devices; accuracy; gaze positions

Persistent Identifier
https://doi.org/10.23668/psycharchives.2491

Date of first publication

2019

Is version of
http://dx.doi.org/10.1145/3314111.3319847

Citation

Mueller, S. (2019). Inferring target locations from gaze data: A smartphone study. Leibniz Institut für Psychologische Information und Dokumentation (ZPID). https://doi.org/10.23668/psycharchives.2491
  • Author(s) / Creator(s)
    Mueller, Stefanie
  • PsychArchives acquisition timestamp
    2019-06-19T06:14:18Z
  • Made available on
    2019-06-19T06:14:18Z
  • Date of first publication
    2019
  • Abstract / Description
    Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that successively appeared at 30 positions spaced 10.0 × 9.0 mm apart. Two conditions were conducted: "hand-held" (phone in the subject's hand) and "mounted" (phone on a surface). Linear mixed models were fitted to examine whether gaze differed between targets, and t-tests on root-mean-squared errors were calculated to evaluate the deviation between gaze and targets. To decode target positions from the gaze data, we trained a classifier and assessed its performance for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true positions. The classifier's accuracy for the 30 locations varied considerably between subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
    en_US
  • Publication status
    acceptedVersion
  • Review status
    peerReviewed
  • Citation
    Mueller, S. (2019). Inferring target locations from gaze data: A smartphone study. Leibniz Institut für Psychologische Information und Dokumentation (ZPID). https://doi.org/10.23668/psycharchives.2491
    en
  • Persistent Identifier
    https://hdl.handle.net/20.500.12034/2115
  • Persistent Identifier
    https://doi.org/10.23668/psycharchives.2491
  • Language of content
    eng
    en_US
  • Is version of
    http://dx.doi.org/10.1145/3314111.3319847
  • Is related to
    https://doi.org/10.23668/psycharchives.2500
  • Is related to
    https://doi.org/10.1145/3314111.3319847
  • Keyword(s)
    fixations
    en_US
  • Keyword(s)
    mobile devices
    en_US
  • Keyword(s)
    accuracy
    en_US
  • Keyword(s)
    gaze positions
    en_US
  • Dewey Decimal Classification number(s)
    150
  • Title
    Inferring target locations from gaze data: A smartphone study
    en_US
  • DRO type
    preprint
    en_US