Behav Res Methods. 2017 Aug;49(4):1310-1322. doi: 10.3758/s13428-016-0806-1.

Real-time sharing of gaze data between multiple eye trackers – evaluation, tools, and advice.


Marcus Nyström, Diederick C Niehorster, Tim Cornelissen, Henrik Garde

Affiliations

  1. Humanities Lab, Lund University, Helgonabacken 12, Lund, SE, 22362, Sweden. [email protected].
  2. Humanities Lab, Lund University, Helgonabacken 12, Lund, SE, 22362, Sweden.
  3. Scene Grammar Lab, Goethe University Frankfurt, Theodor-W.-Adorno-Platz 6, Frankfurt am Main, DE, 60323, Germany.

PMID: 27743316 PMCID: PMC5541105 DOI: 10.3758/s13428-016-0806-1

Abstract

Technological advancements in combination with significant reductions in price have made it practically feasible to run experiments with multiple eye trackers. This enables new types of experiments with simultaneous recordings of eye movement data from several participants, which is of interest to researchers in, e.g., social and educational psychology. The Lund University Humanities Laboratory recently acquired 25 remote eye trackers, which are connected over a local wireless network. As a first step toward running experiments with this setup, demanding situations with real-time sharing of gaze data were investigated in terms of network performance as well as clock and screen synchronization. Results show that data can be shared with sufficiently low packet loss (0.1%) and latency (M = 3 ms, MAD = 2 ms) across 8 eye trackers at a rate of 60 Hz. For similar performance using 24 computers, the send rate needs to be reduced to 20 Hz. To help researchers conduct similar measurements on their own multi-eye-tracker setup, open-source software written in Python and PsychoPy is provided. Part of the software contains a minimal working example to help researchers kick-start experiments with two or more eye trackers.
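To illustrate the kind of measurement the abstract describes, the sketch below is a minimal, hypothetical example (not the published tools): one peer broadcasts timestamped "gaze" samples over UDP at a fixed rate, and another peer receives them and estimates packet loss and latency. It assumes the machines' clocks are already synchronized (e.g., via NTP or a clock-sync step like the one evaluated in the paper); the port number, sample count, and placeholder gaze coordinates are illustrative only.

```python
# Minimal sketch of sharing timestamped gaze samples over UDP and measuring
# packet loss and latency. Assumes synchronized clocks across machines;
# all names and constants here are illustrative, not from the published code.
import json
import socket
import time

PORT = 9090          # illustrative port
RATE_HZ = 60         # send rate examined in the paper (60 Hz case)

def send_gaze(dest_ip, n_samples=600):
    """Send n_samples fake gaze packets at RATE_HZ to dest_ip."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(n_samples):
        packet = json.dumps({
            "seq": seq,                 # sequence number for loss detection
            "t_sent": time.time(),      # sender timestamp (assumes synced clocks)
            "x": 0.5, "y": 0.5,         # placeholder gaze coordinates
        }).encode()
        sock.sendto(packet, (dest_ip, PORT))
        time.sleep(1.0 / RATE_HZ)

def receive_gaze(expected=600, timeout_s=15.0):
    """Collect packets and report packet loss and median latency."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    sock.settimeout(timeout_s)
    latencies, seen = [], set()
    try:
        while len(seen) < expected:
            data, _ = sock.recvfrom(4096)
            msg = json.loads(data.decode())
            seen.add(msg["seq"])
            latencies.append(time.time() - msg["t_sent"])
    except socket.timeout:
        pass
    loss = 1.0 - len(seen) / expected
    latencies.sort()
    median_ms = 1000 * latencies[len(latencies) // 2] if latencies else float("nan")
    print(f"packet loss: {loss:.1%}, median latency: {median_ms:.1f} ms")
```

In a setup like the one described, receive_gaze() would run on one machine and send_gaze("<receiver-ip>") on each sender; any residual clock offset between machines inflates the latency estimate, which is why the paper treats clock synchronization as part of the evaluation.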

Keywords: Digital classroom; Eye tracking; Shared gaze

