10 Works

Sampling rate influences saccade detection in mobile eye tracking of a reading task

Wahl, Siegfried; Institute For Ophthalmic Research, Eberhard Karls University Tuebingen, Leube, Alexander; Institute For Ophthalmic Research, Eberhard Karls University Tuebingen & Rifai, Katharina; Institute For Ophthalmic Research, Eberhard Karls University Tuebingen
The purpose of this study was to compare saccade detection characteristics in two mobile eye trackers with different sampling rates in a natural task. Gaze data of 11 participants were recorded with one 60 Hz and one 120 Hz mobile eye tracker and compared directly to the saccades detected by a 1000 Hz stationary tracker while a reading task was performed. Saccades and fixations were detected using a velocity-based algorithm and their properties analyzed...
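A minimal sketch of a velocity-threshold saccade detector of the kind referred to in the abstract; the 30 deg/s threshold, the degree-scaled input, and the synthetic 120 Hz signal are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def detect_saccades(x, y, t, velocity_threshold=30.0):
    """Flag samples as saccadic when angular velocity exceeds a threshold (deg/s).

    x, y : gaze position in degrees of visual angle
    t    : timestamps in seconds
    """
    dt = np.diff(t)
    velocity = np.hypot(np.diff(x), np.diff(y)) / dt        # sample-to-sample speed
    return np.concatenate([[False], velocity > velocity_threshold])

# Example: synthetic 120 Hz signal with one abrupt gaze shift at t = 0.5 s
t = np.arange(0, 1, 1 / 120)
x = np.where(t < 0.5, 0.0, 5.0) + np.random.normal(0, 0.05, t.size)
y = np.zeros_like(x)
print(detect_saccades(x, y, t).sum(), "samples flagged as saccadic")
```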

A Skeleton-based Approach to Analyze and Visualize Oculomotor Behavior when Viewing Animated Characters

Le Naour, Thibaut & Bresciani, Jean-Pierre; University Of Fribourg
Knowing what people look at and understanding how they analyze the dynamic gestures of their peers is an exciting challenge. In this context, we propose a new approach to quantify and visualize the oculomotor behavior of viewers watching the movements of animated characters in dynamic sequences. Using this approach, we were able to illustrate on a 'heat mesh' the gaze distribution of one or several viewers, i.e., the time spent on each part of the...
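A rough sketch of how viewing time might be accumulated per body part of an animated character to feed a 'heat mesh'; assigning each gaze sample to the nearest projected joint is an assumed simplification, not the paper's skeleton-based method, and all inputs are synthetic.

```python
import numpy as np

def gaze_time_per_part(gaze_xy, part_xy, part_names, dt):
    """Accumulate viewing time on each body part of an animated character.

    gaze_xy : (n_frames, 2) gaze positions on screen
    part_xy : (n_frames, n_parts, 2) projected 2D positions of skeleton joints
    dt      : duration of one frame in seconds
    """
    totals = {name: 0.0 for name in part_names}
    for frame, gaze in enumerate(gaze_xy):
        # assign the gaze sample to the closest projected joint in this frame
        dists = np.linalg.norm(part_xy[frame] - gaze, axis=1)
        totals[part_names[int(np.argmin(dists))]] += dt
    return totals

parts = ["head", "torso", "left_hand", "right_hand"]
gaze = np.random.rand(300, 2)
joints = np.random.rand(300, len(parts), 2)
print(gaze_time_per_part(gaze, joints, parts, dt=1 / 60))
```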

Does descriptive text change how people look at art? A novel analysis of eye-movements using data-driven Units of Interest

Reani, Manuele; University Of Manchester, Vigo, Markel; University Of Manchester, Grimes, Martin; Manchester City Galleries, Davies, Alan; University Of Manchester, Jay, Caroline; University Of Manchester, Gannaway, Clare; Manchester Art Gallery & Harper, Simon; University Of Manchester
Does reading a description of an artwork affect how a person subsequently views it? In a controlled study, we show that in most cases, textual description does not influence how people subsequently view paintings, contrary to participants’ self-report that they believed it did. To examine whether the description affected transition behaviour, we devised a novel analysis method that systematically determines Units of Interest (UOIs), and calculates transitions between these, to quantify the effect of an...
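An illustrative sketch of counting transitions between data-driven Units of Interest; k-means is used here only as a stand-in for the paper's own UOI-determination method, and the cluster count and random fixations are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def transition_matrix(fixations, n_units=5):
    """Cluster fixations into data-driven units and count transitions between them."""
    labels = KMeans(n_clusters=n_units, n_init=10).fit_predict(fixations)
    counts = np.zeros((n_units, n_units), dtype=int)
    for a, b in zip(labels[:-1], labels[1:]):
        if a != b:                      # count only moves between different units
            counts[a, b] += 1
    return counts

fixations = np.random.rand(200, 2)      # x, y fixation centres in screen coordinates
print(transition_matrix(fixations))
```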

Eye-tracking Analysis of Interactive 3D Geovisualization

Herman, Lukas; Masaryk University, Brno, Hejlova, Vendula; Palacky University, Olomouc & Popelka, Stanislav; Palacky University, Olomouc
This paper describes a new tool for analysing eye-tracking data recorded over interactive 3D models. This tool makes it easier to analyse interactive 3D models than time-consuming, frame-by-frame investigation of captured screen recordings with superimposed scanpaths. The main function of this tool, called 3DgazeR, is to calculate 3D coordinates (X, Y, Z coordinates of the 3D scene) for individual points of view. These 3D coordinates can be calculated from the values...
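A hypothetical sketch in the spirit of what 3DgazeR computes: cast a ray from the virtual camera through the 2D gaze point and intersect it with the scene, here simplified to the plane z = 0. The camera parameters and field of view are invented for the example and do not reflect the tool's actual interface.

```python
import numpy as np

def gaze_ray_plane_intersection(gaze_px, screen_wh, cam_pos, cam_forward, cam_up, fov_deg=60.0):
    """Cast a ray through the 2D gaze point and intersect it with the plane z = 0."""
    w, h = screen_wh
    aspect = w / h
    # normalized device coordinates of the gaze point
    nx = (2 * gaze_px[0] / w - 1) * np.tan(np.radians(fov_deg) / 2) * aspect
    ny = (1 - 2 * gaze_px[1] / h) * np.tan(np.radians(fov_deg) / 2)
    right = np.cross(cam_forward, cam_up)
    direction = cam_forward + nx * right + ny * cam_up
    direction /= np.linalg.norm(direction)
    if abs(direction[2]) < 1e-9:
        return None                       # ray parallel to the plane
    s = -cam_pos[2] / direction[2]
    return cam_pos + s * direction if s > 0 else None

hit = gaze_ray_plane_intersection(
    gaze_px=(960, 540), screen_wh=(1920, 1080),
    cam_pos=np.array([0.0, 0.0, 10.0]),
    cam_forward=np.array([0.0, 1.0, -1.0]) / np.sqrt(2),
    cam_up=np.array([0.0, 1.0, 1.0]) / np.sqrt(2),
)
print("3D gaze point:", hit)
```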

Scan path visualization and comparison using visual aggregation techniques

Hurter, Christophe; ENAC & Peysakhovich, Vsevolod; ISAE-Supaéro
We demonstrate the use of different visual aggregation techniques to obtain non-cluttered visual representations of scanpaths. First, fixation points are clustered using the mean-shift algorithm. Second, saccades are aggregated using the Attribute-Driven Edge Bundling (ADEB) algorithm, which handles saccade direction, onset timestamp, magnitude, or a combination of these for the edge compatibility criterion. Flow direction maps, computed during bundling, can be visualized separately (vertical or horizontal components) or as a single image using the Oriented Line...
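A small sketch of the first step described above, clustering fixation points with the mean-shift algorithm (here via scikit-learn); the synthetic fixations and bandwidth quantile are assumptions, and the ADEB bundling step itself is not shown.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

# fixation points (x, y) in pixels; two synthetic clusters stand in for a real recording
fixations = np.vstack([
    np.random.normal((300, 200), 15, (40, 2)),
    np.random.normal((900, 600), 15, (40, 2)),
])

bandwidth = estimate_bandwidth(fixations, quantile=0.3)
clustering = MeanShift(bandwidth=bandwidth).fit(fixations)

# each saccade (edge) would then connect the cluster centres of its endpoints,
# which is the input a subsequent edge-bundling step could aggregate further
print("cluster centres:\n", clustering.cluster_centers_)
```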

Cyclopean vs. Dominant Eye in Gaze-Interface-Tracking

Wagner, Michael; Ariel University, Botzer, Assaf; Ariel University & Elbaum, Tomer; Ariel University
User-centered design questions in gaze interfaces have been explored in a multitude of empirical investigations. Interestingly, the question of which eye should serve as the input device has never been studied. We compared tracking accuracy between the “cyclopean” eye (i.e., the midpoint between the eyes), the dominant eye, and the non-dominant eye. In two experiments, participants performed tracking tasks. In Experiment 1, participants did not use a crosshair. Results showed that mean distance from target was smaller with cyclopean than with dominant or non-dominant...
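A toy sketch of the accuracy comparison described above: the "cyclopean" signal is the midpoint of the two eyes' gaze points, and mean distance from the target is the accuracy measure. The synthetic signals and noise levels are invented, and eye-dominance assignment is not modelled.

```python
import numpy as np

def mean_distance_from_target(gaze, target):
    """Mean Euclidean distance between gaze samples and the moving target."""
    return np.mean(np.linalg.norm(gaze - target, axis=1))

# synthetic left/right eye signals following a moving target (pixels)
target = np.column_stack([np.linspace(0, 500, 600), np.linspace(0, 300, 600)])
left = target + np.random.normal(0, 20, target.shape)
right = target + np.random.normal(0, 20, target.shape)

cyclopean = (left + right) / 2        # midpoint between the two eyes' gaze points
for name, signal in [("left", left), ("right", right), ("cyclopean", cyclopean)]:
    print(name, round(mean_distance_from_target(signal, target), 1), "px")
```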

Eye tracking and visualization. Introduction to the Special Thematic Issue

Burch, Michael; Visualization Research Center University Of Stuttgart, Germany, Chuang, Lewis L.; Max Planck Institute For Biological Cybernetics, Tübingen, Germany, Groner, Rudolf; Journal Of Eye Movement Research; SCIANS Ltd, And University Of Bern, Duchowski, Andrew; Clemson University, USA & Weiskopf, Daniel; Visualization Research Center University Of Stuttgart, Germany
There is a growing interest in eye tracking technologies applied to support traditional visualization techniques like diagrams, charts, maps, or plots, whether static, animated, or interactive. More complex data analyses are required to derive knowledge and meaning from the data. Eye tracking systems serve that purpose in combination with biological and computer vision, cognition, perception, visualization, human-computer interaction, as well as usability and user experience research. The 10 articles collected in this thematic special issue...

Using simultaneous scanpath visualization to investigate the influence of visual behaviour on medical image interpretation

Davies, Alan Richard; University Of Manchester, Harper, Simon; University Of Manchester, Jay, Caroline; University Of Manchester & Vigo, Markel; University Of Manchester
In this paper, we explore how a number of novel methods for visualizing and analyzing differences in eye-tracking data, including scanpath length, Levenshtein distance, and visual transition frequency, can help to elucidate the methods clinicians use for interpreting 12-lead electrocardiograms (ECGs). Visualizing the differences between multiple participants’ scanpaths simultaneously allowed us to answer questions including: do clinicians fixate randomly on the ECG, or do they apply a systematic approach?; is there a relationship between interpretation...
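One of the measures named above, the Levenshtein distance between AOI-coded scanpath strings, sketched as a standard dynamic-programming edit distance; the example strings are invented.

```python
def levenshtein(a, b):
    """Edit distance between two scanpath strings (one letter per AOI visited)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# AOI-coded scanpaths of two hypothetical readers of the same 12-lead ECG
print(levenshtein("ABCCDEF", "ABDCEF"))   # -> 2
```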

Uncertainty visualization of gaze estimation to support operator-controlled calibration

Peysakhovich, Vsevolod; ISAE, Hassoumi, Almoctar; ENAC & Hurter, Christophe; ENAC
In this paper, we investigate how visualization assets can support the qualitative evaluation of gaze estimation uncertainty. Although eye tracking data are commonly available, little has been done to visually investigate the uncertainty of recorded gaze information. This paper tries to fill this gap by using innovative uncertainty computation and visualization. Given a gaze processing pipeline, we estimate the location of the gaze position in the world camera. To do so we developed our own...
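The snippet does not spell out the paper's own uncertainty computation, so here is a common stand-in rather than the authors' method: summarising the spread of repeated gaze estimates at one calibration target with a covariance ellipse that could then be drawn over the world-camera view. All inputs are synthetic.

```python
import numpy as np

def dispersion_ellipse(samples, n_std=2.0):
    """Centre, axis lengths, and orientation of a covariance ellipse summarising
    the spread (uncertainty) of gaze estimates at one calibration target."""
    centre = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = eigvals.argsort()[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    width, height = 2 * n_std * np.sqrt(eigvals)          # full axis lengths
    angle = np.degrees(np.arctan2(eigvecs[1, 0], eigvecs[0, 0]))
    return centre, width, height, angle

samples = np.random.multivariate_normal([640, 360], [[400, 120], [120, 200]], 200)
print(dispersion_ellipse(samples))
```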

Visual Multi-Metric Grouping of Eye-Tracking Data

Burch, Michael; Eindhoven University Of Technology, Netherlands, Netzel, Rudolf; University Of Stuttgart, Mueller, Klaus; Stony Brook University, New York, Weiskopf, Daniel; University Of Stuttgart & Kumar, Ayush; Stony Brook University, New York
We present an algorithmic and visual grouping of participants and eye-tracking metrics derived from recorded eye-tracking data. Our method utilizes two well-established visualization concepts. First, parallel coordinates are used to provide an overview of the metrics used, their interactions, and similarities, which helps select suitable metrics that describe characteristics of the eye-tracking data. Furthermore, parallel coordinates plots enable an analyst to test the effects of creating a combination of a subset of metrics resulting in...
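A minimal sketch of the parallel-coordinates overview described above, using pandas' built-in plot; the participants, metrics, and values are invented placeholders, not data from the paper.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# one row per participant, one column per eye-tracking metric (values invented)
df = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4"],
    "fixation_duration_ms": [210, 305, 250, 190],
    "saccade_amplitude_deg": [4.2, 3.1, 5.0, 4.8],
    "fixation_count": [120, 95, 140, 110],
})

# normalise each metric to [0, 1] so the axes are comparable
metrics = df.columns.drop("participant")
df[metrics] = (df[metrics] - df[metrics].min()) / (df[metrics].max() - df[metrics].min())

parallel_coordinates(df, "participant")
plt.show()
```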

Registration Year

  • 2018 (10)

Resource Types

  • Workflow (10)

Data Centers

  • BOP: JEMR (10)