

Workshop 2 sponsored by MELD (Multisensory Environments to study Longitudinal Development)


Title: Sound and gaze in three dimensions - speaker-based spatialized audio and eye movements in the real world

Organizers: Marcus R. Watson & Eduardo Villar Ortega

Topics: 3D speaker-based spatial audio (VBAP/Ambisonics), LSL synchronization, 3D eyetracking, multisensory

Description: Many experimenters need to present stimuli that resemble real-world objects and environments, and to record the resulting behaviours in all their glorious complexity. However, this flexibility comes with very steep learning curves. This workshop will introduce attendees to open-source methods for dealing with these challenges. On the stimulus side, we will cover 3D spatial audio; on the behaviour side, we will cover gaze to arbitrary objects.

We will review speaker-array-based methods of audio spatialization (VBAP and Ambisonics). After a brief introduction to the theory behind these methods, we will learn how to (a) select optimal speaker locations for a given room, (b) use speakers at these locations to present sounds from any location in the room, (c) objectively verify the fidelity of these presentations using multi-directional microphones, and (d) synchronize them with other lab equipment using Lab Streaming Layer (LSL).
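To give a flavour of step (b), here is a minimal sketch of the pairwise (2-D) VBAP gain computation: the source direction is expressed as a linear combination of the two nearest speaker direction vectors, and the resulting gains are power-normalised. The function name and the example speaker layout are illustrative, not taken from the workshop materials, which may use a full 3-D triplet formulation instead.

```python
import numpy as np

def vbap_2d_gains(source_az_deg, spk1_az_deg, spk2_az_deg):
    """Gains for panning a source between two speakers with 2-D VBAP.

    Angles are azimuths in degrees. Returns (g1, g2), power-normalised
    so that g1**2 + g2**2 == 1 (roughly constant perceived loudness).
    """
    def unit(az):
        a = np.radians(az)
        return np.array([np.cos(a), np.sin(a)])

    # Columns of L are the speaker direction vectors; solve L @ g = p,
    # where p is the desired source direction.
    L = np.column_stack([unit(spk1_az_deg), unit(spk2_az_deg)])
    g = np.linalg.solve(L, unit(source_az_deg))
    return g / np.linalg.norm(g)

# A source midway between speakers at +/-30 degrees gets equal gains.
g1, g2 = vbap_2d_gains(0, -30, 30)
```

In a full 3-D setup the same idea applies with triplets of speakers and 3x3 direction matrices, selecting the triplet whose gains are all non-negative.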

We will next review how to collect, process and analyze mobile eye-tracking data in real-world multisensory experiments, covering (a) preprocessing pipelines, including LSL-based synchronization, blink handling, and filtering of gaze signals; (b) gaze-to-stimulus mapping by projecting mobile eye-tracker coordinates onto experimental displays, with considerations for dynamic environments and participant movement; (c) area-of-interest (AOI) analysis using multiple definition strategies (circular, Voronoi, constrained Voronoi, and grid-based), with quantitative validation through spatial precision metrics such as dispersion ellipses; and (d) race-model analysis applied to both button-press reaction times and oculomotor latencies.
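As a sketch of step (d), race-model analysis typically tests Miller's inequality: the redundant-signals (e.g. audiovisual) reaction-time CDF should not exceed the sum of the unimodal CDFs if a race between independent channels explains the speed-up. The helper names and the toy latencies below are illustrative assumptions, not code from the workshop repository.

```python
import numpy as np

def ecdf(samples, t):
    """Empirical CDF of `samples`, evaluated at each time in `t`."""
    samples = np.sort(np.asarray(samples, dtype=float))
    return np.searchsorted(samples, t, side="right") / samples.size

def miller_violation(rt_audio, rt_visual, rt_av, t):
    """F_AV(t) - min(1, F_A(t) + F_V(t)) at each time in `t`.

    Positive values indicate a violation of the race-model bound,
    i.e. evidence for multisensory integration beyond statistical
    facilitation.
    """
    bound = np.minimum(1.0, ecdf(rt_audio, t) + ecdf(rt_visual, t))
    return ecdf(rt_av, t) - bound

# Toy latencies (ms): audiovisual responses far faster than either
# unimodal condition produce a clear violation at early time points.
v = miller_violation([300, 320, 340], [310, 330, 350],
                     [200, 210, 220], np.array([250.0]))
```

The same computation applies unchanged whether the latencies are button-press reaction times or oculomotor (saccade) latencies.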

Attendees will leave with skills and tools that should reduce the time from conceptualizing a dynamic, three-dimensional experiment to actually implementing one.


Dependencies: Unity Hub + Unity (download Unity Hub here: https://docs.unity.com/en-us/hub/install-hub, use it to install Unity 6000.4.1f1 or later); Python with an appropriate IDE (e.g. VS Code, PyCharm); GitHub repository: https://github.com/MSI-Consortium/IMRF_2026_SpatialAudio_Eyetracking
