On auditory-visual interaction in real and virtual environments

Bernhard U. Seeber* and Hugo Fastl
AG Technische Akustik, MMK, TU Muenchen, Arcisstr. 21, 80333 Muenchen, Germany
*now at: Auditory Perception Lab, UC Berkeley, Berkeley, CA 94720-1650, USA, ph.: 1-510-643-8408, fax: 1-510-642-5293

The ventriloquism effect describes an attracting influence of visual objects on perceived auditory directions. In the ventriloquist example, the spectators perceive the sound as coming from the mouth of the puppet, since its lips move in synchrony with the speech, although it is the ventriloquist who speaks. A new localization method utilizing a laser pointer was developed which allows for fast and accurate collection of localization responses. Because a trackball served as the input device, the experiments remained purely bimodal: interference from proprioceptive information could thus be decoupled from the auditory-visual interaction experiment. When simulating directions in auditory displays, the question arises how the auditory directional shift towards the visual object is affected by a reduction of auditory directional information, i.e. the omission of individualized auditory directional cues. Two opposite outcomes could occur: (1) the directional shift increases as the visual modality gains more weight against the less accurate and reliable, "weaker" auditory modality, or (2) the directional shift is reduced through a reduction of cognitive congruence between both objects. Auditory-visual interaction was therefore investigated in three different listening environments: (1) real, anechoic space, (2) virtual acoustics using individual head-related transfer functions (HRTFs), or (3) selected non-individual HRTFs. The subjects' task was to fixate visual objects while listening to auditory targets. Localization responses were collected on a trial-by-trial basis as aftereffects.
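Outcome (1) can be illustrated by the standard reliability-weighted cue-combination model, which is not part of the original study but captures the idea that a "weaker" (higher-variance) auditory estimate yields a larger shift towards the visual object; the symbols $\hat{s}$ and $\sigma^2$ below are illustrative:

$$\hat{s}_{AV} = w_V\,\hat{s}_V + w_A\,\hat{s}_A, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2}, \qquad
w_A = 1 - w_V$$

Here $\hat{s}_A$ and $\hat{s}_V$ are the unimodal auditory and visual direction estimates with variances $\sigma_A^2$ and $\sigma_V^2$. If removing individualized directional cues increases $\sigma_A^2$, the visual weight $w_V$ grows and the combined estimate $\hat{s}_{AV}$ is pulled further towards the visual direction.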
The study showed auditory directional shifts of up to 7 deg towards visual objects in both the real and the individualized virtual environment, and the shifts were statistically similar for these two environments. Smaller shifts were observed with selected non-individual HRTFs. As experimental conditions were identical in all environments except for the presentation of direction, the results suggest that auditory-visual interaction is dependent on the presentation of auditory directional cues.