Audiovisual Interaction
Published in Nick Zacharov, Sensory Evaluation of Sound, 2018
Dominik Strohmeier, Satu Jumisko-Pyykkö
Multimodal perception is more than just the sum of quality sensations from two or more independent streams. The integration of two or more sensory channels into one experience is far more complex than a simple sum of independent sensory events, as the different modalities complement and modify the final perceptual experience (Shimojo and Shams, 2001; Hands, 2004). The McGurk effect is a well-known example of audiovisual interaction in which auditory and visual information are integrated into a new, unified audiovisual percept (McGurk and MacDonald, 1976). It shows that the parallel processing of auditory and visual information is not independent across the two modalities.
User Representations in Human-Computer Interaction
Published in Human–Computer Interaction, 2021
Sofia Seinfeld, Tiare Feuchtner, Antonella Maselli, Jörg Müller
Besides enhancing the detection and perceptual accuracy of real events perceived through multiple sensory channels, multisensory integration is at the root of several perceptual illusions. Well-known examples can be found in speech perception, where the visual aspects of speech cues have a significant impact on the corresponding heard speech. One example is the ventriloquist effect, in which the location of a heard sound is misperceived as shifted toward the location of the visually perceived speaker (Alais & Burr, 2004; Burr & Alais, 2006). In the McGurk effect, by contrast, visual speech information obtained through lip reading, presented together with a different simultaneously heard sound, results in the perceptual illusion of a third sound. Research has shown that these effects can be explained in terms of probabilistically optimal integration mechanisms following Bayes' rule (Chen & Spence, 2017; Körding & Wolpert, 2004).
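A compact way to see what "probabilistically optimal" means here is the standard maximum-likelihood cue-combination model. As a sketch, assuming independent Gaussian noise on the unimodal estimates and a flat prior (notation introduced here for illustration: \(\hat{s}_A\) and \(\hat{s}_V\) are the auditory and visual location estimates, \(\sigma_A^2\) and \(\sigma_V^2\) their variances), the optimal bimodal estimate is a reliability-weighted average:
\[
\hat{s}_{AV} = w_A\,\hat{s}_A + w_V\,\hat{s}_V,
\qquad
w_A = \frac{1/\sigma_A^{2}}{1/\sigma_A^{2} + 1/\sigma_V^{2}},
\qquad
w_V = \frac{1/\sigma_V^{2}}{1/\sigma_A^{2} + 1/\sigma_V^{2}},
\]
\[
\sigma_{AV}^{2} = \frac{\sigma_A^{2}\,\sigma_V^{2}}{\sigma_A^{2} + \sigma_V^{2}} \;\le\; \min\!\left(\sigma_A^{2},\, \sigma_V^{2}\right).
\]
Because spatial localization is typically far more precise in vision than in audition (\(\sigma_V \ll \sigma_A\)), \(w_V\) approaches 1 and the combined percept is pulled toward the visual location, which is the bias observed in the ventriloquist effect.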