Multimodal Analytics for Automated Assessment
Published in Duanli Yan, André A. Rupp, Peter W. Foltz, Handbook of Automated Scoring, 2020
The psychological foundation of multimodality traces back to principles advocated by Gestalt psychologists over a century ago, who sought simple principles to explain how people perceive meaning in a complex percept (i.e., stimulus). For example, the key Gestalt principle of totality, which posits that the whole is qualitatively different from the sum of its parts, supports the idea that we integrate percepts from different modalities (e.g., linguistic and paralinguistic) into a coherent percept that goes beyond any unimodal experience. Consistent with this, neuroscience suggests that multisensory integration, or how information from different senses is combined by the nervous system, is so fundamental to cognition that it is observable even at the level of individual neurons (Meredith & Stein, 1983).
Driver vehicle interfaces and older adults
Published in Carryl L. Baldwin, Bridget A. Lewis, Pamela M. Greenwood, Designing Transportation Systems for Older Adults, 2019
Beyond ensuring that an alert is detectable by an older individual who has differential impairment in at least one sensory modality, multimodal alerts may have particular benefit in speeding response time. As discussed in Chapter 3, age-related slowing of information processing is ubiquitous. However, multisensory integration is enhanced in older adults. This means that older adults benefit more than their younger counterparts when alerts or signals are presented in both visual and auditory modalities concurrently. For example, Laurienti and colleagues (2006) found that presenting stimuli in multiple modalities speeded response time in both younger and older adults, but the performance gain was particularly striking for older participants. In fact, despite being slower than younger participants to respond when stimuli were presented in either the visual or the auditory modality alone, when stimuli were presented bimodally (both visual and auditory simultaneously), older adults responded as quickly as younger participants did in their best single modality.
Multisensory integration effect of humanoid robot appearance and voice on users’ affective preference and visual attention
Published in Behaviour & Information Technology, 2022
Mingming Li, Fu Guo, Chen Fang, Fengxiang Li
Multisensory integration is vital to individual perception and action and has a wide range of influences, including behavioural outcomes, psychological feelings, and physiological arousal (Klasen, Chen, and Mathiak 2012; Cornelio, Velasco, and Obrist 2021). Multisensory integration produces two specific effects: the facilitation effect and the attenuation effect (Klasen, Chen, and Mathiak 2012; Shen and Sengupta 2014). The facilitation effect occurs when information from different senses is congruent and complementary, yielding more fluent processing and more accurate responses. Conversely, the attenuation effect typically occurs when information from different senses conflicts, impeding processing and delaying responses. The multisensory integration effect has been observed in numerous areas such as emotional recognition (Pan et al. 2017), spatial attention orientation (Berthoz and Viaud-Delmon 1999), food experience (Spence and Shankar 2010), marketing promotion (Yang et al. 2022), music presentation (Lee, Latchoumane, and Jeong 2017), and product design (Özcan, Cupchik, and Schifferstein 2017).
User Representations in Human-Computer Interaction
Published in Human–Computer Interaction, 2021
Sofia Seinfeld, Tiare Feuchtner, Antonella Maselli, Jörg Müller
Besides enhancing the detection and perceptual accuracy of real events perceived through multiple sensory channels, multisensory integration is at the root of several perceptual illusions. Well-known examples can be found in speech perception, where the visual aspects of speech cues have a significant impact on the corresponding heard speech. One example is the ventriloquist effect, in which the location of a heard sound is misperceived toward the location of the visually perceived speech (Alais & Burr, 2004; Burr & Alais, 2006). In the McGurk effect, by contrast, a phoneme conveyed visually through lip movements, paired with a different simultaneously heard phoneme, produces the perceptual illusion of a third sound distinct from both. Research has shown that these effects can be explained in terms of probabilistically optimal integration mechanisms following Bayes' Rule (Chen & Spence, 2017; Körding & Wolpert, 2004).
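The optimal-integration account mentioned above can be illustrated with the standard maximum-likelihood cue-combination model: each sensory cue is treated as a Gaussian estimate, and the integrated percept is their inverse-variance-weighted average. The sketch below is a minimal illustration, not taken from the cited papers; the function name and parameter values are assumptions chosen to mimic a ventriloquist-style scenario where vision is precise and audition is noisy.

```python
def integrate_cues(mu_v, var_v, mu_a, var_a):
    """Combine a visual and an auditory location estimate optimally.

    Each cue is modeled as a Gaussian likelihood over spatial position.
    The Bayes-optimal combined estimate is the inverse-variance-weighted
    average; its variance is lower than that of either cue alone, which
    is why bimodal estimates are more precise.
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)  # weight on the visual cue
    w_a = 1 - w_v                                # weight on the auditory cue
    mu = w_v * mu_v + w_a * mu_a                 # combined location estimate
    var = 1 / (1 / var_v + 1 / var_a)            # combined uncertainty
    return mu, var

# Illustrative ventriloquist scenario: the visual cue (the puppet's mouth)
# is precise, the auditory cue is noisy, so the perceived sound location
# is pulled strongly toward the visual source.
mu, var = integrate_cues(mu_v=0.0, var_v=1.0, mu_a=10.0, var_a=9.0)
# mu = 1.0 (close to the visual source), var = 0.9 (less than either cue)
```

When the visual cue's reliability is degraded (e.g., blurred vision), the same formula predicts the weighting shifts toward audition, which matches the empirical finding that the "capture" direction follows whichever modality is more reliable.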