Multimodal Devices: Learnings from Multisensory Perception and Plasticity
- Christopher Berger | Microsoft Research
We often think about perceptual experiences one modality at a time: we "see" this or we "hear" that. These distinctions imply that our perception of each sense is independent of the others. Research in human neuroscience, however, increasingly supports a multisensory view of human perception, whereby our perception of any one sensory modality is a product of simultaneous information streams from all the available senses. Discrepant information between two or more senses can lead to perceptual illusions and/or plastic changes in future perception in each sensory modality. In this talk, I will discuss how research in multisensory perception and multisensory plasticity can be used to provide more compelling and intuitive experiences for users as products become increasingly multimodal, particularly in the domain of virtual and augmented reality.
For a recent bio of Dr. Christopher Berger visit http://www.christophercberger.com.