Artificial Emotional Intelligence

Overview

Our goal is to create systems with artificial emotional intelligence (AEI) by teaching them to think, reason, and communicate more naturally. We are an early leader in using machine learning to classify and model multimodal emotional data streams as input for better human-computer interaction (HCI) and communication.

We bring together experts in deep learning, reinforcement learning, natural language processing (NLP), psychology, design, computational linguistics, and more. We work on a wide variety of problems, including emotional conversational agents and novel HCI systems. Building on one of the largest collections of multimodal emotion data and a rich platform of tools, we train deep neural networks using supervised, unsupervised, and reinforcement learning techniques. This is a unique opportunity to change the way systems learn to understand and communicate with humans more naturally, and hence more intelligently. Our work has the potential to transform products and services so that people experience them as having emotional intelligence.
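To make the idea of classifying multimodal emotional data streams concrete, here is a minimal, hypothetical sketch in PyTorch: two small per-modality encoders (acoustic features and text embeddings) fused by concatenation into a shared emotion classifier. The feature dimensions, emotion label set, and network sizes are illustrative assumptions, not the group's actual models or data.

```python
# Hypothetical sketch: late-fusion multimodal emotion classification in PyTorch.
# Feature dimensions, labels, and sizes are assumed for illustration only.
import torch
import torch.nn as nn

EMOTIONS = ["anger", "joy", "sadness", "neutral"]  # assumed label set


class MultimodalEmotionClassifier(nn.Module):
    def __init__(self, audio_dim=40, text_dim=300, hidden=64, n_classes=len(EMOTIONS)):
        super().__init__()
        # One small encoder per modality (e.g., acoustic features, text embeddings).
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        self.text_encoder = nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
        # Fuse the modality representations by concatenation, then classify.
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, audio_feats, text_feats):
        fused = torch.cat(
            [self.audio_encoder(audio_feats), self.text_encoder(text_feats)], dim=-1
        )
        return self.classifier(fused)  # logits over emotion classes


# Toy usage with random tensors standing in for real multimodal features.
model = MultimodalEmotionClassifier()
audio = torch.randn(8, 40)   # batch of 8 acoustic feature vectors
text = torch.randn(8, 300)   # batch of 8 text embeddings
logits = model(audio, text)
print(logits.argmax(dim=-1))  # predicted emotion indices
```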

People

Kael Rowan

Principal Research Software Development Engineer (RSDE)

Microsoft Research Podcast

Getting good VIBEs from your computer with Dr. Mary Czerwinski

Episode 20, April 18, 2018 – In a world where humans are increasingly interacting with AI systems, Dr. Mary Czerwinski, Principal Researcher and Research Manager at Microsoft Research, believes emotions may be fundamental to our interactions with machines. And through her team’s work in affective computing, the quest to bring Artificial Emotional Intelligence – or AEI – to our computers may be closer than we think.

April 2018

Microsoft Research Blog

When Psychology Meets Technology with Dr. Daniel McDuff

Episode 17, March 28, 2018 – Dr. McDuff talks about why we need computers to understand us, outlines the pros and cons of designing emotionally sentient agents, explains the technology behind CardioLens, a pair of augmented reality glasses that can take your heart rate by looking at your face, and addresses the challenges of maintaining trust and privacy when we’re surrounded by devices that want to know not just what we’re doing, but how we’re feeling.

March 2018

Microsoft Research Blog