Portrait of Dan Morris

Dan Morris

Principal Researcher and Deliverer of Rock

About

I’m a Principal Researcher in the Microsoft Medical Devices Group.

My interests include:

Sensors and machine learning for health and fitness
Computer support for music and creativity
Sensors and machine learning for input systems
Making medical records more intuitive to patients

Before coming to Microsoft, my research focused on:

Haptics and physical simulation for medical applications
Neural prosthetics and brain-computer interfaces

And I was part of the team that made Songsmith, which lets musical novices get a taste of songwriting, just by singing, and provides songwriters with an “intelligent scratchpad”. It’s totally sweet. Try it.

My personal page lives here.  I also blog about nerdy chord theory stuff here.

 

Projects

Enhancing Input On and Above the Interactive Surface with Muscle Sensing

Current interactive surfaces provide little or no information about which fingers are touching the surface, the amount of pressure exerted, or gestures that occur when not in contact with the surface. These limitations constrain the interaction vocabulary available to interactive surface systems. In our work, we extend the surface interaction space by using muscle sensing to provide complementary information about finger movement and posture. In this paper, we describe a novel system that combines muscle…

Muscle-Computer Interfaces (muCIs)

Many human-computer interaction technologies are currently mediated by physical transducers such as mice, keyboards, pens, dials, and touch-sensitive surfaces. While these transducers have enabled powerful interaction paradigms and leverage our human expertise in interacting with physical objects, they tether computation to a physical artifact that has to be within reach of the user. As computing and displays begin to integrate more seamlessly into our environment and are used in situations where the user is not…

Humantenna: Sensing Gestures Using the Body as an Antenna

Computer vision and inertial measurement have made it possible for people to interact with computers using whole-body gestures. Although there has been rapid growth in the uses and applications of these systems, their ubiquity has been limited by the high cost of heavily instrumenting either the environment or the user. In this paper, we use the human body as an antenna for sensing whole-body gestures. Such an approach requires no instrumentation to the environment, and…

AirWave: Non-Contact Haptic Feedback Using Air Vortex Rings

Input modalities such as speech and gesture allow users to interact with computers without holding or touching a physical device, thus enabling at-a-distance interaction. It remains an open problem, however, to incorporate haptic feedback into such interaction. In this work, we explore the use of air vortex rings for this purpose. Unlike standard jets of air, which are turbulent and dissipate quickly, vortex rings can be focused to travel several meters and impart perceptible feedback.…

Skinput: Appropriating the Body as an Input Surface

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always available, naturally portable, and on-body finger input system. We assess…

SoundWave: Using the Doppler Effect to Sense Gestures

Gesture is becoming an increasingly popular means of interacting with computers. However, it is still relatively costly to deploy robust gesture recognition sensors in existing mobile platforms. We present SoundWave, a technique that leverages the speaker and microphone already embedded in most commodity devices to sense in-air gestures around the device. To do this, we generate an inaudible tone, which gets frequency-shifted when it reflects off moving objects like the hand. We measure this shift…
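
As a rough illustration of the underlying idea (a minimal sketch, not the SoundWave implementation), the snippet below computes the Doppler shift expected from a moving hand and flags motion when microphone energy appears away from the emitted pilot tone. The 18 kHz tone, 44.1 kHz sample rate, and thresholds are illustrative assumptions, not values from the project.

```python
# Minimal sketch of Doppler-based motion sensing: emit a pilot tone, then look
# for spectral energy that has shifted away from the tone's frequency bin.
# Tone frequency, sample rate, and thresholds below are illustrative assumptions.
import numpy as np

TONE_HZ = 18_000        # assumed inaudible pilot tone
SAMPLE_RATE = 44_100    # assumed audio sample rate
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def expected_shift_hz(hand_velocity_mps: float) -> float:
    """Doppler shift of the tone after reflecting off a hand moving at v m/s."""
    return 2.0 * hand_velocity_mps * TONE_HZ / SPEED_OF_SOUND

def detect_motion(mic_frame: np.ndarray, min_shift_hz: float = 50.0) -> bool:
    """Return True if spectral energy appears shifted away from the pilot tone."""
    spectrum = np.abs(np.fft.rfft(mic_frame * np.hanning(len(mic_frame))))
    freqs = np.fft.rfftfreq(len(mic_frame), d=1.0 / SAMPLE_RATE)
    tone_bin = int(np.argmin(np.abs(freqs - TONE_HZ)))
    bin_hz = freqs[1] - freqs[0]
    guard = int(np.ceil(min_shift_hz / bin_hz))
    window = 10 * guard
    # Energy right at the tone vs. energy in nearby (shifted) bins.
    tone_energy = spectrum[tone_bin - guard:tone_bin + guard + 1].sum()
    shifted = (spectrum[tone_bin - window:tone_bin - guard].sum()
               + spectrum[tone_bin + guard + 1:tone_bin + window + 1].sum())
    return shifted > 0.1 * tone_energy  # illustrative threshold

# Example: a hand moving toward the device at 0.5 m/s shifts the tone by ~52 Hz.
print(round(expected_shift_hz(0.5), 1))
```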

Always-Available Mobile Interfaces

We have continually evolved computing to not only be more efficient, but also more accessible, more of the time (and place), and to more people. We have progressed from batch computing with punch cards, to interactive command line systems, to mouse-based graphical user interfaces, and more recently to mobile computing. Each of these paradigm shifts has drastically changed the way we use technology for work and life, often in unpredictable and profound ways. With the…

SearchBar: A Search-Centric Web History

Current user interfaces for Web search, including browsers and search engine sites, typically treat search as a transient activity. However, people often conduct complex, multi-query investigations that may span long durations and may be interrupted by other tasks. In this paper, we first present the results of a survey of users’ search habits, which show that many search tasks span long periods of time. We then introduce SearchBar, a system for proactively and persistently storing…

ClassSearch: A Classroom Environment for Teaching Web Search Skills

We explore the use of social learning — improving knowledge skills by observing peer behavior — in the domain of Web search skill acquisition, focusing specifically on co-located classroom scenarios. Through a series of interviews, pilot studies, and classroom deployments, we conclude that a peripheral display of Web search activity within a classroom facilitates both social learning and teacher-led discourse. We present the ClassSearch system for shared awareness of Web search activity, which embodies principles…

SuperBreak: Using Interactivity to Enhance Ergonomic Typing Breaks

Repetitive strain injuries and ergonomics concerns have become increasingly significant health issues as a growing number of individuals frequently use computers for long periods of time. Currently, limited software mechanisms exist for managing ergonomics; the most well-known are "break-reminder" packages that schedule and recommend typing breaks. Yet despite the proven benefits of taking breaks, such packages are rarely adopted due to the overhead of introducing periodic interruptions into a user’s workflow. In this paper, we…

Computational Tools for Music

Work in this area seeks to use computational tools to enable musical creativity, in particular to give novices a variety of new approaches to experience musical creativity. Signal processing and machine learning systems are combined with insights from traditional music creation processes to develop new tools and new paradigms for music. Projects include Songsmith, which generates musical accompaniment to match a singer’s voice: just choose a musical style, sing into your PC’s microphone, and Songsmith will…

User-Specific Training for Vocal Melody Transcription

Overview: This page contains supplementary material for our AAAI 2010 paper, “User-Specific Learning for Recognizing a Singer’s Intended Pitch”. The full citation for our paper follows, along with a link to the paper itself: Guillory A., Basu S., and Morris D. User-Specific Learning for Recognizing a Singer’s Intended Pitch. Proceedings of AAAI 2010, July 2010. For more information about this work, contact Dan Morris (dan@microsoft.com) and Sumit Basu (sumitb@microsoft.com). Abstract: We consider the problem of…

Data-Driven Exploration of Musical Chord Sequences

We present data-driven methods for supporting musical creativity by capturing the statistics of a musical database. Specifically, we introduce a system that supports users in exploring the high-dimensional space of musical chord sequences by parameterizing the variation among chord sequences in popular music. We provide a novel user interface that exposes these learned parameters as control axes, and we propose two automatic approaches for defining these axes. One approach is based on a novel clustering…
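
As a rough sketch of how control axes can be derived from a corpus (illustrative only; the paper describes a clustering-based approach, which differs), the snippet below represents each chord sequence by its chord-transition counts and uses PCA to expose the dominant directions of variation. The chord vocabulary and toy corpus are made up for the example.

```python
# Illustrative stand-in for deriving "control axes" from a chord-sequence corpus:
# PCA over chord-transition counts. Not the paper's method; the corpus is a toy.
import numpy as np

CHORDS = ["C", "Dm", "Em", "F", "G", "Am"]
INDEX = {c: i for i, c in enumerate(CHORDS)}

def transition_features(sequence):
    """Flattened matrix of chord-to-chord transition counts for one sequence."""
    counts = np.zeros((len(CHORDS), len(CHORDS)))
    for a, b in zip(sequence, sequence[1:]):
        counts[INDEX[a], INDEX[b]] += 1
    return counts.ravel()

corpus = [
    ["C", "G", "Am", "F", "C", "G", "Am", "F"],
    ["C", "Am", "F", "G", "C", "Am", "F", "G"],
    ["Am", "F", "C", "G", "Am", "F", "C", "G"],
    ["C", "F", "G", "C", "C", "F", "G", "C"],
]
X = np.array([transition_features(s) for s in corpus])
X = X - X.mean(axis=0)

# Principal components of the corpus become candidate control axes: each axis
# captures one dominant way the sequences vary.
_, _, components = np.linalg.svd(X, full_matrices=False)
axis_1, axis_2 = components[0], components[1]

# Position of each sequence along the first control axis.
print(np.round(X @ axis_1, 2))
```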

MySong: Automatic Accompaniment for Vocal Melodies

Want to give MySong a try? MySong is now Songsmith; you can download a free trial (or the full version, if you’re a teacher who wants to use it in your classroom) at http://research.microsoft.com/songsmith. Happy Songsmith’ing! Like to write music? Most folks never get a chance to answer this question, since writing music takes years of experience... if you don’t play an instrument or spend lots of time around music, you’ll probably never get to…

Dynamic Mapping of Physical Controls for Tabletop Groupware

Multi-touch interactions are a promising means of control for interactive tabletops. However, a lack of precision and tactile feedback makes multi-touch controls a poor fit for tasks where precision and feedback are crucial. We present an approach that offers precise control and tactile feedback for tabletop systems through the integration of dynamically re-mappable physical controllers with the multi-touch environment, and we demonstrate this approach in our collaborative tabletop audio editing environment. An observational user study…

Workout: Using a Wearable Sensor to Find, Recognize, and Count Repetitive Exercises

Although numerous devices exist to track and share exercise routines based on running and walking, these devices offer limited functionality for strength-training exercises. We introduce a system for automatically tracking repetitive exercises – such as weight training and calisthenics – via an arm-worn inertial sensor. Our goal is to provide real-time and post-workout feedback, with no user-specific training and no intervention during a workout. Toward this end, we address three challenges: (1) Segmenting exercise from…
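
As a toy illustration of the counting step (a minimal sketch, not the paper's segmentation and recognition pipeline), the snippet below smooths one accelerometer axis and counts prominent peaks, with a refractory period so each repetition is counted once. The sample rate, smoothing window, and thresholds are illustrative assumptions.

```python
# Minimal sketch of repetition counting from an arm-worn accelerometer:
# smooth one axis, then count prominent peaks at most once per min_rep_s.
# Sample rate, smoothing window, and thresholds are illustrative assumptions.
import numpy as np

def count_reps(accel_axis: np.ndarray, sample_rate_hz: float = 50.0,
               smooth_s: float = 0.5, min_rep_s: float = 1.0) -> int:
    """Count peaks in a smoothed, mean-removed acceleration signal."""
    win = max(1, int(smooth_s * sample_rate_hz))
    kernel = np.ones(win) / win
    smoothed = np.convolve(accel_axis - accel_axis.mean(), kernel, mode="same")
    threshold = 0.5 * smoothed.std()           # illustrative prominence threshold
    min_gap = int(min_rep_s * sample_rate_hz)  # refractory period between reps
    reps, last_peak = 0, -min_gap
    for i in range(1, len(smoothed) - 1):
        is_peak = (smoothed[i] > threshold and smoothed[i] >= smoothed[i - 1]
                   and smoothed[i] > smoothed[i + 1])
        if is_peak and i - last_peak >= min_gap:
            reps += 1
            last_peak = i
    return reps

# Example: a synthetic 10-repetition signal at ~0.5 reps/second.
t = np.arange(0, 20, 1 / 50.0)
print(count_reps(np.sin(2 * np.pi * 0.5 * t)))  # -> 10
```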

Patient-Friendly Medical Information Displays

Patients’ basic understanding of clinical events has been shown to dramatically improve patient care. Unfortunately, patients are frequently under-informed and unclear about their own hospital/clinical courses. The recent emergence of Electronic Medical Records (EMRs) and Personal Health Records (PHRs) makes vast amounts of data available to patients, but does little to help patients understand that data. Our work focuses on designing and building simplified information displays that will help patients understand their medical treatment and…

Publications

2016

2015

2014

2013

2012

2011

2010

2009

2008

2007

2006

The CHAI Libraries
F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, Dan Morris, L. Sentis, E. Vileshin, J. Warren, O. Khatib, K. Salisbury, in Proceedings of Eurohaptics 2003

2005

2004

2003

2000
