HAMS: Harnessing AutoMobiles for Safety

Established: May 1, 2015

Overview

Context

Road safety is a major public health issue, accounting for an estimated 1.25 million fatalities, and many more injuries, worldwide each year, placing it among the top 10 causes of death. Middle-income and especially low-income countries bear a disproportionate burden of road accidents and fatalities. For instance, estimates of road fatalities in India range from one every 4 minutes to almost a quarter of a million, or 20% of the world’s total, each year. Besides the heavy human cost, road accidents also impose a significant economic cost. It is therefore no surprise that the problem has attracted attention at the highest levels of government, including from Prime Minister Modi himself during a radio address in 2015.

The major factors affecting safety, namely vehicles, roads, and people, see little or no ongoing monitoring today, especially in countries such as India. It is our thesis that improving road conditions, vehicle health and, most importantly, driver discipline would help boost road safety. Indeed, the leading causes of road accidents include speeding, drunk driving, and driver distraction, all of which can be mitigated through better driver discipline. The key to bringing about an improvement is effective monitoring that leads to actionable feedback.

HAMS Overview

Our vision is to weave together a virtual harness for vehicles using low-cost sensing devices that communicate with a cloud-based controller. Our current effort is based on a dashboard-mounted smartphone and an OBD-II scanner for sensing, with an Azure-based backend for aggregating, analyzing, and visualizing data. The figure below depicts the architecture of HAMS.

[Figure: HAMS architecture]
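
To make the sensing-to-cloud data flow concrete, here is a minimal Python sketch of how a phone-side client might bundle one sample of GPS, inertial, and OBD-II readings and push it to a cloud backend. The record fields, function names, and ingestion endpoint are illustrative assumptions, not the actual HAMS schema or API.

    # Illustrative sketch only: field names and the endpoint below are assumptions,
    # not the actual HAMS schema or Azure backend API.
    import json
    import time
    import urllib.request

    def make_telemetry_record(trip_id, gps_fix, accel, obd_snapshot):
        """Bundle one sample of phone and OBD-II readings for upload."""
        return {
            "trip_id": trip_id,
            "timestamp": time.time(),
            "gps": gps_fix,            # e.g. {"lat": ..., "lon": ..., "speed_kmph": ...}
            "accelerometer": accel,    # e.g. {"x": ..., "y": ..., "z": ...}
            "obd": obd_snapshot,       # e.g. {"rpm": ..., "coolant_temp_c": ...}
        }

    def upload(record, endpoint="https://example-hams-backend.azurewebsites.net/ingest"):
        """POST one JSON record to a (hypothetical) cloud ingestion endpoint."""
        data = json.dumps(record).encode("utf-8")
        req = urllib.request.Request(endpoint, data=data,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200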

The sensors used by HAMS include the smartphone’s front camera, which looks at the driver; the back camera, which looks out at the road ahead of the vehicle; the phone’s GPS and inertial sensors; and the OBD-II scanner, which provides information on the vehicle’s health. Based on the raw data obtained from these sensors, we have built detectors for various events of interest, including driver distraction and fatigue, as well as for vehicle ranging, to determine whether a safe separation distance is being maintained. A key research question is how to leverage both local and cloud-based computing resources to design detectors that are both effective and efficient.
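
As one illustration of this local/cloud split, the hypothetical sketch below screens camera frames with a cheap on-phone check and forwards only the suspicious frames to a heavier cloud-side classifier. The detector callables, the gaze score attribute, and the thresholds are assumptions made for illustration; they are not the HAMS detectors themselves.

    # Illustrative sketch only: a hypothetical distraction detector that splits work
    # between the phone (cheap per-frame screening) and the cloud (heavier model).

    def local_screen(frame, face_detector, gaze_threshold=0.4):
        """Cheap on-phone check: flag frames where the driver's gaze seems off the road."""
        face = face_detector(frame)     # lightweight face/landmark model running on the phone
        if face is None:
            return True                 # no face found: worth a closer look
        return face.gaze_off_road_score > gaze_threshold

    def detect_distraction(frames, face_detector, cloud_classifier):
        """Run the cheap filter locally; send only suspicious frames to the cloud."""
        events = []
        for i, frame in enumerate(frames):
            if local_screen(frame, face_detector):
                # The heavier cloud-side model confirms or rejects the candidate event.
                label, confidence = cloud_classifier(frame)
                if label == "distracted" and confidence > 0.8:
                    events.append({"frame_index": i, "confidence": confidence})
        return events

Filtering on the phone keeps bandwidth and cloud costs down, at the price of possibly missing events the cheap check does not flag, which is exactly the effectiveness-versus-efficiency trade-off noted above.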

We envisage a “fleet” model for the deployment of HAMS, where the fleet can, on the one hand, comprise hundreds or thousands of cabs, buses, or trucks overseen by a supervisor, or, on the other hand, be a one-car fleet, say with a parent monitoring their teenaged child’s driving.

We have deployed an initial version of HAMS on a dozen office cabs at Microsoft Research India in Bengaluru and have gathered over 10,000 km of data in the span of a few months.

Videos

We are actively working on refining and augmenting the system. Here are some screenshots and videos of the tools we have built, showing various aspects of the system:

Screenshot: Detection of driver talking on the phone:

[Image: hams-visualizationdemo-talkingonphone]

Screenshot: Detection of driver not wearing a seatbelt:

[Image: hams-visualizationdemo-noseatbelt]

Video: HAMS Visualization Demo

Screenshot: Camera-based vehicle ranging:

[Images: hams-vehicleranging-1, hams-vehicleranging-2]

Video: HAMS Vehicle Ranging Demo

People

Interns and collaborators:

Vivek Yenamandra (Ohio State University, intern, summer 2015)

Amod Agarwal (IIIT Delhi, intern, summer 2016)

Ravi Bhandari (IIT Bombay, intern, summer 2016)

Shibsankar Das (IISc, intern, summer 2016)

Puneeth Meruva (MIT, intern, summer 2016)

Deepak Mahendrakar (PESIT, part-time intern, autumn 2016)

Abhishek V (PESIT, part-time intern, autumn 2016)

Stay tuned for updates in the coming months!
