Our vision is to weave together a virtual harness for vehicles using low-cost sensing devices that communicate with a cloud-based controller. Our current effort is based on a dashboard-mounted smartphone and an OBD-II scanner for sensing, with an Azure-based backend for aggregating, analyzing, and visualizing data. The figure below depicts the architecture of HAMS.
The sensors used by HAMS include the smartphone's front camera, which looks at the driver; the back camera, which looks out at the road ahead; the phone's GPS and inertial sensors; and the OBD-II scanner, which reports on the vehicle's health. From the raw data obtained from these sensors, we have built detectors for various events of interest, such as driver distraction and fatigue, as well as for vehicle ranging, i.e., determining whether a safe separation distance is being maintained from the vehicle in front. A key research question is how to leverage both local and cloud-based computing resources to design detectors that are both effective and efficient.
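To make the local-versus-cloud trade-off concrete, here is a minimal sketch of one way such a detector could be structured: a cheap heuristic runs on the phone for every frame, and only frames that trip the heuristic are escalated to a heavier cloud-side model. All names, fields, and thresholds below (`Frame`, `eye_closure`, the 0.6 cutoff, `cloud_verify`) are illustrative assumptions, not the actual HAMS implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Frame:
    """A single sensor sample (hypothetical schema, for illustration only)."""
    timestamp: float
    eye_closure: float   # assumed fraction of eye closure from the driver-facing camera, 0..1
    speed_kmph: float    # e.g., from GPS or the OBD-II scanner

def local_fatigue_check(frame: Frame, threshold: float = 0.6) -> bool:
    """Cheap on-phone heuristic: flag possible fatigue when eye closure is high.

    The threshold is an arbitrary illustrative value.
    """
    return frame.eye_closure > threshold

def detect_fatigue(frame: Frame, cloud_verify: Callable[[Frame], bool]) -> Optional[str]:
    """Run the cheap local check first; escalate to the cloud only on a local hit.

    `cloud_verify` stands in for a heavier server-side model; in the common
    case the local check rejects the frame and nothing is uploaded.
    """
    if not local_fatigue_check(frame):
        return None                      # common case: no upload, no cloud cost
    return "fatigue" if cloud_verify(frame) else None

# Usage with a stub cloud model (purely illustrative):
alert = detect_fatigue(Frame(0.0, 0.8, 20.0), cloud_verify=lambda f: f.speed_kmph > 5)
```

This two-tier pattern keeps per-frame cost on the device low while reserving expensive computation and bandwidth for the rare frames that actually look suspicious; the real design space (what to run locally, what to ship to the cloud, and when) is exactly the research question above.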
We envisage a “fleet” model for the deployment of HAMS, where a fleet can range from, on the one hand, hundreds or thousands of cabs, buses, or trucks overseen by a supervisor to, on the other hand, a one-car fleet, say, with a parent monitoring their teenaged child’s driving.
We have deployed an initial version of HAMS on a dozen office cabs at Microsoft Research India in Bengaluru and have gathered over 10,000 km of data over a few months.