{"id":320399,"date":"2016-11-12T07:23:49","date_gmt":"2016-11-12T15:23:49","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=320399"},"modified":"2022-07-18T06:11:34","modified_gmt":"2022-07-18T13:11:34","slug":"hams","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/hams\/","title":{"rendered":"HAMS: Harnessing AutoMobiles for Safety"},"content":{"rendered":"<h3 style=\"text-align: center\">New &#8211; <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/aka.ms\/hams-dashboard\" target=\"_blank\" rel=\"noopener noreferrer\">HAMS Automated License Testing Dashboard<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> is Live!<\/h3>\n<h3>Context<\/h3>\n<p>Road safety is a major public health issue, accounting for an estimated <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/en.wikipedia.org\/wiki\/List_of_countries_by_traffic-related_death_rate\" target=\"_blank\" rel=\"noopener noreferrer\">1.35 million fatalities<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, and many more injuries, the world over, each year, placing it among the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.who.int\/mediacentre\/factsheets\/fs310\/en\/\" target=\"_blank\" rel=\"noopener noreferrer\">top 10 causes of death<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. Middle-income and particularly low-income countries bear a disproportionate burden of road accidents and fatalities. 
For instance, the estimates of road fatalities in India range from <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.thehindu.com\/news\/national\/road-accidents-claim-1-life-every-4-minutes-in-india\/article7747148.ece\" target=\"_blank\" rel=\"noopener noreferrer\">one every 4 minutes<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> to almost <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.who.int\/violence_injury_prevention\/road_traffic\/countrywork\/ind\/en\/\" target=\"_blank\" rel=\"noopener noreferrer\">a quarter of a million, or 20% of the world\u2019s total<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, each year. Besides the heavy human cost, road accidents also impose a significant economic cost. So it is no surprise that the problem has attracted attention at the highest levels of the government, including from <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.thehindu.com\/news\/national\/mann-ki-baat-pm-speaks-on-road-safety-social-issues\/article7466865.ece\" target=\"_blank\" rel=\"noopener noreferrer\">Prime Minister Modi himself during a radio address in 2015<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n<p>The major factors impacting safety \u2014 vehicles, roads, and drivers \u2014 see little or no ongoing monitoring today, especially in countries such as India. It is our thesis that improving road conditions, vehicle health and, most importantly, driver discipline would help boost road safety. 
Indeed, among <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.who.int\/mediacentre\/factsheets\/fs358\/en\/\" target=\"_blank\" rel=\"noopener noreferrer\">the leading causes of road accidents<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> are such factors as speeding, drunk driving, and driver distractions, all of which can be mitigated through better driver discipline.<\/p>\n<h3>HAMS overview<\/h3>\n<p>In the Harnessing AutoMobiles for Safety, or HAMS, project, we use low-cost sensing devices to construct a <i>virtual harness<\/i> for vehicles. The goal is to monitor the state of the driver and how the vehicle is being driven in the context of a road environment that the vehicle is in. We believe that effective monitoring leading to actionable feedback is key to promoting road safety.<\/p>\n<div id=\"attachment_618591\" style=\"width: 382px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-618591\" class=\"wp-image-618591 \" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS_overview-300x176.png\" alt=\"\" width=\"372\" height=\"219\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS_overview-300x176.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS_overview-768x450.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS_overview-1024x600.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS_overview.png 1387w\" sizes=\"auto, (max-width: 372px) 100vw, 372px\" \/><p id=\"caption-attachment-618591\" class=\"wp-caption-text\">Smartphone setup in HAMS<\/p><\/div>\n<p style=\"text-align: left\">The sensing device employed in HAMS is an off-the-shelf smartphone. 
The smartphone is mounted on the windshield, with its front camera facing the driver and the rear camera looking out to the front. The key to the operation of HAMS is the use of multiple sensors simultaneously. For example, when a sharp braking event is detected (using the smartphone\u2019s accelerometer), the distance to the vehicle in front is checked (using the rear camera), along with indications of driver distraction or fatigue (using the front camera). Such sensing and detection in tandem helps provide a holistic and accurate picture of how the vehicle is being driven, enabling appropriate feedback to then be generated.<\/p>\n<p>The research challenges we address in HAMS pertain to effective detection and monitoring in challenging settings. Some of these challenges arise because HAMS is retrofitted onto legacy vehicles, and so it must contend with variation in vehicle configuration, driver seating, and even the smartphone mounting. Other challenges arise because of our goal to be broadly applicable, including in regions where we cannot count on well-marked, fixed-width lanes, for instance, to perform vehicle ranging. We also address the challenge of efficient operation on a smartphone with modest resources, for instance, by combining accurate deep learning models with less expensive traditional computer vision techniques.<\/p>\n<p>As part of the project, we have also explored several use cases for HAMS. One of the earliest we prototyped was a fleet management dashboard, which allowed a supervisor to view safety-related incidents of interest offline. 
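As a rough illustration of the inertial side of this multi-sensor pipeline, sharp braking can be flagged with a simple threshold on longitudinal acceleration (a minimal sketch; the threshold value is an illustrative assumption, not HAMS's actual parameter):

```python
def hard_braking_events(accel_long_mps2, threshold_mps2=-3.0):
    """Return indices of accelerometer samples flagged as sharp braking.

    accel_long_mps2: longitudinal acceleration samples in m/s^2
                     (negative values indicate deceleration).
    threshold_mps2:  deceleration cutoff; -3.0 is an illustrative guess.
    """
    return [i for i, a in enumerate(accel_long_mps2) if a <= threshold_mps2]

# A flagged sample would then trigger the camera-based checks described
# above, e.g. headway to the vehicle in front and driver distraction.
events = hard_braking_events([-0.5, -4.2, 0.3, -3.0, -1.1])  # → [1, 3]
```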
We have also piloted HAMS in the context of driver training, in collaboration with the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/idtr.in\/\" target=\"_blank\" rel=\"noopener noreferrer\">Institute of Driving and Traffic Research (IDTR)<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, run by Maruti-Suzuki, the largest passenger car manufacturer in India.<\/p>\n<p><strong>More recently, we have been working with several State Transport Departments on using HAMS to automate the driver license test<\/strong>. <em>See <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/hams\/#!automated-driver-license-testing\">Automated License Testing<\/a> for more details.<\/em><\/p>\n<p><iframe loading=\"lazy\" title=\"HAMS: Smartphone-based Driver License Testing Automation\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube-nocookie.com\/embed\/XtgpWXM5Hfg?feature=oembed&rel=0\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>&nbsp;<\/p>\n<p><em>Visit our other tabs at the top of this page for more information.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In the Harnessing AutoMobiles for Safety (HAMS) project, we use low-cost sensing devices to construct a virtual harness for vehicles. The goal is to monitor the state of the driver and how the vehicle is being driven in the context of a road environment that the vehicle is in. 
We believe that effective monitoring leading to actionable feedback is key to promoting road safety.<\/p>\n","protected":false},"featured_media":618282,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556,13547],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-320399","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-systems-and-networking","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[589213,551334,551316,551304,589186,615204,618660,689649,689655,702715,754768,976119],"related-downloads":[],"related-videos":[451071,616737,619272],"related-groups":[144725,144784,602169],"related-events":[],"related-opportunities":[],"related-posts":[1127448],"related-articles":[],"tab-content":[{"id":0,"name":"Research","content":"We now outline key research aspects of HAMS\r\n\r\n[accordion]\r\n[panel header=\"FarSight: Smartphone-based Vehicle Ranging\"]\r\n\r\nStudent: Aditya Virmani (Research Fellow, 2017-18)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nMaintaining an adequate separation with respect to the vehicle in front is key to safe driving. Indeed, the two-second rule mandates maintaining a separation of at least 2 seconds of travel distance at the current speed of the vehicle. While technologies such as Radar and Lidar enable vehicle ranging, these are not available in legacy vehicles and would also be expensive to retrofit.\r\n\r\nIn FarSight, vehicle ranging is performed using just the rear camera of a windshield-mounted smartphone. 
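The geometry involved is essentially the pinhole-camera relation: apparent width in the image shrinks linearly with distance. A minimal sketch, assuming a known typical width per vehicle class and a known focal length in pixels (all numeric values below are illustrative assumptions, not FarSight's calibrated parameters):

```python
# Illustrative per-class widths in metres (assumed values for the sketch).
TYPICAL_WIDTH_M = {"autorickshaw": 1.4, "sedan": 1.8, "bus": 2.5}

def estimate_range_m(vehicle_class, bbox_width_px, focal_length_px):
    """Pinhole-camera ranging: distance = real_width * focal_length / pixel_width."""
    return TYPICAL_WIDTH_M[vehicle_class] * focal_length_px / bbox_width_px

def two_second_gap_m(speed_mps):
    """Minimum separation mandated by the two-second rule at the current speed."""
    return 2.0 * speed_mps

# A sedan whose bounding box is 90 px wide, with an assumed 1000 px focal length:
gap = estimate_range_m("sedan", 90, 1000)   # 20.0 m
safe = two_second_gap_m(15.0)               # 30.0 m needed at 54 km/h
```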
By identifying the class of vehicle in front (e.g., autorickshaw vs. sedan vs. bus) and a bounding box around it, FarSight uses simple trigonometry to estimate the range based on the approximate width for vehicles in the identified class.\r\n\r\n[caption id=\"attachment_618267\" align=\"alignnone\" width=\"653\"]<img class=\"wp-image-618267 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/FarSight_vehicleDetect.png\" alt=\"\" width=\"653\" height=\"434\" \/> Heterogeneous vehicle identification in FarSight[\/caption]\r\n\r\nIdentifying a tight bounding box around the vehicle in front is a key task. To ensure efficiency while maintaining accuracy, FarSight switches adaptively between accurate DNN-based detection and computationally cheaper key-point tracking.\r\n\r\n&nbsp;\r\n\r\nFor more information, please see the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/farsight-a-smartphone-based-vehicle-ranging-system\/\">ACM Ubicomp 2019 paper on FarSight<\/a>.\r\n\r\n[\/panel]\r\n[panel header=\"DeepLane: Computer Vision based Lane Detection\"]\r\n\r\nStudent: Ravi Bhandari (PhD candidate at IIT Bombay; intern during summer 2016)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>, <a href=\"https:\/\/www.cse.iitb.ac.in\/~br\/webpage\/\">Bhaskaran Raman (IIT Bombay)<\/a>\r\n\r\nCurrent smartphone-based navigation applications fail to provide lane-level information due to poor GPS accuracy. Detecting and tracking a vehicle\u2019s lane position on the road assists in lane-level navigation. 
For instance, it would be important to know whether a vehicle is in the correct lane for safely making a turn, perhaps even alerting the driver in advance if it is not, or whether the vehicle\u2019s speed is compliant with a lane-specific speed limit.\r\n\r\nDeepLane leverages the back camera of a windshield-mounted smartphone to provide an accurate estimate of the vehicle\u2019s current lane. We employ a deep learning-based technique to classify the vehicle\u2019s lane position. DeepLane does not depend on infrastructure support and works even when there are no lane markings, a characteristic of many roads in developing regions. Our analysis shows that DeepLane has an accuracy of over 90% in determining the vehicle\u2019s lane position.\r\n\r\n[video width=\"1280\" height=\"720\" mp4=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Deep.mp4\" autoplay=\"true\"][\/video]\r\n\r\n[video width=\"1280\" height=\"720\" mp4=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Deeplane_Abroad.mp4\" autoplay=\"true\"][\/video]\r\n\r\nFor more information, please see the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/deeplane-camera-assisted-gps-for-driving-lane-detection\/\">ACM BuildSys 2018 paper on DeepLane<\/a>.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"FullStop: Tracking Unsafe Stopping Behaviour of Buses\"]\r\n\r\nStudent: Ravi Bhandari (PhD candidate at IIT Bombay; intern during summer 2016)\r\n\r\nResearchers: <a href=\"https:\/\/www.cse.iitb.ac.in\/~br\/webpage\/\">Bhaskaran Raman (IIT Bombay)<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nWe focus on the stopping behaviour of buses, especially in the vicinity of bus stops, which often leads to accidents. 
For instance, buses could arrive at a bus stop but continue rolling forward instead of coming to a complete halt, or could stop some distance away from the bus stop, possibly even in the middle of a busy road. Each of these behaviours can result in injury or worse to people waiting at a bus stop as well as to passengers boarding or alighting from buses.\r\n\r\nGPS is not accurate enough to detect such safety-related situations. Therefore, in FullStop, we use the view obtained from the rear camera of a windshield-mounted smartphone to detect safety-related situations such as a rolling stop or stopping at a location that is displaced laterally relative to the designated bus stop.\r\n\r\nFor more information, please see the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/fullstop-tracking-unsafe-stopping-behaviour-of-buses\/\">COMSNETS 2018 paper on FullStop<\/a>.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"AutoRate: Automatically Rating Driver Attentiveness\"]\r\n\r\nStudent: Isha Dua (master\u2019s candidate at IIIT Hyderabad; intern during summer 2018)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/faculty.iiit.ac.in\/~jawahar\/\">C. V. Jawahar (IIIT Hyderabad)<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nDriver inattentiveness, whether due to fatigue or distraction, is a leading cause of road accidents. Prior work has evaluated fatigue and distraction independently. In AutoRate, we leverage the front camera of a windshield-mounted smartphone to monitor the driver\u2019s attentiveness holistically. AutoRate derives a driver's attention rating by fusing several spatio-temporal features pertaining to the driver\u2019s state and actions, including head pose, eye gaze, eye closure, yawns, use of mobile phone, etc. 
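A hypothetical illustration of such feature fusion follows; AutoRate's actual rating is learned from data, so the feature names and weights below are invented purely for illustration:

```python
# Invented weights for illustration; AutoRate learns its rating model.
WEIGHTS = {
    "eyes_on_road": 0.4,   # head pose / eye gaze directed at the road
    "eyes_open":    0.3,   # low eye-closure ratio, no yawning
    "hands_free":   0.3,   # no mobile-phone use detected
}

def attention_rating(features):
    """Fuse per-feature scores in [0, 1] into one attentiveness rating."""
    return sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)

rating = attention_rating({"eyes_on_road": 1.0, "eyes_open": 1.0, "hands_free": 0.5})
# 0.4 + 0.3 + 0.15 = 0.85
```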
Our analysis shows that AutoRate\u2019s automatically generated rating has an overall agreement of 0.87 with the ratings provided by human annotators.\r\n\r\n[caption id=\"attachment_618234\" align=\"aligncenter\" width=\"904\"]<img class=\"wp-image-618234 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Autorate.png\" alt=\"\" width=\"904\" height=\"877\" \/> Driver attention score derived using AutoRate[\/caption]\r\n\r\nFor more information, please see the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/autorate-how-attentive-is-the-driver\/\">IEEE FG 2019 paper on AutoRate<\/a>.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"InSight: Driver State Monitoring in Low-light Conditions\"]\r\n\r\nStudents: Ishani Janveja (BVCE college, intern during summer 2019), Shruthi Bannur (RVCE college, intern during 2017-18), Sanchit Gupta (IIIT Delhi, intern during 2017-18), Ishit Mehta (Research Fellow, 2018-19)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nRoad accidents are more common at night than during the day. However, poor lighting at night makes it challenging to even detect the driver\u2019s face, let alone facial landmarks, using a smartphone\u2019s standard RGB camera.\r\n\r\nIn InSight, we are developing a suite of techniques spanning special-purpose hardware and deep learning to enable effective face and facial landmark detection in low-light conditions. 
For instance, we have developed a variant of the dlib library to accurately detect landmarks on facial images obtained with a FLIR camera.\r\n\r\n[caption id=\"attachment_618279\" align=\"alignnone\" width=\"500\"]<img class=\"wp-image-618279 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Lowlight_FLIR_driving.png\" alt=\"\" width=\"500\" height=\"210\" \/> Low-light image captured using a smartphone and the corresponding thermal image from a FLIR camera, along with landmarks from our model.[\/caption]\r\n\r\nStay tuned for more details!\r\n\r\n[\/panel]\r\n\r\n[panel header=\"ALT: Automating Driver License Testing\"]\r\n\r\nStudents: Anurag Ghosh (intern\/Research Fellow, 2018 onwards), Vijay Lingam (intern, 2017-18), Ishit Mehta (Research Fellow, 2018-19)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nDriver license testing is an important step in ensuring that only qualified drivers hit the road. However, testing is typically a manual process, which imposes a significant burden on the human evaluators and therefore leads to a less-than-thorough process. It also means that candidates must contend with the possibly subjective assessment made by the evaluators. The result of these constraints can be stark. 
For instance, a <a href=\"http:\/\/savelifefoundation.org\/wp-content\/uploads\/2017\/07\/Road-Safety-in-India_Public-Perception-Survey_SLF.pdf\">survey by SaveLIFE Foundation<\/a> in India reports that a whopping 59% of the respondents did not take a test to obtain a driving license.\r\n\r\n[caption id=\"attachment_618627\" align=\"alignleft\" width=\"265\"]<img class=\"wp-image-618627 \" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Gaze-300x225.png\" alt=\"\" width=\"265\" height=\"199\" \/> Auto calibration process to determine mirror scans in ALT.[\/caption]\r\n\r\nThe goal of ALT is to automate driver license testing using the standard HAMS setup --- a windshield-mounted smartphone. The front camera of the smartphone is used for a range of inward-looking tasks, including ensuring that (a) the person taking the test is the same as the one who had registered for it, (b) the driver is wearing a seatbelt, and (c) the driver scans their mirrors before effecting a turn or a lane change. To accommodate variation in the vehicle geometry, driver seating, and smartphone (and hence camera) mounting, ALT employs a <em>novel<\/em> <em>autocalibration<\/em> step to automatically learn the direction of the driver\u2019s gaze relative to the mirror positions, without requiring any manual calibration.\r\n\r\n&nbsp;\r\n\r\n[caption id=\"attachment_618630\" align=\"alignright\" width=\"315\"]<img class=\"wp-image-618630 \" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/trajectory-300x225.png\" alt=\"\" width=\"315\" height=\"236\" \/> Vehicle trajectory estimated in ALT.[\/caption]\r\n<p style=\"text-align: left\">The rear camera is used to track the trajectory of the vehicle as it is driven through various maneuvers such as parallel parking and circling a roundabout. 
This requires precise tracking to establish the driver\u2019s skill or lack thereof; for instance, determining whether the vehicle strayed outside the designated track or whether the driver stopped for longer than permitted or tried to course-correct by rolling their vehicle forward and backward alternately more times than is allowed. While visual SLAM (Simultaneous Localization and Mapping) is an attractive option for such tracking, existing approaches suffer from either a lack of accuracy or the need for the extensive deployment of markers in the environment. In ALT, we develop a novel <em>hybrid SLAM<\/em> technique, which requires a minimal deployment of markers only at the points in the track where, for instance, there is a significant scene change, say due to a sharp curve.<\/p>\r\nFor more information, please see the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/alt-towards-automating-driver-license-testing-using-smartphones\/\">ACM SenSys 2019 paper on ALT<\/a>.\r\n\r\nHAMS with ALT functionality enabled has been deployed for conducting driver license tests at Dehradun, Uttarakhand. See the <a href=\"https:\/\/news.microsoft.com\/en-in\/features\/microsoft-ai-automates-drivers-license-test-india\/\">public announcement of this project<\/a> in collaboration with the Transport Department, Government of Uttarakhand and <a href=\"http:\/\/idtr.in\/\">Institute of Driving and Traffic Research (IDTR)<\/a>. 
And here are videos introducing the project and showing automated license testing in action.\r\n\r\n[embed]https:\/\/www.youtube.com\/watch?v=XtgpWXM5Hfg[\/embed]\r\n\r\n[video width=\"1920\" height=\"1080\" mp4=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/10\/HAMS-TUTORIAL_1MBPS.mp4\"][\/video]\r\n\r\n&nbsp;\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Application: Fleet monitoring\"]\r\n\r\nStudents: Amod Agarwal (IIIT-D, intern during summer 2016), Ravi Bhandari (IITB, intern during summer 2016), Shibsankar Das (IISc, intern during summer 2016), Puneeth Meruva (MIT, intern during summer 2016), Deepak Mahendrakar, Abhishek V (PESIT, part-time intern during autumn 2016)\r\n\r\nResearchers: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/akshayn\/\">Akshay Nambi<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/padmanab\/\">Venkat Padmanabhan<\/a>\r\n\r\nMonitoring the driver and their driving is crucial to ensuring safety. One of the earliest applications we prototyped was a fleet management dashboard, which allowed a supervisor to view safety-related incidents of interest offline.\r\n\r\nhttps:\/\/youtu.be\/uO5W37t5nXU\r\n\r\n[\/panel]\r\n\r\n[\/accordion]"},{"id":1,"name":"Related items","content":"<h3>Related Projects:<\/h3>\r\n<ol>\r\n \t<li>CarTel: A Distributed Mobile Sensor Computing System (<em>MIT)<\/em><\/li>\r\n \t<li>CarSafe: A Driver Safety App that Detects Dangerous Driving Behavior using Dual-Cameras on Smartphones (<em>University of Oxford and Dartmouth College)<\/em><\/li>\r\n \t<li>SignalGuru: Leveraging mobile phones for collaborative traffic signal schedule advisory, (Princeton University, MIT)<\/li>\r\n \t<li>Driving behavior analysis for smartphone-based insurance telematics (KTH, Sweden)<\/li>\r\n<\/ol>\r\n<h3>Startups in this space:<\/h3>\r\n<a href=\"https:\/\/www.netradyne.com\/\">Netradyne<\/a>, <a href=\"https:\/\/www.samsara.com\/fleet\/dash-cam\">Samsara<\/a>, <a 
href=\"https:\/\/www.nauto.com\/\">Nauto<\/a>, <a href=\"https:\/\/zendrive.com\/\">Zendrive<\/a>, <a href=\"https:\/\/www.cmtelematics.com\/\">Cambridge Mobile Telematics<\/a>, and more."},{"id":2,"name":"Automated Driver License Testing","content":"<h3 style=\"text-align: center\">New - <a href=\"https:\/\/aka.ms\/hams-dashboard\" target=\"_blank\" rel=\"noopener\">HAMS Automated License Testing Dashboard<\/a> is Live!<\/h3>\r\nInadequate driver skills, together with apathy towards or a lack of awareness of safe driving practices, are key contributing factors in poor road safety. The problem is exacerbated by the fact that the license issuing system is broken in India, with an <a href=\"http:\/\/savelifefoundation.org\/wp-content\/uploads\/2017\/07\/Road-Safety-in-India_Public-Perception-Survey_SLF.pdf\" target=\"_blank\" rel=\"noopener\">estimated 59% of licenses issued without a test<\/a>, making it a significant societal concern. The challenges arise from capacity and cost constraints, and <a href=\"https:\/\/scholar.harvard.edu\/files\/remahanna\/files\/1_qje_driving_license.pdf\" target=\"_blank\" rel=\"noopener\">corruption<\/a> that plagues the driver testing process. While there have been efforts aimed at <a href=\"https:\/\/timesofindia.indiatimes.com\/city\/delhi\/how-automated-tests-will-ensure-only-expert-drivers-are-on-road\/articleshow\/65940721.cms\" target=\"_blank\" rel=\"noopener\">creating instrumented tracks<\/a> to automate the license test, these have been stymied by the high cost of the infrastructure (e.g., pole-mounted high-resolution cameras looking down on the tracks) and poor test coverage (e.g., inability to monitor the driver inside the vehicle).\r\n\r\nHAMS-based testing offers a compelling alternative. 
It is a <strong>low-cost system based on a windshield-mounted smartphone<\/strong>, though for reasons of scalability (i.e., handling a large volume of tests), we can offload computation to an onsite server or to the cloud. The view inside the vehicle also helps expand the test coverage. For instance, the test can verify that the driver taking the test is the same as the one who had registered for it (essential for protecting against impersonation), verify that the driver is wearing their seat belt (an essential safety precaution), and check whether the driver scans their mirrors before effecting a maneuver such as a lane change (an example of multimodal sensing, with inertial sensing and camera-based monitoring being employed in tandem).\r\n\r\n<em>HAMS-based testing allows the entire testing process to be performed without any human intervention<\/em>. A <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/HAMS-ADLT-Dehradun.pdf\" target=\"_blank\" rel=\"noopener\">test report<\/a>, together with video evidence (to substantiate the test result in case of a dispute), is produced in an automated manner within minutes of the completion of the test. This manner of testing, with the test taken by the driver alone in the vehicle (i.e., no test inspector), has proved to be a boon in the context of the physical distancing norms arising from the COVID-19 pandemic.\r\n<h3>Deployments<\/h3>\r\nTo roll out HAMS-based driver testing, we first partnered with the Government of Uttarakhand and the <a href=\"http:\/\/idtr.in\/\" target=\"_blank\" rel=\"noopener\">Institute of Driving and Traffic Research (IDTR)<\/a>, run by Maruti-Suzuki. 
Testing is conducted on a track and covers a range of parameters, including verification of driver identity, checking of the seat belt, fine-grained trajectory tracking during maneuvers such as negotiating a roundabout and performing parallel parking, and checking on mirror scanning during lane changing.\r\n<ol>\r\n \t<li><strong>HAMS-based driver license testing @ Dehradun, Uttarakhand:<\/strong> <a href=\"https:\/\/news.microsoft.com\/en-in\/features\/microsoft-ai-automates-drivers-license-test-india\/\" target=\"_blank\" rel=\"noopener\">HAMS-based license testing went live at the Regional Transport Office (RTO) in Dehradun<\/a>, the capital of Uttarakhand, in July 2019. As of 15 Feb 2021, over 10,000 automated tests have been conducted, with an accuracy of 98%. The objectivity and transparency of the automated testing process have won the praise of not just the RTO staff but also the majority of candidates, including many who failed the test. The thoroughness of HAMS-based testing is underscored by the fact that the passing rate is now only 54%, compared to over 90% under the prior manual testing.<\/li>\r\n \t<li><strong>Scaling HAMS deployments across India:<\/strong> The success in Dehradun has spurred interest in HAMS-based automated testing across India and also overseas. RFPs issued by several states have called for capabilities such as continuous driver identification, gaze tracking, and mirror scan monitoring that were not available before HAMS. 
HAMS-based testing has been <a href=\"https:\/\/youtu.be\/GFiHztV1vSU\" target=\"_blank\" rel=\"noopener\">rolled out in IDTR Aurangabad, Bihar<\/a>, and is in the process of being implemented at multiple RTOs across the country.<\/li>\r\n<\/ol>\r\n<h3>Videos<\/h3>\r\nMicrosoft CEO Satya Nadella showcased HAMS as part of his keynotes at the\u00a0<strong><a class=\"\" href=\"https:\/\/youtu.be\/zFuIP5hI4yU?t=2807\">Future Decoded Mumbai CEO Summit<\/a><\/strong>\u00a0and the\u00a0<strong><a href=\"https:\/\/youtu.be\/NqVyUc52-Dg?t=2133\">Future Decoded Bengaluru Tech Summit<\/a><\/strong>\u00a0(Feb 2020).\r\n\r\n[embed]https:\/\/youtu.be\/zFuIP5hI4yU?t=2807[\/embed]\r\n\r\n&nbsp;\r\n\r\n<strong>HAMS-based License Testing Launch Video at Dehradun, Uttarakhand:<\/strong>\r\n\r\n[embed]https:\/\/www.youtube.com\/watch?v=XtgpWXM5Hfg[\/embed]\r\n\r\n&nbsp;\r\n\r\n<strong>HAMS-based License Testing Overview at Dehradun, Uttarakhand:<\/strong>\r\n\r\n[embed]https:\/\/www.youtube.com\/watch?v=0b16GBIoUE0[\/embed]\r\n\r\n&nbsp;\r\n\r\n<strong>HAMS-based License Testing at Aurangabad, Bihar<\/strong> (the commentary is in Hindi, but HAMS is spelled out at <a href=\"https:\/\/youtu.be\/GFiHztV1vSU?t=83\">1:26<\/a> and <a href=\"https:\/\/youtu.be\/GFiHztV1vSU?t=125\">2:07<\/a> in the video):\r\n\r\n[embed]https:\/\/youtu.be\/GFiHztV1vSU[\/embed]\r\n\r\n&nbsp;"}],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Akshay Nambi","user_id":38169,"people_section":"Group 1","alias":"akshayn"},{"type":"user_nicename","display_name":"Venkat Padmanabhan","user_id":33180,"people_section":"Group 
1","alias":"padmanab"}],"msr_research_lab":[199562],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/320399","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":31,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/320399\/revisions"}],"predecessor-version":[{"id":862911,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/320399\/revisions\/862911"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/618282"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=320399"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=320399"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=320399"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=320399"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=320399"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}