In 2017 Microsoft Research created AirSim (Aerial Informatics and Robotics Simulation) – an open-source robotics simulation platform. For ground vehicles, wheeled robots, aerial drones, and even static IoT devices, AirSim enabled data capture for models without costly field operations.
Over the span of five years, the open-source AirSim research project served its purpose and is now archived in anticipation of a new aerial autonomy simulation platform. Users can still access the original AirSim code, but no further updates will be made. For more information about migrating to the new platform, please visit the GitHub repo.
Read on to learn more about the AirSim research project.
Bridging the sim-to-real gap with AirSim
Microsoft AirSim (Aerial Informatics and Robotics Simulation) is an open-source robotics simulation platform. For ground vehicles, wheeled robots, aerial drones, and even static IoT devices, AirSim can capture data for models without costly field operations.
AirSim works as a plug-in to Epic Games’ Unreal Engine 4 editor, providing control over building environments and simulating difficult-to-reproduce, real-world events to capture meaningful data for AI models.
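As a rough illustration of how the plug-in is configured, AirSim reads a `settings.json` file (by default under the user's `Documents/AirSim` folder) that selects the simulation mode before the Unreal environment loads. A minimal sketch, with field names taken from AirSim's documented settings schema:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Multirotor"
}
```

Changing `"SimMode"` to `"Car"` switches the same environment to the autonomous-driving vehicle model instead of the aerial drone.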
Machine learning has become an increasingly important artificial intelligence approach in building autonomous and robotic systems. One of the key challenges with machine learning is the need for massive data sets—and the amount of data needed to learn useful behaviors can be prohibitively high. And because a new robotic system is often non-operational during the training phase, developing and debugging with real-world experiments means working with an unpredictable, and potentially unsafe, robot.
AirSim addresses both problems: the need for large training data sets and the ability to debug in a simulator. It provides a realistic simulation tool that lets designers and developers seamlessly generate the amount of training data they require. In addition, AirSim leverages current game-engine rendering, physics, and perception computation to create accurate, real-world simulations. This realism, grounded in efficiently generated ground-truth data, enables the study and execution of complex missions that would be time-consuming and/or risky in the real world. For example, collisions in a simulator cost virtually nothing, yet provide actionable information for improving the design of the system.
A toolbox for rapid prototyping, testing, and deployment
Building a data-driven robotic system such as the AirSim platform is not a trivial task. First, it must support a wide variety of software and hardware. Second, given the breakneck speed of innovation in hardware, software, and algorithms, it must be flexible enough to extend easily in multiple dimensions. The AirSim framework addresses these challenges by using a modular design.
The platform interfaces with common robotic frameworks, such as the Robot Operating System (ROS), and comes pre-loaded with a commonly used aerial robotic model, a generic sport utility vehicle for autonomous driving simulation, and several sensors. In addition, the platform enables high-frequency simulations supporting hardware- and software-in-the-loop simulations with widely supported protocols such as MAVLink. Its cross-platform (Linux and Windows) and open-source architecture is easily extensible to accommodate diverse new types of autonomous vehicles, hardware platforms, and software protocols. This architecture allows users to add custom autonomous system models and new sensors to the simulator quickly.
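As one concrete example of the hardware-in-the-loop support described above, AirSim's documentation shows connecting a PX4 flight controller over MAVLink by declaring a vehicle in `settings.json`. The sketch below uses field names from that documented schema; the exact values (vehicle name, serial transport) are illustrative assumptions:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Multirotor",
  "Vehicles": {
    "PX4": {
      "VehicleType": "PX4Multirotor",
      "UseSerial": true
    }
  }
}
```

With this configuration, the flight-control stack runs on the physical autopilot hardware while AirSim supplies simulated sensors and physics, so the same firmware that flies the real drone can be exercised safely in simulation.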
The platform is also designed to integrate with existing machine learning frameworks, including Microsoft’s newly acquired Bonsai, to generate new algorithms for perception and control tasks. Methods such as reinforcement and imitation learning, learning-by-demonstration, and transfer learning can leverage simulations and synthetically generated experiences to build realistic models.
From perception to safe control
AirSim launched with aerial drone support and is used for applications such as precision agriculture, pathogen surveillance, and weather monitoring. These systems typically use a camera to perceive the world and to plan and execute missions.
The platform enables seamless training and testing of cameras and other perception systems by using realistic renderings of the environment. These synthetically generated images can produce orders of magnitude more perception and control data than are possible with real-world data alone. If needed, custom sensors, such as infrared (IR), can be enabled.
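To sketch how a custom sensor such as IR can be enabled, AirSim's camera capture settings in `settings.json` accept an image type per capture stream (in AirSim's documented enumeration, infrared is image type 7). The resolution values below are illustrative assumptions:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Multirotor",
  "CameraDefaults": {
    "CaptureSettings": [
      {
        "ImageType": 7,
        "Width": 640,
        "Height": 480
      }
    ]
  }
}
```

The same mechanism covers other synthetic perception outputs, such as depth and segmentation streams, which is how the platform produces ground-truth-labeled training images at scale.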
Since its launch, AirSim has grown to support autonomous cars, various wheeled robots, and even static IoT devices, with applications such as camera traps and facial expression recognition. Because AirSim is a plug-in for the Unreal Engine 4 game platform, users can construct their own scenery and vehicles.
This open-source simulator, with high-fidelity physics and photo-realistic rendering, can help verify control and perception software, and potentially support certification compliance when those requirements arise. By pairing common game-development skills with robotic system designers and developers, almost any creation and scenario can transfer from simulation to the real world with the fewest possible changes.
Photo by Scott Eklund/Red Box Pictures