Over the past few years, virtual reality has experienced a remarkable resurgence. Fueled by a proliferation of consumer-level head-mounted displays and motion-tracking devices, an unprecedented quantity of immersive experiences and content has become available for both desktop and mobile VR platforms. However, the problem of locomotion – human movement through a virtual world – remains a significant practical challenge. Many of the VR applications available to date require seated use or limit body movement to a small area, relying instead on a gamepad or mouse and keyboard for movement within the virtual environment. Lacking support for natural walking, these virtual locomotion mechanisms do not fully replicate the physical and perceptual cues of the real world and consequently often fail to maintain the illusion that the user has been transported to another location. In this talk, I will introduce a number of perceptual illusions that can overcome the spatial limitations imposed by the real world. This approach, known as redirected walking, has stunning potential to fool the senses. I will present a series of perceptual experiments that convinced users they were walking along a straight path while actually traveling in a circle, or that the virtual environment was much larger than it actually was. Additionally, I will discuss algorithmic approaches that leverage these illusory techniques for the dynamic exploration of arbitrary virtual environments, enabling systems that automatically steer users away from the boundaries of the physical space as they walk through a potentially infinite virtual world.
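To give a concrete sense of the straight-path-to-circle illusion mentioned above, the sketch below simulates a simple curvature-gain model of redirected walking: a small rotation is injected into the virtual world each step, so the compensating physical path bends into a circle even though the user perceives a straight virtual walk. This is a minimal illustrative sketch, not any particular published algorithm; the function names, the fixed step length, and the chosen circle radius are all assumptions for illustration.

```python
import math

def curvature_rotation(step_length_m: float, radius_m: float) -> float:
    """Rotation (radians) to inject for one step of the given length so
    that the physical path curves along a circle of radius `radius_m`.
    Uses the arc relation: angle = arc length / radius."""
    return step_length_m / radius_m

def redirect_straight_walk(total_distance_m: float, step_m: float, radius_m: float):
    """Simulate the *physical* path of a user who walks
    `total_distance_m` straight ahead in VR while curvature
    redirection is applied each step. Returns the final physical
    (x, y) position in meters."""
    x, y, heading = 0.0, 0.0, 0.0
    walked = 0.0
    while walked < total_distance_m:
        # The user takes one physical step in their current heading...
        x += step_m * math.cos(heading)
        y += step_m * math.sin(heading)
        # ...and the injected rotation nudges their heading slightly,
        # bending the physical trajectory into a circle.
        heading += curvature_rotation(step_m, radius_m)
        walked += step_m

    return x, y
```

With a sufficiently large radius (values in the literature vary), the per-step rotation falls below the user's detection threshold; geometrically, walking one virtual circumference's worth of distance returns the user to (approximately) their physical starting point.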