A set of new interaction controls makes it easier for users to engage with applications through natural computing technology. These include “push” to select virtual objects, “grip” to pan and scroll, simultaneous recognition of up to four hand pointers, and updated Human Interface Guidelines. Kinect Interactions help ensure a consistent customer experience from application to application, save developers and businesses time and money, and let you focus on the unique problems you are working to solve.
Kinect Windows Presentation Foundation Controls
New Windows Presentation Foundation (WPF) controls make it easy for developers to build high-quality, interactive Kinect for Windows applications.
The Interaction Stream enables new functionality when designing Kinect for Windows applications, including:
Grip Recognition, including the ability to map hand gestures to on-screen cursors. The Kinect for Windows sensor can recognize up to four hand pointers. This allows two people to interact with both hands simultaneously and enables developers to create more complex interactions, including the ability to “zoom.”
Physical interaction zone, a defined area within which a user can keep their movements to interact with the Kinect for Windows sensor most effectively, similar to the physical space in which a person performs sign language.
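The two-handed “zoom” interaction mentioned above can be pictured as a scale factor derived from the distance between two gripped hand pointers. This is a minimal conceptual sketch, not the SDK's actual API; the function and coordinate conventions are hypothetical.

```python
import math

def distance(p, q):
    """Euclidean distance between two 2-D hand-pointer positions."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def zoom_factor(left_start, right_start, left_now, right_now):
    """Scale factor for a two-handed zoom: the ratio of the current
    distance between the gripped hands to their distance at grip start."""
    start = distance(left_start, right_start)
    if start == 0:
        return 1.0  # degenerate grip; leave the zoom unchanged
    return distance(left_now, right_now) / start

# Hands gripped 0.5 apart that spread to 1.0 apart double the zoom.
print(zoom_factor((0.0, 0.5), (0.5, 0.5), (0.0, 0.5), (1.0, 0.5)))  # 2.0
```

An application would apply this factor to its view transform while both hand pointers remain in the gripped state.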
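Conceptually, the physical interaction zone behaves like a user-relative box: a hand position inside the box normalizes to a cursor position on screen. The sketch below illustrates that mapping with made-up zone bounds and clamping; it does not reflect the SDK's actual zone geometry or API.

```python
def hand_to_cursor(hand_x, hand_y, zone_left, zone_right, zone_top, zone_bottom):
    """Map a hand position inside the interaction zone to a cursor
    position in (0..1, 0..1), clamping hands that stray outside the zone."""
    def clamp01(v):
        return max(0.0, min(1.0, v))
    cx = clamp01((hand_x - zone_left) / (zone_right - zone_left))
    cy = clamp01((hand_y - zone_top) / (zone_bottom - zone_top))
    return cx, cy

# A hand at the center of a 0.5 m-wide zone maps to mid-screen.
print(hand_to_cursor(0.0, 0.0, -0.25, 0.25, -0.25, 0.25))  # (0.5, 0.5)
```

Because the zone travels with the user, the same comfortable arm movements reach every part of the screen regardless of where the user stands.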
Updated Human Interface Guidelines
The Human Interface Guidelines provide best practices for Kinect for Windows development and design. The document shares strong designs that have been tried and tested, as well as guidance on how to avoid difficulties that may hinder your success. The guidelines have been updated to reflect best practices for the new interactions and controls. Check out the InteractionGallery-WPF sample to see some of these guidelines in practice.