Imagine if you could look at someone and see their heartbeat.
Surgeons could see whether a transplanted organ or tissue has blood flowing to it. Physical trainers would be able to see if athletes are in their “zone”. People could increase their awareness of their physiological impact on others.
Using augmented reality, we built a pair of glasses that does exactly that.
Our application runs on the Microsoft HoloLens without tethering the device or any additional hardware. The system combines a front-facing camera, remote imaging photoplethysmography software, and a heads-up display, allowing users to view the physiological state of a person simply by looking at them.
We augment the appearance of the subject with the blood flow signal. Augmenting the real world with physiological signals has a key advantage: holograms can be displayed on the objects or people of interest without interfering with other elements.
The heart rate of the person being viewed is displayed at the top of the facial region. A semi-transparent mesh augments the skin in real time. An optional pulse wave plot can be displayed below the facial region. Version 2.0 will allow visualization of arousal/stress from changes in HRV parameters.
How it works
- Face tracking and skin segmentation algorithms are used to locate the regions of interest (ROIs) in incoming frames from the front-facing camera.
- Remote imaging photoplethysmography (iPPG) is used to recover the blood volume pulse, heart rate, and heart rate variability of the person you are looking at. Imaging PPG is an advanced set of computer vision methods that enables measurement using just the camera and ambient light; it does not require calibration.
- We augment the appearance of the subject with the blood flow signal, altering the brightness of the facial skin by displaying a semi-transparent mask. We developed a holographic overlay that shows the pulse signal superimposed on the face using a linear image-processing pipeline.
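The core iPPG step above can be sketched as follows. This is a minimal Python illustration (the actual system is implemented in C# on the device): it assumes the ROI step has already produced a per-frame mean of the skin's green channel, band-passes that trace to the plausible pulse range, and reads the heart rate off the dominant frequency. The signal here is synthetic; the filter choices are illustrative, not the project's exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_hr(green_means, fps):
    """Estimate heart rate (BPM) from a per-frame mean green-channel trace."""
    x = green_means - np.mean(green_means)
    # Band-pass 0.7-4.0 Hz (42-240 BPM), the plausible human pulse range.
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    x = filtfilt(b, a, x)
    # The heart rate is the dominant frequency of the filtered trace.
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    hr_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * hr_hz

# Synthetic stand-in for the ROI trace: a 72 BPM pulse at 30 fps plus noise.
rng = np.random.default_rng(0)
fps, hr_true = 30.0, 72.0
t = np.arange(0, 10, 1 / fps)
trace = 0.5 * np.sin(2 * np.pi * (hr_true / 60) * t) + 0.05 * rng.standard_normal(len(t))
print(round(estimate_hr(trace, fps), 1))  # → 72.0
```

A 10 s window at 30 fps gives a frequency resolution of 0.1 Hz (6 BPM); a real pipeline would use longer windows or spectral interpolation for finer estimates.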
We used the Microsoft HoloLens and implemented the system in C#. All processing is performed on the device, and images are captured using the front-facing camera. The device does not need to be tethered to a computer. The results demonstrated good performance in capturing users' physiological signals despite camera and head motion.
We validated Cardiolens against gold-standard contact sensor measurements. The mean absolute error was 1.62 beats-per-minute.
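For clarity, the mean absolute error is simply the average of the absolute differences between paired readings. The numbers below are hypothetical, not our study data:

```python
import numpy as np

# Hypothetical paired readings (BPM): iPPG estimates vs. contact-sensor reference.
estimated = np.array([71.0, 66.5, 80.0, 74.5])
reference = np.array([72.0, 65.0, 78.0, 75.0])

mae = np.mean(np.abs(estimated - reference))
print(mae)  # → 1.25
```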
Using a peak detection algorithm, we were able to identify the inter-beat intervals of the heart.
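A minimal sketch of this step in Python, assuming a recovered pulse waveform and using a generic peak detector (`scipy.signal.find_peaks`, not necessarily the project's algorithm): systolic peaks are located, and the spacing between consecutive peaks gives the inter-beat intervals from which HRV parameters can be derived.

```python
import numpy as np
from scipy.signal import find_peaks

def inter_beat_intervals(pulse, fps):
    """Locate systolic peaks in a pulse wave; return inter-beat intervals in seconds."""
    # Require peaks at least 0.25 s apart (i.e. no faster than 240 BPM).
    peaks, _ = find_peaks(pulse, distance=int(0.25 * fps))
    return np.diff(peaks) / fps

# Clean synthetic 72 BPM pulse wave sampled at 30 fps.
fps = 30.0
t = np.arange(0, 10, 1 / fps)
pulse = np.sin(2 * np.pi * 1.2 * t)
ibis = inter_beat_intervals(pulse, fps)
print(round(float(np.mean(ibis)), 3))  # → 0.833 (i.e. 72 BPM)
```

On real signals the interval series would vary beat to beat, and statistics over it (e.g. SDNN, RMSSD) yield the heart rate variability mentioned above.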
Presented at ACM SIGGRAPH Emerging Technologies
Selected for the SXSW Innovation Awards
Mar Gonzalez Franco