Researchers from MIT have come up with a clever way to amplify tiny changes in video recordings and reveal things that would otherwise be invisible to the naked eye, or would require on-body sensors to measure. For example, by looking at minute variations in skin tone caused by blood circulation, which are normally imperceptible, they can produce a new video feed that shows how fast someone's heart is beating. Interestingly, no special environment is needed, and the technique also works on existing footage: they demonstrated it on a clip of Batman in which we can see Christian Bale's heart beat.
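To give a rough idea of how pulse can be read from ordinary footage, here is a minimal sketch, not MIT's actual pipeline: it averages the green channel over an assumed skin patch in each frame, band-pass filters that signal around plausible heart-rate frequencies, and reports the dominant frequency as beats per minute. The file name and the fixed region of interest are placeholders.

```python
import numpy as np
import cv2
from scipy.signal import butter, filtfilt

cap = cv2.VideoCapture("face_clip.mp4")        # placeholder input clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = frame[100:200, 100:200, 1]           # green channel of an assumed skin patch
    signal.append(roi.mean())                  # average brightness varies slightly with blood flow
cap.release()

signal = np.asarray(signal, dtype=np.float64)
signal -= signal.mean()

# Keep only frequencies in a plausible heart-rate band (~48-180 beats per minute).
low, high = 0.8, 3.0                           # Hz
b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
filtered = filtfilt(b, a, signal)

# The dominant frequency of the filtered signal gives an estimated pulse.
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
bpm = freqs[np.argmax(spectrum)] * 60
print(f"Estimated pulse: {bpm:.0f} beats per minute")
```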

The second demo showed how motion amplification of a video of a sleeping baby could reveal whether the baby was breathing. This is particularly interesting since parents often worry about whether their infant is doing OK. It was not completely clear whether the demos were running in real time, but I don't see why they couldn't. In the long run, this could even be embedded in baby webcams, and the first demo could be integrated into a doctor's office so that your pulse, respiratory rhythm and other data visible to this technique could be gathered while you talk with the doctor. I just love this type of non-invasive tech. Is this something you would want to see deployed? [Hats off to the New York Times for sharing the video]
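For the motion side of the idea, here is a simplified sketch of what such amplification could look like, not the researchers' actual implementation: it temporally band-pass filters every pixel around an assumed breathing-rate band and adds the amplified result back into the frames, so tiny chest movements become visible. The file name, resolution, band and amplification factor are all assumptions.

```python
import numpy as np
import cv2
from scipy.signal import butter, filtfilt

cap = cv2.VideoCapture("sleeping_baby.mp4")      # placeholder input clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.resize(frame, (320, 180)).astype(np.float32))
cap.release()

video = np.stack(frames)                         # shape (T, H, W, 3)

# Assumed infant breathing band, roughly 18-60 breaths per minute.
low, high = 0.3, 1.0                             # Hz
b, a = butter(2, [low / (fps / 2), high / (fps / 2)], btype="band")
bandpassed = filtfilt(b, a, video, axis=0)       # tiny periodic changes at each pixel

alpha = 20.0                                     # amplification factor (assumed)
amplified = np.clip(video + alpha * bandpassed, 0, 255).astype(np.uint8)

# Write the amplified clip so the breathing motion is visible to the eye.
out = cv2.VideoWriter("amplified.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (320, 180))
for f in amplified:
    out.write(f)
out.release()
```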

Filed in Medical.
