Machines and medicine have gone hand in hand for the past couple of decades, though for the most part machines are used as tools: they can scan a person’s body, but at the end of the day it’s up to the doctor to interpret the findings. A machine can detect spots on a person’s lungs, for example, but it won’t know what they mean.

However, it seems that in the future an algorithm could help detect whether we have depression. That’s the promise of SimSensei, an ongoing research project developed by researchers at the University of Southern California. The idea is that, using a Kinect, the system will be able to “read” a person’s body language and look out for signs that could hint at depression, such as nervousness and anxiety.

It will also be able to listen to what a person is saying and, based on how they say it, gauge whether that person is suffering from depression. Speaking to Digital Trends, researcher Stefan Scherer says, “We’re focusing on aspects of speech like voice quality — from the timbre to the color of the voice: whether it’s a tense voice, a harsh voice, or a breathy voice. We want to pick up these changes and contextualize them.”
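SimSensei’s actual pipeline has not been published in detail, so as a purely illustrative sketch, here is one way a voice-quality cue like breathiness can be quantified: breathy speech is more noise-like, and spectral flatness (a standard signal-processing measure) captures exactly that. The signals, function name, and thresholds below are our own assumptions for demonstration, not anything from the USC project.

```python
# Hypothetical illustration only: SimSensei's real feature set is not public.
# Spectral flatness near 1.0 means a noise-like (breathy) signal; near 0.0
# means a tonal (clear) signal.
import numpy as np

def spectral_flatness(signal, eps=1e-12):
    """Ratio of geometric to arithmetic mean of the power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 + eps
    geometric_mean = np.exp(np.mean(np.log(spectrum)))
    arithmetic_mean = np.mean(spectrum)
    return geometric_mean / arithmetic_mean

sr = 16000                                   # sample rate in Hz
t = np.arange(sr) / sr                       # one second of audio
clear_voice = np.sin(2 * np.pi * 220 * t)    # pure tone stands in for clear voice
noise = np.random.default_rng(0).standard_normal(sr)
breathy_voice = 0.5 * clear_voice + 0.5 * noise  # added noise stands in for breathiness

print(spectral_flatness(clear_voice))    # close to 0 (tonal)
print(spectral_flatness(breathy_voice))  # noticeably higher (noise-like)
```

A real system would compute many such features frame by frame over recorded speech and feed them to a trained classifier, rather than relying on any single measure.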

We suppose that at the end of the day, seeing an actual human professional trained in these sorts of things will still be the better idea. But as Scherer points out, such technology could be bundled into a smartphone app that reminds people to smile more, or maybe even offers up some basic advice if it thinks something is wrong.

Filed in Medical.
