Facial recognition is a common feature of photo-sharing platforms, and Apple has built it into its own Photos app. For the most part, it does a good job of picking out faces, even in group photos, but it falters when faces are partially hidden or obscured.

That seemed hard to blame Apple for: with so little data to work with, what could it do? Quite a lot, it turns out. In a post on its Machine Learning blog, Apple revealed that with iOS 15, the Photos app will be better at recognizing people even when their faces aren’t fully visible.

According to Apple, “Faces are frequently occluded or simply not visible if the subject is looking away from the camera. To solve these cases we also consider the upper bodies of the people in the image, since they usually show constant characteristics, like clothing, within a specific context. These constant characteristics can provide strong cues to identify the person across images captured a few minutes from each other.”
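To get a feel for the idea Apple describes, here is a minimal, hypothetical sketch: when a face embedding is unavailable (occluded or looking away), fall back to comparing upper-body embeddings, which stay constant over short time windows. The function names, thresholds, and toy vectors are illustrative assumptions, not Apple’s actual pipeline.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_person(det_a, det_b, face_thresh=0.8, body_thresh=0.9):
    """det_a/det_b are dicts with optional 'face' and 'body' embeddings.

    Hypothetical matching rule: prefer face embeddings when both are
    available; otherwise fall back to upper-body (clothing) features,
    using a stricter threshold since clothing is a weaker identity cue.
    """
    if det_a.get("face") is not None and det_b.get("face") is not None:
        return cosine(det_a["face"], det_b["face"]) >= face_thresh
    if det_a.get("body") is not None and det_b.get("body") is not None:
        return cosine(det_a["body"], det_b["body"]) >= body_thresh
    return False

# Example: the second detection has no visible face, so the comparison
# falls back to the nearly identical upper-body embedding.
front = {"face": [1.0, 0.0], "body": [0.9, 0.1]}
turned = {"face": None, "body": [0.9, 0.11]}
print(same_person(front, turned))
```

In a real system the fallback would presumably also be gated on the photos being taken close together in time, since clothing is only a reliable cue “within a specific context,” as the quote notes.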

It sounds interesting and promising. Since iOS 15 is still in beta, we haven’t had a chance to try the feature out yet, but hopefully it works as intended once the update is released later this year.

Filed in Apple > Photo-Video. Source: machinelearning.apple
