We can’t really blame Apple for that: if there isn’t much facial data to work with, what can the software do? It turns out Apple can do something about it. In a post on its Machine Learning blog, Apple revealed that with iOS 15, the Photos app will be better at recognizing people even when their faces aren’t fully visible.
According to Apple, “Faces are frequently occluded or simply not visible if the subject is looking away from the camera. To solve these cases we also consider the upper bodies of the people in the image, since they usually show constant characteristics, like clothing, within a specific context. These constant characteristics can provide strong cues to identify the person across images captured a few minutes apart from each other.”
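The idea Apple describes can be sketched in a few lines: prefer face embeddings when both photos contain a visible face, and otherwise fall back to comparing upper-body (clothing) features, but only for photos taken close together in time, when clothing is likely unchanged. This is a minimal illustration with hypothetical embedding vectors and made-up thresholds; the function name `same_person`, the thresholds, and the time window are our own assumptions, not Apple’s actual implementation.

```python
import math
from datetime import datetime, timedelta

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def same_person(face_a, face_b, body_a, body_b, time_a, time_b,
                face_thresh=0.8, body_thresh=0.9,
                window=timedelta(minutes=10)):
    # Prefer face embeddings when both photos have a visible face.
    if face_a is not None and face_b is not None:
        return cosine(face_a, face_b) >= face_thresh
    # Otherwise fall back to upper-body (clothing) cues, but only
    # for shots taken close together in time, when clothing is
    # likely to be constant (illustrative threshold and window).
    if abs(time_a - time_b) <= window:
        return cosine(body_a, body_b) >= body_thresh
    return False

# Two photos three minutes apart, faces occluded, similar clothing:
t1 = datetime(2021, 9, 1, 12, 0)
t2 = t1 + timedelta(minutes=3)
body1 = [1.0, 0.0, 0.2]
body2 = [0.9, 0.05, 0.25]
print(same_person(None, None, body1, body2, t1, t2))
```

Here the upper-body vectors are highly similar and the photos are minutes apart, so the sketch groups them as the same person; the same pair shot hours apart would not be matched on clothing alone.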
It sounds interesting and promising. However, since iOS 15 is still in beta and not widely available, we haven’t had a chance to try the feature out yet. Hopefully it works as intended once the update is released later this year.