One of the problems with most video conferencing devices is that the camera sits above the display, so we usually appear to be looking down rather than into the eyes of the person we're talking to. That's a problem Apple is attempting to solve with iOS 13.

Dubbed FaceTime Attention Correction, the feature adjusts the video so that it looks like both parties are making eye contact during a FaceTime call. It is currently live in the iOS 13 beta, which MacRumors got to try for themselves.

According to the report, the feature relies on augmented reality: it tracks the user's eyes and subtly warps the image so they appear to be looking into the camera rather than down at the screen. It isn't perfect just yet, though, as MacRumors found during testing that the warping becomes visible when an object is placed in front of the camera.
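
Apple hasn't published how the effect is implemented, but for the curious, here is a minimal sketch of the kind of eye-tracking data ARKit already exposes on TrueDepth-equipped iPhones, which an effect like this would presumably build on. The GazeTracker class and the logging are illustrative only; the warp itself is not shown.

```swift
import ARKit

// Illustrative sketch, not Apple's implementation: reads ARKit's per-frame
// gaze estimate from the TrueDepth camera.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called every frame; ARFaceAnchor carries eye pose estimates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is where the eyes converge, in face-anchor space.
            // An attention-correction effect could compare this with the
            // camera's position and warp the eye region to close the gap.
            let gaze = face.lookAtPoint
            print("Estimated gaze: x=\(gaze.x), y=\(gaze.y), z=\(gaze.z)")
        }
    }
}
```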

We're not sure whether this will affect users wearing glasses or hats, but the test does show how the feature works. iOS 13 is expected to be released later this year alongside the new iPhones.

Filed in Apple > Cellphones. Source: MacRumors
