Social media platforms are plagued by bots and scammers pretending to be people they are not. Given how easy it is to create an account and upload a photo of someone else, impersonation has become a real problem. However, Instagram could be working on a way to address it using video selfies.

This is according to a tweet by Matt Navarra, who discovered a feature in Instagram that asks users to take a video selfie to confirm they are who they say they are. For those concerned about Instagram having access to these videos and using them for something else, the accompanying text says the video will never be published and will be deleted after 30 days.

It also claims that the video will not be used for facial recognition or to collect biometric data, which is in line with Meta's recent announcement that it will be shutting down its facial recognition program and deleting all associated data.

That being said, this is apparently not the first time Instagram has tried to use video selfies to confirm a user's identity. As XDA Developers reports, the company was already testing the feature in August last year but quickly rolled it back due to technical issues. Its reappearance suggests the company may have since resolved whatever problems it previously encountered.

Source: The Verge
