Apple Will Be Delaying The Rollout Of Their Controversial CSAM Scanning Feature

Back in August, Apple announced that it would be rolling out a controversial feature that would scan photos for child abuse material. We say controversial because while scanning for and detecting child abuse is important and a good thing, many have expressed concern that this tool could be abused by governments to spy on their citizens, the opposition, dissidents, and more.

Despite the backlash, Apple seemed to be pushing ahead with the feature anyway, trying to justify its existence and to reassure the public that it would not be used for anything else. However, the company has since had a change of heart. In a statement, Apple announced that it will be delaying the rollout of the feature.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

This does not mean the feature is cancelled, only delayed, although it is too early to tell what kind of changes Apple will make to turn it into an easier pill to swallow. The feature was originally meant to be pushed out as part of iOS 15 and macOS Monterey, but it is now unclear when it will be released.
