Apple’s recently announced CSAM scanning feature has caused a lot of controversy. For a company that prides itself on customer privacy, the decision seems to run counter to that image, and many have protested it. Apple has since published an FAQ that it hopes will answer some of the questions users might have.

One concern people have over the feature is that governments might eventually ask Apple to start scanning for other types of imagery beyond CSAM. Some have pointed out that the tool could potentially be used to scan for other images, such as those of political opponents or of ethnic and minority groups that face persecution in their countries.

According to Apple, “Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

The company adds, “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”
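To illustrate the process Apple describes above, here is a minimal sketch of a hash-matching check with a human-review gate before any report is filed. This is not Apple’s NeuralHash or CSAM detection system; the hashing method, the database, and names like KNOWN_HASHES, human_review_confirms, and process_photo are all hypothetical stand-ins used purely to show the general flow described in the quotes.

import hashlib

# Hypothetical database of hashes of known images. Apple describes using
# perceptual hashes supplied by NCMEC and other child safety groups, not
# simple SHA-256 digests as used here for illustration.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash."""
    return hashlib.sha256(photo_bytes).hexdigest()

def human_review_confirms(photo_bytes: bytes) -> bool:
    """Placeholder for the manual review step described in the article."""
    return False  # a real reviewer would inspect the flagged match

def process_photo(photo_bytes: bytes) -> str:
    if photo_hash(photo_bytes) not in KNOWN_HASHES:
        # No match against known images: nothing is flagged or reported.
        return "no action"
    if human_review_confirms(photo_bytes):
        # A report is filed only after a human confirms the match.
        return "report filed"
    # The reviewer rejects a false match, so no report is filed.
    return "flag dismissed"

print(process_photo(b"example photo bytes"))  # -> "no action"

In this simplified flow, a photo that does not match the known-hash database produces no action at all, mirroring Apple’s statement that non-matching photos would not lead to a disabled account or a report to NCMEC.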

Of course, it remains to be seen how well Apple will stand by these claims, but for now, the FAQ will hopefully assuage some of those concerns.
