The other day, Apple announced a new child safety feature that they will be introducing to iOS. It attempts to detect when a child uses the Messages app to send or receive sensitive images, with the aim of protecting children from online predators and preventing them from being manipulated or exploited.

You might assume this would be a feature exclusive to Messages, but apparently not. In a recent Q&A session with reporters, Apple revealed that bringing this feature to third-party apps would be a desirable goal. The company stated that while they have nothing to announce today, they are open to the idea of the feature somehow being implemented in third-party apps.

We’re not quite sure how that would work, but perhaps it could come in the form of an API that developers could choose to adopt in their apps. Whether developers would actually implement such an API remains to be seen. It is possible that Apple could mandate it for apps to remain in the App Store, much like how they’ve made privacy labels compulsory for new apps, but we reckon that approach probably won’t go down too well.
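To give a rough sense of what an opt-in API like that might look like from a developer’s side, here is a minimal sketch in Swift. Everything in it is hypothetical: the protocol, type, and method names are invented for illustration, and Apple has not published any such interface.

```swift
import Foundation

// Hypothetical sketch only: none of these types are real Apple APIs.
// It illustrates how a third-party messaging app might check an image
// against a system-provided, on-device sensitivity classifier before
// displaying or sending it.
protocol SensitiveImageChecking {
    /// Returns true if the image data is classified as sensitive.
    func isSensitive(_ imageData: Data) async throws -> Bool
}

/// Placeholder analyzer a developer might wrap around a system classifier,
/// if Apple ever exposed one to third-party apps.
struct HypotheticalSensitivityAnalyzer: SensitiveImageChecking {
    func isSensitive(_ imageData: Data) async throws -> Bool {
        // A real implementation would hand the data to an on-device model;
        // here we simply return false so the sketch runs.
        return false
    }
}

func handleIncomingImage(_ data: Data, using checker: SensitiveImageChecking) async {
    do {
        if try await checker.isSensitive(data) {
            // The app could blur the image and warn the user, as Messages does.
            print("Image flagged: blur it and warn the user before revealing.")
        } else {
            print("Image appears safe to display.")
        }
    } catch {
        print("Sensitivity check failed: \(error)")
    }
}
```

The appeal of an opt-in design like this is that the classification would stay on-device, and each app could decide how to respond to a flagged image.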

Companies like WhatsApp have already spoken out against Apple’s new CSAM photo scanning feature, so it seems unlikely that they’ll be too thrilled with the idea of adopting this for their own apps.

Filed in Apple > Cellphones. Source: macrumors
