Bumble is a dating app in which women have to make the first move. The app has introduced a new artificial intelligence feature, called the Private Detector, that automatically detects unsolicited NSFW images from matches and warns users before they view them.

Bumble's parent company has also confirmed that its other dating apps, Badoo, Chappy, and Lumen, will get the Private Detector as well. The feature relies on an AI model trained to scan images in real time and determine whether they contain nudity or other sexually explicit content.

The detector's accuracy is said to be 98 percent. When an image is flagged, the app will prevent it from being uploaded to a user's profile. If the image is sent from one user to another in chat, it is blurred automatically until the recipient opts into viewing it.
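Bumble hasn't published implementation details, but the behavior described maps onto a simple classify-then-gate flow: score each image, block flagged uploads outright, and blur flagged chat images until the recipient opts in. The sketch below is illustrative only; `nsfw_score`, the threshold value, and the function names are all assumptions, not Bumble's actual code.

```python
# A minimal sketch of the classify-then-gate flow described above.
# `nsfw_score` is a hypothetical stand-in for the trained classifier;
# the 0.5 threshold is illustrative and not from Bumble.

from dataclasses import dataclass

NSFW_THRESHOLD = 0.5  # illustrative cutoff for flagging an image


def nsfw_score(image: bytes) -> float:
    """Placeholder for the real-time image classifier.

    A production system would run a trained model here; this stub
    returns a fixed low score so the sketch stays runnable.
    """
    return 0.0


def allow_profile_upload(image: bytes) -> bool:
    """Block flagged images from ever reaching a profile."""
    return nsfw_score(image) < NSFW_THRESHOLD


@dataclass
class ChatImage:
    data: bytes
    blurred: bool  # stays blurred until the recipient opts in


def deliver_chat_image(image: bytes) -> ChatImage:
    """Deliver a chat image, blurred if the detector flags it."""
    return ChatImage(data=image, blurred=nsfw_score(image) >= NSFW_THRESHOLD)
```

The key design point in such a flow is that flagged chat images are delivered rather than dropped: the recipient keeps control and can still choose to view the image.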

“We can detect anything: guns, apples, snakes, you name it,” said Andrey Andreev, founder of the apps' parent company. He also noted that the AI will block NSFW images of both men and women.

“The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behavior on our platforms,” Andreev said. Starting next month, every message sent through the apps will be screened by the Private Detector.
