A lot goes on behind the scenes when it comes to flagging and detecting illegal content online to prevent it from spreading further. Unfortunately, this work takes a toll on the human psyche, as it means there are people whose job is to scan through huge amounts of disturbing and traumatizing content just to make the web a safer place.

The good news, however, is that Google wants to help reduce the trauma these reviewers have to put themselves through, and it has recently announced the launch of an AI toolkit that will make detecting and reporting child sexual abuse material easier. Instead of comparing an image's hash against a database of other known images, the AI goes one step further: it can identify material that has never been seen before.
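To illustrate the difference between the two approaches, here is a minimal sketch. The hash database, the dummy classifier, and its scores are all hypothetical stand-ins for illustration only, not a description of Google's actual system:

```python
import hashlib

# Hypothetical database of hashes of previously catalogued images.
# Hash matching can only ever catch exact copies of images that
# have already been seen and recorded.
KNOWN_HASHES = {
    hashlib.sha256(b"previously-catalogued image bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Exact-duplicate detection via a hash lookup."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def classifier_score(image_bytes: bytes) -> float:
    """Stand-in for a trained image classifier. A real model scores
    the content of the image itself, so it can flag material that
    appears in no database. The heuristic below is a dummy."""
    return 0.99 if b"new" in image_bytes else 0.01

old_image = b"previously-catalogued image bytes"
new_image = b"new image never seen before"

print(matches_known_image(old_image))      # True: hash finds the known copy
print(matches_known_image(new_image))      # False: hash misses new material
print(classifier_score(new_image) > 0.5)   # True: classifier can still flag it
```

The key point is in the last two lines: a hash lookup fails on anything it has never catalogued, while a classifier judges the image itself.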

This means it could detect new images, which could help authorities track down offenders who are producing fresh material, rather than ones simply resharing old photos. A human will still need to do a final review, but the system will only flag the most likely candidates, reducing the number of images that reviewers have to go through.
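The triage step described above can be sketched in a few lines. The threshold, item names, and scores here are made up for illustration; the idea is simply that only the highest-scoring candidates reach a human reviewer:

```python
def triage(scored_images, threshold=0.9):
    """Keep only images whose classifier score meets the threshold,
    ordered highest score first, so human reviewers see the
    likeliest candidates and skip the rest."""
    flagged = [item for item in scored_images if item[1] >= threshold]
    return sorted(flagged, key=lambda item: item[1], reverse=True)

# Hypothetical (image_id, classifier_score) pairs.
queue = [("img_a", 0.97), ("img_b", 0.12), ("img_c", 0.93), ("img_d", 0.40)]
print(triage(queue))  # [('img_a', 0.97), ('img_c', 0.93)]
```

Out of four images, only two ever reach a person, which is the whole point of the workload reduction the article describes.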

This new AI toolkit will be free to use for both Google's corporate partners and NGOs through the company's Content Safety API.
