It’s not uncommon for complex technologies such as facial recognition systems to produce false positives, but their false positive rates are ideally very low. A system with a false positive rate of around 90 percent would be all but useless, yet that is precisely what the Welsh police’s system has delivered.

The South Wales Police tested its facial recognition program during the Champions League Final in Cardiff, Wales last year. The system was designed to match attendees against a database of 500,000 images of persons of interest. The Guardian now reports that while the system yielded 2,470 potential matches, 2,297 of those matches were later discovered to be false positives.

Having 2,470 suspected criminals at an event attended by more than 170,000 people would hardly be an ideal situation for a law enforcement agency, so a number that high would have prompted the police to double-check. As it turned out, the system had gone overboard in trying to spot potential criminals and wrongly flagged 2,297 people, which works out to a false positive rate of roughly 92 percent.
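For anyone who wants to check the arithmetic, here is a minimal sketch of how such a rate is worked out, assuming it is simply the number of false alerts divided by the total number of alerts; the helper function name is purely illustrative, and the figures are the ones quoted above.

```python
def false_positive_rate(false_alerts: int, total_alerts: int) -> float:
    """Share of flagged matches that turned out to be wrong."""
    return false_alerts / total_alerts

# Champions League Final figures quoted in the article: 2,470 alerts, 2,297 of them false.
rate = false_positive_rate(false_alerts=2_297, total_alerts=2_470)
print(f"False positive rate: {rate:.0%}")  # prints ~93%, in the same ballpark as the reported figure
```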

Additional data obtained by Wired revealed that the same system produced a 90 percent false positive rate at a boxing match last year. At a rugby match, it again performed poorly, posting a false positive rate of 87 percent.

“Of course no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems which means false positives will continue to be a common problem for the foreseeable future,” South Wales Police said in a statement. The force added that since it introduced the system, no individual has been arrested after a false positive alert led to a police intervention, and that no members of the public have complained.
