You might assume that because computers have no feelings, they would be immune to bias. However, a computer is essentially a tool that does what the people who program it tell it to do, and as such, developers can bake bias into it, sometimes by accident.

A recent study by the US National Institute of Standards and Technology (NIST) found that facial recognition systems tend to exhibit biases across race and gender, misidentifying people of color at higher rates than Caucasian people.

According to the report, “For one-to-one matching, the team saw higher rates of false positives for Asian and African American faces relative to images of Caucasians. The differentials often ranged from a factor of 10 to 100 times, depending on the individual algorithm.” For example, the study found that Microsoft’s facial recognition tech produced nearly 10 times more false positives for women of color than for men of color.

Microsoft has since said that it is reviewing the report and will hopefully make changes to address its findings. This is not the first time technology has been accused of bias. For example, not too long ago, the algorithm behind the Apple Card was found to offer men higher credit limits than women.

Source: mobilesyrup
