San Francisco will begin using a “bias mitigation tool” to automatically redact information from police reports that could identify a suspect’s race. The tool uses basic artificial intelligence techniques to reduce bias in charging decisions, preventing prosecutors from being influenced by racial bias when deciding whether or not to charge a suspect. The city will start using the tool on July 1st.

The tool goes beyond automatically redacting descriptions of race. It will also scrub details such as hair and eye color, along with the names of people, locations, and neighborhoods that might consciously or unconsciously suggest to the prosecutor that the suspect belongs to a certain race.
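The article doesn’t describe how the tool works internally, but the redaction behavior it reports can be illustrated with a toy sketch. Everything below is hypothetical: the word lists stand in for whatever models or dictionaries the real tool uses, and the placeholder labels are invented for illustration.

```python
import re

# Hypothetical stand-ins for the real tool's (undisclosed) term lists.
APPEARANCE_TERMS = ["blond hair", "brown hair", "blue eyes", "brown eyes"]
NAMES = ["Hernandez", "Smith"]
NEIGHBORHOODS = ["Mission District", "Bayview"]

def redact(report: str) -> str:
    """Replace potentially race-identifying details with neutral placeholders."""
    for term in APPEARANCE_TERMS:
        report = re.sub(re.escape(term), "[REDACTED]", report, flags=re.IGNORECASE)
    for name in NAMES:
        report = report.replace(name, "[NAME]")
    for place in NEIGHBORHOODS:
        report = report.replace(place, "[LOCATION]")
    return report

print(redact("Suspect Hernandez, blond hair, seen in the Mission District."))
# Suspect [NAME], [REDACTED], seen in the [LOCATION].
```

A production system would need far more than keyword matching (named-entity recognition, context handling), but the sketch shows the basic idea: strip out the cues a human reader could use to infer race.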

“When you look at the people incarcerated in this country, they’re going to be disproportionately men and women of color,” said SF District Attorney George Gascón.

He noted that a name like Hernandez can immediately signal to prosecutors that a suspect is of Latino descent, which could result in a biased decision. He added that this is the “first-in-the-nation” use of the technology.

A spokesperson for the DA’s office confirmed to The Verge that the tool will also remove details about police officers, such as their badge numbers, in case an officer is known to the prosecutor; this prevents the prosecutor from being biased toward or against that officer’s report. The tool will only be used for the initial charging decision after an arrest. Prosecutors’ final decisions will still be based on the full, unredacted report.
