Countries must do more to combat racial profiling, UN rights experts said Thursday, warning that artificial intelligence programs like facial recognition and predictive policing risked reinforcing the harmful practice.
Facial recognition. Illustration: VCG
Racial profiling is not new, but the technologies once seen as tools for bringing more objectivity and fairness to policing appear, in many places, to be making the problem worse.
"There is a great risk that [AI technologies will] reproduce and reinforce biases and aggravate or lead to discriminatory practices," Jamaican human rights expert Verene Shepherd told AFP.
She is one of the 18 independent experts who make up the UN Committee on the Elimination of Racial Discrimination (CERD), which on Thursday published guidance on how countries worldwide should work to end racial profiling by law enforcement.
The committee, which monitors compliance by 182 signatory countries to the International Convention on the Elimination of All Forms of Racial Discrimination, raised particular concern over the use of AI algorithms for so-called predictive policing and risk assessment.
The systems have been touted as a way to make better use of limited police budgets, but research suggests they can increase deployments to communities that have already been identified, rightly or wrongly, as high-crime zones.
"Historical arrest data about a neighborhood may reflect racially biased policing practices," Shepherd warned.
"Such data will deepen the risk of over-policing in the same neighborhood, which in turn may lead to more arrests, creating a dangerous feedback loop."
When artificial intelligence and algorithms are trained on biased historical data, their profiling predictions will reflect that bias.
"Bad data in, bad results out," Shepherd said.
"We are concerned about what goes into making those assumptions and those predictions."
The CERD recommendations also take issue with the growing use of facial recognition and surveillance technologies in policing.
Shepherd said the committee had received a number of complaints about misidentification by such technologies, sometimes with dire consequences, but did not provide specific examples.
The issue came to the forefront earlier in 2020 with the wrongful arrest in Detroit of an African-American man, Robert Williams, based on a flawed algorithm that identified him as a robbery suspect.
Various studies show facial recognition systems developed in Western countries are far less accurate in distinguishing darker-skinned faces, perhaps because they rely on training databases dominated by white, male faces.
"We have had complaints of such misidentification because of where the technologies are coming from, who is making them, and what samples they have in their system," Shepherd said.
"It is a real concern."
CERD is calling for countries to regulate private companies that develop, sell or operate algorithmic profiling systems for law enforcement. Countries have a responsibility to ensure that such systems comply with international human rights law, it said, stressing the importance of transparency.