Criminal Liability for Algorithmic Bias in Predictive Policing Systems in China

Case 1: Algorithmic Bias Leading to Discriminatory Police Stops

Facts:
A predictive policing system used by the Guangzhou Public Security Bureau flags individuals from certain neighborhoods (mainly migrant worker communities) as “high-risk” for potential criminal activity based on statistical patterns in historical data. A police officer, acting on the system’s output, stops a migrant worker, Li Wei, and conducts an unwarranted search. Li Wei is detained overnight but later released without charges.
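To make the mechanism concrete, the following sketch is purely illustrative Python; the actual Guangzhou system is not publicly documented, and every neighborhood name, arrest rate, weight, and threshold below is invented. It shows how a score dominated by a neighborhood-level base rate flags a resident with no record while passing over someone with a record who lives elsewhere:

```python
# Hypothetical illustration only: a risk score in which a neighborhood-level
# historical arrest rate is the dominant feature, so the score tracks where
# a person lives rather than anything they did. All data are invented.

HISTORICAL_ARREST_RATE = {          # arrests per 1,000 residents (invented)
    "migrant_worker_district": 38.0,
    "downtown_district": 9.0,
}

def risk_score(neighborhood: str, prior_record: bool) -> float:
    """Toy score: 90% neighborhood base rate, 10% individual record."""
    base = HISTORICAL_ARREST_RATE.get(neighborhood, 5.0)
    return 0.9 * (base / 40.0) + 0.1 * (1.0 if prior_record else 0.0)

THRESHOLD = 0.5  # stops are triggered above this score

# Li Wei has no record, but his neighborhood alone pushes him over the line.
li_wei = risk_score("migrant_worker_district", prior_record=False)
other_resident = risk_score("downtown_district", prior_record=True)

print(f"Li Wei (no record): {li_wei:.2f} -> flagged: {li_wei > THRESHOLD}")
print(f"Downtown resident (prior record): {other_resident:.2f} "
      f"-> flagged: {other_resident > THRESHOLD}")
```

The point for the legal analysis is that such a score encodes group statistics rather than individualized suspicion, which is precisely the defect raised in the issues below.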

Legal Issues:

Abuse of Power (滥用职权罪): The officer acted on algorithmic recommendations that unfairly targeted a specific group of people, without any tangible evidence of Li Wei’s involvement in criminal activity.

Right to Privacy (隐私权): Li Wei’s personal information (travel history, residency status) was used to generate the prediction without his consent or any judicial oversight.

Discriminatory Treatment: The algorithm showed a pattern of socio-economic and regional bias, disproportionately flagging migrant populations as “high-risk” without individualized suspicion.

Criminal Liability:
In this case, the police officer could be held criminally liable for abuse of power if it is found that he used the algorithm’s output to carry out an unwarranted search and detention based on flawed or discriminatory data. The algorithm’s bias, combined with the officer’s failure to exercise independent judgment, could support a finding of negligence. Additionally, the designers of the algorithm could be investigated for violations of data protection rules or for discriminatory practices under Chinese law.

Outcome:

Police Officer: Could face charges of abuse of power (滥用职权罪) if it is proven that the officer failed to assess the legality of the algorithm’s recommendations and thereby violated Li Wei’s rights.

System Designers: Developers of the predictive policing system could face disciplinary action or civil liability for failing to address known biases in the system; if the algorithm caused disproportionate harm to a specific group, they could, in principle, face negligence-based criminal charges under China’s Criminal Law.

Case 2: False Arrests Due to Misleading Algorithmic Risk Scores

Facts:
In Beijing, the police use a predictive algorithm that assesses the likelihood of repeat offending. The system flags Zhao Jun, a 32-year-old factory worker with a criminal record for petty theft, as high-risk for committing future crimes and recommends increased surveillance, which leads to his arrest on the strength of a flawed risk prediction. Zhao is detained for two days before being released without charges.
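One plausible way such a false positive arises is sketched below. The weights and threshold are invented, and the real Beijing system’s features are not public: a score built only from record length flags Zhao, while the same toy score discounted for time since release and stable employment, the reintegration factors noted in the Legal Issues below, does not:

```python
# Hypothetical sketch: why omitting rehabilitation features produces false
# positives. Feature weights are invented and not drawn from any real system.

def naive_recidivism_score(prior_convictions: int) -> float:
    """Toy score based only on record length."""
    return min(1.0, 0.4 * prior_convictions)

def adjusted_recidivism_score(prior_convictions: int,
                              years_since_release: float,
                              stable_employment: bool) -> float:
    """Same toy score, discounted for desistance and reintegration signals."""
    score = min(1.0, 0.4 * prior_convictions)
    score -= 0.1 * years_since_release          # time without reoffending
    score -= 0.3 if stable_employment else 0.0  # steady job lowers risk
    return max(0.0, score)

THRESHOLD = 0.5

# Assume Zhao Jun has two petty-theft convictions, was released five years
# ago, and has held a factory job since (details invented for illustration).
naive = naive_recidivism_score(prior_convictions=2)
adjusted = adjusted_recidivism_score(prior_convictions=2,
                                     years_since_release=5.0,
                                     stable_employment=True)

print(f"naive score: {naive:.2f} -> flagged: {naive > THRESHOLD}")
print(f"adjusted score: {adjusted:.2f} -> flagged: {adjusted > THRESHOLD}")
```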

Legal Issues:

Wrongful Detention (非法拘禁罪): Zhao was detained based on a false positive from the predictive algorithm. The algorithm’s risk assessment did not account for his rehabilitation efforts and social reintegration.

Failure to Conduct Proper Investigation: The police acted on the system’s risk score without verifying the information or seeking corroborating evidence.

Negligence in Algorithmic Design: The designers of the predictive tool failed to ensure the system did not disproportionately flag individuals who had already served their time and were no longer involved in criminal activity.

Criminal Liability:

Police Officers: Could face criminal liability for wrongful detention under Chinese law (非法拘禁罪) if it is found that the police acted without reasonable grounds and relied solely on an unreliable or biased algorithmic score.

System Developers: If the algorithm's flaw was widespread and led to wrongful arrests, the developers could face charges for negligence or failure to ensure accuracy and fairness in predictive assessments, especially if they were aware of these issues and failed to address them.

Outcome:

Police Officer: Likely to face disciplinary measures or criminal charges for failing to verify the algorithm’s findings before acting, a failure that resulted in a wrongful detention.

Algorithm Designers: May face civil suits for negligence or be required to amend the algorithm to ensure that it accurately represents factors affecting recidivism, rather than relying on biased predictions.

Case 3: Predictive Policing Algorithm Leads to Racial Profiling and Injustice

Facts:
The police department in Shanghai implements a predictive policing system designed to identify neighborhoods at high risk for violent crime. The system disproportionately flags ethnic minority groups (e.g., Uyghurs, Mongolians) for surveillance and police stops, based on historical crime data linked to these communities. A group of Uyghur shopkeepers is harassed and wrongfully searched based on these algorithmic predictions, despite no evidence of criminal activity.
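The disparity described here is exactly what a pre-deployment audit is designed to surface. The sketch below uses invented counts, not real data, to show the kind of flag-rate comparison that would have revealed the bias before anyone was stopped:

```python
# Hypothetical audit sketch with invented counts: compare the rate at which
# the system flags members of each group. A large gap shows the system acts
# on group membership rather than individualized suspicion.
from collections import Counter

# (group, flagged) pairs as the system might log them (all numbers invented)
flags = (
    [("minority_group", True)] * 120 + [("minority_group", False)] * 380 +
    [("majority_group", True)] * 90 + [("majority_group", False)] * 2910
)

totals = Counter(group for group, _ in flags)
flagged = Counter(group for group, was_flagged in flags if was_flagged)

rates = {group: flagged[group] / totals[group] for group in totals}
for group, rate in rates.items():
    print(f"{group}: flagged {rate:.1%} of individuals")

# Ratio of flag rates between groups; here roughly 8x, a clear disparity.
ratio = max(rates.values()) / min(rates.values())
print(f"flag-rate ratio between groups: {ratio:.1f}x")
```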

Legal Issues:

Discrimination Based on Race or Ethnicity (民族歧视): The system systematically flags individuals from specific ethnic backgrounds, potentially violating the principles of equality and non-discrimination guaranteed by the Chinese Constitution; the closest criminal provision is the offense of inciting ethnic hatred or discrimination (煽动民族仇恨、民族歧视罪) under Article 249 of the Criminal Law.

Abuse of Police Power (滥用职权罪): The police officers involved rely on a biased system without questioning its results, engaging in discriminatory practices.

Violations of Human Rights (侵犯人权): The discriminatory targeting of ethnic minorities could be seen as a violation of the right to equal protection before the law and may lead to public outcry.

Criminal Liability:

Police Officers: Officers may be held criminally liable for abuse of power or discrimination if they are found to have acted on the biased predictions without considering the possibility of racial or ethnic bias in the algorithm.

Algorithm Designers: The developers of the predictive system could face criminal prosecution for discriminatory practices under Chinese law, especially if it can be shown that the system was knowingly designed or allowed to operate with inherent racial bias.

Outcome:

Police Officers: May face disciplinary action or criminal charges for acting on biased data that led to wrongful harassment and discrimination of ethnic minorities.

Algorithm Designers: Could be held criminally liable for creating or allowing a system that unfairly targets certain racial groups, especially if the system’s racial bias is found to be systemic and not rectified.

Case 4: Algorithmic Decision-Making Leads to Unlawful Mass Surveillance

Facts:
A smart city project in Chongqing uses predictive policing to track potential suspects. The algorithm processes vast amounts of data from surveillance cameras, social media, and travel records to flag individuals deemed likely to engage in criminal behavior. Zhao Hong, a local journalist, is flagged as a “high-risk” individual based on patterns that appear to suggest political activism, though she has never been involved in criminal activity. She is placed under surveillance and subjected to unwarranted questioning.
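The legal questions turn on how signals from cameras, social media, and travel records are fused into a single flag. The sketch below is a hypothetical illustration only; the keywords, weights, and threshold are all invented. It shows how features that in fact track journalistic or political activity can be scored as if they indicated criminal risk:

```python
# Hypothetical fusion sketch: camera, travel, and social-media signals are
# summed into one score, and keywords that really track political or
# journalistic activity are weighted as if they signalled crime. Invented data.

ACTIVITY_KEYWORDS = {"protest": 0.3, "petition": 0.25, "corruption": 0.2}

def fused_score(camera_sightings_near_incidents: int,
                social_media_posts: list[str],
                intercity_trips_last_month: int) -> float:
    score = 0.05 * camera_sightings_near_incidents
    score += 0.02 * intercity_trips_last_month
    for post in social_media_posts:
        for word, weight in ACTIVITY_KEYWORDS.items():
            if word in post.lower():
                score += weight  # activism signal mistaken for crime risk
    return score

# Zhao Hong: a journalist covering local issues, never charged with anything.
zhao_hong = fused_score(
    camera_sightings_near_incidents=1,
    social_media_posts=["Investigating a corruption case",
                        "Covering the petition office queue"],
    intercity_trips_last_month=6,
)
print(f"Zhao Hong's fused score: {zhao_hong:.2f} (flagged above 0.5)")
```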

Legal Issues:

Violation of Privacy (侵犯隐私权): Zhao’s privacy is violated when she is surveilled based solely on algorithmic predictions that lack sufficient proof of criminal intent.

Excessive Surveillance: The authorities are using the algorithm to track individuals without cause or judicial oversight, possibly infringing on Zhao’s constitutional rights to privacy and freedom of expression.

Lack of Judicial Oversight: The system operates without sufficient checks and balances, leading to potential abuse of power and rights violations; one possible form of the missing authorization check is sketched below.
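By way of contrast, the following sketch outlines one possible form of that check: an algorithmic flag alone never authorizes surveillance, which may proceed only with documented, case-specific grounds and an approval reference. The field names and workflow are illustrative assumptions, not a description of any existing system:

```python
# Hypothetical authorization gate: surveillance requires a human-supplied
# factual basis and a recorded approval, never the algorithm's flag alone.
from dataclasses import dataclass

@dataclass
class SurveillanceRequest:
    subject: str
    algorithmic_flag: bool
    articulable_facts: str   # case-specific grounds written by an officer
    approval_reference: str  # e.g. an internal approval or warrant number

def may_proceed(req: SurveillanceRequest) -> bool:
    """A flag by itself is never a sufficient basis for surveillance."""
    has_independent_basis = bool(req.articulable_facts.strip())
    has_approval = bool(req.approval_reference.strip())
    return has_independent_basis and has_approval

request = SurveillanceRequest(
    subject="Zhao Hong",
    algorithmic_flag=True,
    articulable_facts="",    # nothing beyond the algorithm's output
    approval_reference="",
)
print(may_proceed(request))  # False: the flag alone cannot justify surveillance
```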

Criminal Liability:

Police Officers: Could face charges of abuse of power (滥用职权罪), and potentially of infringing citizens’ personal information (侵犯公民个人信息罪) where personal data was unlawfully collected, if it is proven that they used the predictive system to infringe upon Zhao’s rights without following proper legal procedures.

Algorithm Designers: If the surveillance system’s design disproportionately targets certain individuals (e.g., journalists, activists), the designers could face criminal liability under Chinese law for failure to implement safeguards against wrongful surveillance.

Outcome:

Police Officers: Likely to face disciplinary action or criminal prosecution for surveilling Zhao without legal justification.

Algorithm Developers: May face charges for failing to ensure that the surveillance system was used responsibly, with appropriate checks on human rights.

Case 5: Algorithmic Errors Lead to False Conviction in Criminal Case

Facts:
A predictive policing algorithm used by the Henan Province police incorrectly flags Wang Lei, a factory worker, as high-risk for involvement in organized crime. The police use this information to conduct a series of surveillance operations against him, which eventually lead to his wrongful arrest and conviction for a crime he did not commit. Evidence later reveals that the algorithm’s prediction was based on a flawed dataset that linked his neighborhood to crime, despite Wang’s complete lack of involvement in criminal activity.
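A basic dataset check, of the kind the facts suggest was never performed, would have exposed the problem before deployment. The sketch below uses invented counts to show one way of measuring whether the training label is effectively a proxy for neighborhood; a gap this large means the model learns past policing intensity in a district rather than individual conduct:

```python
# Hypothetical validation sketch with invented data: if the positive label
# is far more common in one district for reasons unrelated to individual
# conduct, a model trained on it will flag residents of that district.

training_rows = (
    [{"neighborhood": "wang_lei_district", "label": 1}] * 45 +
    [{"neighborhood": "wang_lei_district", "label": 0}] * 55 +
    [{"neighborhood": "other_districts", "label": 1}] * 10 +
    [{"neighborhood": "other_districts", "label": 0}] * 890
)

def positive_label_rate(rows, neighborhood):
    subset = [row for row in rows if row["neighborhood"] == neighborhood]
    return sum(row["label"] for row in subset) / len(subset)

for district in ("wang_lei_district", "other_districts"):
    rate = positive_label_rate(training_rows, district)
    print(f"{district}: positive-label rate {rate:.1%}")

# Roughly a 40x gap between districts: a red flag that the label encodes
# where enforcement happened, not who actually committed organized crime.
```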

Legal Issues:

Wrongful Conviction (冤假错案): Wang Lei’s conviction was based on flawed, biased data. The algorithm failed to account for the fact that many individuals from the same neighborhood were merely poor and working-class, not criminals.

Failure to Ensure Fair Trial: The algorithm’s findings influenced judicial decisions, possibly infringing on Wang Lei’s right to a fair trial.

Data Mismanagement: The police relied on a biased algorithm without validating its accuracy or ensuring that it was properly tested for fairness.

Criminal Liability:

Police Officers: Could face charges for misuse of power or negligence in relying on a biased algorithm that led to a wrongful conviction.

Algorithm Developers: If the algorithm’s design or data set was flawed, the developers could be criminally liable for negligence or failure to meet accuracy standards required by the public safety system.

Outcome:

Police Officers: Likely to face charges over the wrongful arrest and the conviction that followed, particularly if they failed to follow standard procedures to verify the algorithm’s predictions.

Algorithm Designers: Could face legal repercussions if they failed to ensure that the algorithm was free from bias and accurately reflected the reality of criminal activity.

Conclusion:

These are hypothetical scenarios where criminal liability for algorithmic bias in predictive policing could be explored under Chinese criminal law. Although current case law in China does not yet fully address algorithmic bias in predictive policing, these cases highlight potential issues of discrimination, wrongful detention, and abuse of power that could arise as predictive technologies become more widely used in law enforcement. As China continues to develop its regulatory framework around artificial intelligence and data protection, the legal accountability of public security officers and algorithm designers will become an increasingly important issue.
