
Bias in the criminal justice system: AI vs human accountability
The development of new technology means that AI is being used to counter bias in the US criminal justice system. But what about human accountability? Should the criminal justice system take responsibility for individual and institutional prejudice or hide implicit bias behind technology?
AI and bias – what can be done?
Predictive policing has frequently raised concerns about artificial intelligence (AI) and the in-built bias in algorithms, which has a negative impact on black and ethnic minority groups. However, the San Francisco District Attorney’s office plans to use AI as a way of reducing bias when deciding whether to prosecute. The innovative bias-reducing tool will come into effect in July 2019.
Designed by Alex Chohlas-Wood and the team at Stanford Computational Policy Lab, the algorithm will detect and redact words in police reports that could identify a suspect’s race, ethnicity, and socio-economic background. The identity of the police officers will also be removed to help promote impartiality.
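The Stanford tool's implementation has not been published, but the core idea of category-based redaction can be sketched in a few lines. The term lists and placeholder format below are purely illustrative assumptions, not the lab's actual lexicon:

```python
import re

# Hypothetical term lists for illustration only; the real tool's
# lexicon is far larger and was built by the Stanford team.
REDACT_TERMS = {
    "race": ["white", "black", "hispanic", "asian"],
    "district": ["bayview", "tenderloin", "mission"],  # assumed neighborhood names
    "officer": ["officer smith"],                      # assumed officer identifier
}

def redact(report: str) -> str:
    """Replace each flagged term with a category placeholder such as [RACE]."""
    redacted = report
    for category, terms in REDACT_TERMS.items():
        for term in terms:
            pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
            redacted = pattern.sub(f"[{category.upper()}]", redacted)
    return redacted

# Race terms, neighborhood names, and officer identifiers are all
# replaced with neutral placeholders before a charging decision is made.
print(redact("Officer Smith stopped a Black male in the Bayview district."))
```

A production system would need to handle far more than a word list — names, addresses, hair and eye color, and other proxies for race and class — which is part of why the DA's office describes the tool as being in its infancy.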
Although the tool is in its infancy, the San Francisco DA’s office is hoping that it can be developed to counter bias at the early stages of prosecution.
However, if the existence of bias within the criminal justice system is so openly acknowledged, why is technology being used to bypass this problem rather than addressing the issue directly? Why is the criminal justice system not being held accountable for prejudicial behavior?
Understanding bias and prejudice
First, it is important to note that a certain amount of bias and pre-judgment is inherent in human functioning. The brain has to process such a wealth of information that short-cuts are necessary in order to categorise and respond to stimuli. However, when biases are formed on misinformation and become embedded within social, cultural, and institutional contexts, the subjects of those biases suffer greatly. This can be seen both in the US and the UK, where evidence shows that black and ethnic minorities are disproportionately targeted under police stop and search powers.
The SAGE Handbook of Prejudice, Stereotyping and Discrimination (2010) describes the individual subconscious, conscious, and collective expressions of prejudice as serving many functions for individuals and groups. Seeing people outside your identified group as having negative qualities, and your own group as largely positive, gives individuals increased self-esteem and affords majority groups greater competitive advantages.
And although the targets of bias feel the consequences acutely, those holding the prejudice may not be fully aware of their own bias – making implicit bias within the criminal justice system even more difficult to tackle.
Human accountability and bias – what can be done?
Steps have been taken towards addressing implicit bias in police forces across the US and the UK, but the results are not all that promising.
In 2018, the New York Police Department implemented classroom-based training to help officers identify and challenge implicit bias, preventing automatic responses that may lead to unfair targeting of black and ethnic minorities.
Such training has the potential to be successful. Research has found that when officers are encouraged to see people as individuals rather than members of a group, or when they have positive experiences with particular communities, they are more likely to view people objectively.
However, in the UK, a similar form of training implemented in 2014 has proven unsuccessful. The College of Policing used online and classroom-based training to teach officers about the nature of implicit bias; how to recognize their own bias; and how to challenge bias in order to create fairer policing. And yet reports in 2018 show that whilst the number of people being stopped and searched fell by 75% between 2010 and 2017, black people were still 8.4 times more likely to be stopped and searched than white people.
This shows that removing, or even mitigating, implicit bias in the criminal justice system is a challenge we may not currently have the skills or knowledge to meet. And yet people are still being unfairly treated every day. Using AI to mitigate some implicit bias may be the way to go in order to reduce at least some of the injustice caused by prejudice in the criminal justice system. However, it should not replace the hard work of changing ourselves and our institutions to challenge prejudice and create fairer systems.