Will we see an end to predictive policing?
Organisations like Amnesty International and Liberty are challenging police forces over the ethics of using big data to predict criminality. But will these challenges bring an end to predictive policing?
Predictive policing involves using big data and algorithms to forecast and possibly prevent crime.
This is achieved by analysing data sets of information about past crimes, including geographical location, details about perpetrators, and information about victims.
Computer software can find patterns within this mass of data and predict when, where, and by whom similar crimes may be committed.
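To make this concrete, here is a toy sketch of the kind of pattern-finding such software performs: score each area (grid cell) by a recency-weighted count of past incidents, so that recent crimes count for more, then flag the highest-scoring cells for patrols. This is purely illustrative and assumed for the sake of example; it is not PredPol's proprietary model, and the cell names, dates, and half-life parameter are invented.

```python
from collections import defaultdict
from datetime import date

def rank_hotspots(incidents, today, half_life_days=30, top_k=3):
    """Rank grid cells by recency-weighted incident counts.

    incidents: list of (grid_cell, incident_date) tuples.
    Each incident contributes 0.5 ** (age / half_life_days),
    so an incident half_life_days old counts half as much as one today.
    """
    scores = defaultdict(float)
    for cell, when in incidents:
        age_days = (today - when).days
        scores[cell] += 0.5 ** (age_days / half_life_days)  # exponential decay
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Hypothetical past incidents, tagged by grid cell
incidents = [
    ("A1", date(2019, 3, 1)), ("A1", date(2019, 3, 20)),
    ("B2", date(2018, 12, 1)),
    ("C3", date(2019, 3, 25)), ("C3", date(2019, 3, 28)), ("C3", date(2019, 2, 10)),
]

print(rank_hotspots(incidents, today=date(2019, 4, 1)))
```

Even this toy version shows why critics worry: the output simply reflects where incidents were recorded in the past, so any bias in what gets recorded feeds directly into where patrols are sent next.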
However, organisations like Amnesty International and Liberty have raised concerns over racial profiling, misuse of data, and poor implementation of findings.
So, what does this mean for the future of predictive policing?
In 2013, a RAND Corporation report examined the state of predictive policing and found that, whilst the technology theoretically has potential, it relies heavily on skillful analysis by its human users in order to be most effective.
The report suggests that skillful handling of data should involve processing and analysing the information at the point of collection, and again before it is entered into the program, followed by further analysis of the predictive results.
From that point, a decision must be made on how to act on the findings, followed by an analysis of policing actions and their impact.
This reliance on human accuracy, analysis, and accountability may be where this technology is falling short.
In November 2018, the BBC reported on how Kent Police stopped using predictive policing technology 5 years after becoming the first police force in England and Wales to trial the program.
Chris Carter, Kent Police Federation Chairman, told the BBC that the software helped as a preventative tool, allowing officers to patrol high-risk areas.
However, Carter stressed that officers did not have time to use the software, due to a lack of resources and the escalation of crime.
This is a crucial point, given that the RAND report stresses the importance of the time-consuming analysis of data throughout the predictive process and after implementation.
Amnesty International have raised concerns that, instead of predicting and preventing crime, the technology is targeting Black, Asian, and minority ethnic males in ways that have long-term consequences for their housing, education, and employment opportunities.
Amnesty have found that the data being gathered is not thoroughly analysed for relevance.
Music preference, YouTube videos, and even having been the victim of gang violence have been used as indicators of gang membership and criminality.
Once someone is classed as being in the Gangs Matrix, this information is shared with schools, Job Centres, and housing associations.
Liberty released a report in February 2019, highlighting the fact that preexisting prejudices can be programmed into algorithms.
Since the programs are commercially purchased and the algorithms themselves are considered trade secrets, Liberty claim the police have no way of knowing what forms of discrimination the algorithms may encode.
So, where does this leave predictive policing?
The future of predictive policing
Kent Police were using software from PredPol, a US-based company specialising in predictive policing programs.
PredPol is taking the lead in this area, with a Smithsonian article claiming that at least 60 police forces in the US are using their program.
And yet, despite the concerns raised by organisations like Liberty and Amnesty International, PredPol was an invited speaker at the United Nations conference Artificial Intelligence and Robotics: Reshaping the Future of Crime, Terrorism & Security on April 2, 2019.
With such prestigious involvement, it appears that PredPol and its predictive policing program are here to stay.
However, for predictive policing to be effective, police forces need to have the time, expertise, and commitment to use this technology to protect citizens and not target them for discrimination.