Benjamin Bestgen: Predicting crime



Benjamin Bestgen

Predictive policing is no longer just science fiction, as Benjamin Bestgen explains. See his last jurisprudential primer here.

Philip K. Dick’s Minority Report is a short story probably better known through its movie adaptation: three mutants with the ability to foresee crimes before they happen are hooked up to a computer system. Their visions are interpreted by computers and given to the ‘PreCrime’ police unit that tracks down and arrests the would-be perpetrators before they commit their crime.

The rationale for this system is that the threat of punishment rarely deters offenders. It is also of little comfort to a rape or murder victim that the perpetrator was caught after the deed. If crimes can be predicted accurately, would-be offenders can be stopped, counselled, treated or placed under safeguarding measures. Preventing crime is also cheaper than investigating, prosecuting, incarcerating and/or rehabilitating and paroling offenders, not to mention the financial and non-financial costs suffered by victims of crime, their families and wider society.

Attempts to predict crime and criminals are not new. The pseudoscience of phrenology sought to pinpoint criminal predispositions by analysing people’s skull structure and speculating about criminal traits in the brain.

Another pseudoscience, physiognomy, was employed by criminologist Cesare Lombroso who tried to define the physical features of “born criminals”, so criminal-looking individuals might be dealt with before they acted according to their nature.

Unsurprisingly, neither method was successful.

Predictive policing

Policework is often portrayed as reactive: an offence must have been committed or be in progress before officers can act.

But predictive policing describes a variety of analytical methods that try to reduce the risk of crime by anticipating it, making prevention easier and the use of police resources more efficient.

Predictive policing uses software informed by data mining and big-data analysis to identify potential future hotspots of criminal activity by crunching numbers on individual, socio-economic, environmental and demographic developments. For instance, the school holiday season increases the risk of burglaries in some neighbourhoods, good weather at a popular tourist beach may increase theft, and a pub releasing crowds of drunks into the night carries risks of assault and public disturbance.

By analysing information like prior arrests, convictions, social networks, employment status, age, gender, postcode, credit status or drug use, police can also consider the probability of individual offending. Victimisation patterns, such as having been repeatedly robbed or assaulted, or working in locations or professions more likely to experience crime, can help police to anticipate victims and develop risk profiles for individuals.
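For illustration, the kind of individual risk-profiling described above can be sketched as a toy weighted score. Everything here is invented for the example; no real police system is claimed to work this way, but it shows how a proxy variable such as postcode can carry bias into a seemingly neutral number:

```python
import math

# A deliberately simplified, hypothetical risk-scoring sketch -- not any
# real police system. All feature names and weights below are invented.

def risk_score(person, weights):
    """Weighted sum of a person's features, squashed to 0..1 with a logistic."""
    raw = sum(w * person.get(feature, 0) for feature, w in weights.items())
    return 1 / (1 + math.exp(-raw))

# Invented weights: "high_crime_postcode" acts as a proxy variable, so bias
# can enter the score even when protected attributes are never used directly.
weights = {"prior_arrests": 0.8, "unemployed": 0.4, "high_crime_postcode": 0.6}

person = {"prior_arrests": 2, "unemployed": 1, "high_crime_postcode": 1}
score = risk_score(person, weights)
```

Note that excluding explicitly protected attributes does not remove the bias: the postcode weight can still encode it, which is exactly the discrimination worry raised below.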

Sounds good?

Predictive policing is operational or being tested in several countries, including the US and Britain. The BBC noted that in 2019 at least 14 UK police forces used or tested predictive software to analyse future crime hotspots and potential offenders. Additional surveillance and identification measures, such as facial recognition technologies, are also being trialled, and not only by police: the private security sector uses recognition and prediction tools in shopping centres and airports, and for risk-profiling in insurance contracts.

Legally and philosophically, predictive policing raises questions:

  • How is the right to individual privacy balanced against the public interest to live in safe and crime-free environments?
  • Can or should the police or other state authorities be trusted with access to significant amounts of personal data for predictive analyses? What safeguards against abuse, e.g. for political purposes, are in place?
  • Does predictive policing undermine the presumption of innocence by risk-rating (pre-criminalising?) “probable offenders”? Could it endanger the mental health of “probable victims” if police check up more on “at risk” individuals?
  • Does predictive policing further entrench existing biases and discrimination? Any data used to make predictions risks perpetuating the biases inherent in the data or data selection, including biases against ethnic or religious groups.
  • Locations predicted to be criminal hotspots will look less attractive for investment in community infrastructure and business, thereby entrenching existing inequalities and depriving communities of chances to improve.
  • Overreliance on algorithms blurs responsibility for policing decisions, particularly where officers don’t fully understand the algorithms, the theoretical bases on which they function or how exactly to interpret the analytic results. If policing decisions are wholly or predominantly determined by predictive software, it could arguably reduce the personal responsibility of individual police officers for their actions and reasoning.

Policing is not an easy job. Officers often see their fellow humans at their lowest, most desperate, most aggressive, vulnerable and irrational moments. Officers’ own physical and mental health is routinely at risk. Police forces are often underfunded, and officers are not as well trained, paid or supported as we might wish. Other forces receive plenty of funding, but it flows mainly into weapons and practical equipment rather than salaries, psychological support or well-rounded training.

Predictive software might make a hard job a little easier. But as Minority Report demonstrated, crime prediction is not an exact science: it leaves room for error, misinterpretation, manipulation and “false positives”.

Furthermore, policing requires trust in the community that the police will use its enforcement and investigative powers with restraint, consideration, fairness and wisdom. Without such virtues, predictive policing risks becoming a much more sophisticated and sinister form of physiognomy: algorithms fed with biased data produce biased results and lead to biased use of force by agents of the state against people pre-determined to be placed on the wrong side of the law.

Benjamin Bestgen is a solicitor and notary public (qualified in Scotland). He also holds a Master of Arts degree in philosophy and tutored in practical philosophy and jurisprudence at the Goethe Universität Frankfurt am Main and the University of Edinburgh.


