Could ‘Precrime’ Become Reality?


A police force that can ‘see’ crime before it happens and burst in, just in the nick of time, to place would-be perpetrators in handcuffs and an eternal prison cell; cameras that track your every move – nowhere to run, nowhere to hide – and sometimes they’ve got the wrong man.

This is the fictional world of ‘Precrime’: the authoritarian police state of ‘Minority Report’, a 1950s science-fiction short story that later became a Hollywood blockbuster, and a vision that has haunted efforts at predictive policing ever since.

But is this really the ultimate destination of predictive policing?



What is predictive policing?

Police forces around the world have trialled software that uses statistical data to ‘predict’ the likelihood and location of serious crime, so that officers may intervene.

Predictive policing software applies machine-learning algorithms to existing police data, including criminal records, to find patterns in the statistics. From these patterns it can identify the areas with the highest rates of violent crime, and police forces have used this output to decide where to deploy officers and where to tighten security.
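To make the idea concrete, here is a minimal sketch of the hotspot principle: binning historical incident coordinates into grid cells and ranking the busiest ones. The coordinates, the cell size and the use of Python here are all illustrative assumptions, not a description of any particular force’s software.

```python
from collections import Counter

# Invented sample data: (latitude, longitude) of past incidents.
incidents = [
    (51.279, 1.080), (51.280, 1.081), (51.279, 1.079),
    (51.340, 0.730), (51.279, 1.082), (51.341, 0.731),
]

CELL = 0.01  # grid cell size in degrees (an assumption for this sketch)

def cell_of(lat, lon):
    """Snap a coordinate to the corner of its grid cell."""
    return (round(lat // CELL * CELL, 4), round(lon // CELL * CELL, 4))

# Count incidents per cell and rank the busiest cells.
counts = Counter(cell_of(lat, lon) for lat, lon in incidents)
for cell, n in counts.most_common(3):
    print(f"cell {cell}: {n} incidents")
```

Real systems weight recency, crime type and many other factors, but the underlying step of ranking areas by predicted risk is the same.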

Police in the UK want to use artificial intelligence to predict who is most likely to commit violent crime, but the initiative has not been well received by the public, who fear that new developments in predictive policing will lead to invasions of privacy and an authoritarian police state like the one depicted in ‘Minority Report’.

Kent Police was the first force in England and Wales to introduce a predictive policing system, back in 2013, and ran it for five years. Officers reported finding the system helpful, but the difficulty lay in proving that it had actually prevented crime. Negative public scrutiny stifled efforts to develop the approach further; after all, if the system had prevented crimes that would otherwise have been committed, how could that ever be proven? Ironically, the same paradox is discussed in ‘Minority Report’.

Between September 2010 and September 2017, the number of police officers in English and Welsh forces fell by 14%, leading to increasing workloads and pressure on officers. It was reported early this year that response times for emergency calls have doubled in some forces, alongside the admission that delayed responses reduce the chance of solving crimes such as burglaries and robberies and give criminals more time to escape. Being a victim of crime, including non-violent crime, can drastically transform and even ruin lives. It’s clear that police forces need a solution to falling numbers and slower response times; new developments in AI and investment in predictive policing could help to ease the pressure on officers and potentially reduce crime.




How could predictive policing help to reduce crime?

Identifying areas for a stronger police presence - Predictive policing software can identify areas with high rates of serious crime. The benefit of the system is that it gives police forces an overview of where a stronger police presence and deterrent is needed, helping them decide where to distribute officers and which areas to patrol.

People are more likely to commit crime if there is a weak deterrent, and not having enough police officers available to attend emergency calls destroys victims’ trust in the police force.

Intervention - The path of crime can ruin the lives of both perpetrators and victims. AI could be used to predict both future perpetrators of violent crime and likely victims by analysing existing data such as criminal records, involvement with social services, location and housing status, employment status and medical data. The benefit of such a system would be to help police forces identify people who may be at greater risk of becoming a victim of crime, or of offending, so that forces can offer early intervention in the form of social services. Early intervention could stop people from going down the wrong path or becoming homeless, and could even save lives.
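As a hedged sketch of what such risk scoring could look like (the features, the data and the logistic-regression model below are invented for illustration, not taken from any real police system), the core is a classifier trained on historical records that outputs a probability per person:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented feature matrix: one row per person, with columns such as
# [prior convictions, social-services contacts, months unemployed].
X = np.array([
    [0, 0,  0],
    [3, 2, 12],
    [1, 0,  3],
    [5, 4, 24],
    [0, 1,  1],
    [4, 3, 18],
])
# Invented labels: 1 = later involved in violent crime, 0 = not.
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Score a new, hypothetical record. The output is a risk estimate,
# a prompt for offering support services, never a finding of guilt.
new_person = np.array([[2, 1, 6]])
risk = model.predict_proba(new_person)[0, 1]
print(f"estimated risk score: {risk:.2f}")
```

The crucial design choice sits outside the model: in the intervention approach described above, a high score triggers an offer of support, not an arrest.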

Finding missing people and surveillance of newly released offenders - In the world of ‘Minority Report’, cameras that identify people by retina recognition make it possible to find anyone the moment they take public transport or enter a shop. Technology that uses AI and facial recognition software could achieve a similar end in reality, finding people reported missing as soon as they enter an urban area.

At 7TG, we have developed i7ense, a highly adaptable autonomous surveillance system. One of its main capabilities is long-range facial recognition powered by AI. Mounted in a police car, the device can recognise a missing person who comes into range and alert official personnel. i7ense can also link up with security cameras and report a missing person’s location to official personnel, freeing up officers’ time, as fewer of them would need to be out on patrol searching.
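Under the hood, face matching of this general kind usually compares embedding vectors. The sketch below shows that generic pattern only; the vectors, the threshold and the cosine-similarity test are illustrative stand-ins, not i7ense’s actual implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

MATCH_THRESHOLD = 0.8  # invented value; real systems tune this on labelled data

# Invented embeddings, standing in for what a face-recognition model
# would produce from camera frames (one fixed-length vector per face).
watchlist = {
    "missing-person-041": np.array([0.9, 0.1, 0.4]),
    "missing-person-077": np.array([0.2, 0.8, 0.5]),
}
seen_on_camera = np.array([0.88, 0.15, 0.42])

# Alert only on a confident match; anything below the threshold is ignored.
for person_id, stored in watchlist.items():
    score = cosine_similarity(stored, seen_on_camera)
    if score > MATCH_THRESHOLD:
        print(f"ALERT: possible match for {person_id} (score {score:.2f})")
```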

Additionally, i7ense could aid police forces in monitoring the location and movements of a suspect or recent offender under police supervision. Nearly half of adults (48%) are reconvicted of another offence within one year of release. Despite public concerns about privacy, this high rate of reoffending suggests that communities could benefit from tighter surveillance of recently released offenders. i7ense could alert police forces to intervene before released offenders reoffend, keeping communities safe and ex-offenders off the path of crime.
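One simple building block for this kind of location monitoring is a geofence check: compare each reported position against a supervision zone and raise an alert on entry. The coordinates and radius below are invented for illustration; this is a generic sketch, not how i7ense works internally.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Invented supervision zone: centre point and radius in metres.
ZONE_CENTRE = (51.2798, 1.0830)
ZONE_RADIUS_M = 500

def check_position(lat, lon):
    """Alert if a monitored person's reported position enters the zone."""
    if haversine_m(lat, lon, *ZONE_CENTRE) <= ZONE_RADIUS_M:
        print(f"ALERT: position ({lat}, {lon}) is inside the zone")

check_position(51.2800, 1.0832)  # inside the zone: prints an alert
check_position(51.3400, 0.7300)  # far away: no alert
```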



Dealing with potential bias

Attempts at using predictive policing systems, and even the idea itself, have proven largely unpopular with the public, who have concerns about trusting the technology.

Let’s be clear: technology has no prejudices of its own. Systems have to be programmed and trained before they can be used, so any bias they display is ultimately a human flaw, introduced through the data or the design.

The best way to keep systems unbiased is to analyse the data beforehand, assess whether it might produce skewed results, and test the technology extensively before deployment. Awareness of the ethical concerns should push programmers to proceed with caution and build fair systems that genuinely enhance law enforcement. Ensuring that officers have the training to use AI systems effectively will also be a marker of success.
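One concrete form that analysing the data beforehand can take is a simple audit of whether a model flags different groups at very different rates. The numbers below are invented and the check is deliberately minimal; a real fairness review would go much further.

```python
from collections import defaultdict

# Invented audit data: (group label, whether the model flagged the person).
predictions = [
    ("group_a", True), ("group_a", False), ("group_a", False),
    ("group_a", True), ("group_b", True), ("group_b", True),
    ("group_b", True), ("group_b", False),
]

# Compute the flag rate per group; a large gap is a red flag worth
# investigating before the system is ever used operationally.
totals, flagged = defaultdict(int), defaultdict(int)
for group, was_flagged in predictions:
    totals[group] += 1
    flagged[group] += was_flagged

rates = {g: flagged[g] / totals[g] for g in totals}
for group, rate in rates.items():
    print(f"{group}: flagged {rate:.0%} of the time")

print(f"flag-rate gap between groups: {max(rates.values()) - min(rates.values()):.0%}")
```

A gap like this does not prove bias on its own, but it tells developers exactly where to look before the system is trusted in the field.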

We think that predictive policing can be used ethically as long as it is used in the spirit in which it was intended: to predict, not to automatically assign guilt.


Jason Siera
Sales director
