Artificial intelligence will take on the gruelling task of scanning suspects’ phones and computers for images of child abuse within “two to three years”, sparing police officers the psychological trauma of doing it themselves.
The Metropolitan Police’s digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, already uses image recognition software but it is not sophisticated enough to spot indecent images and video, Mark Stokes, the Met’s head of digital and electronics forensics, told the Telegraph.
“We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,” he said.
“You can imagine that doing that year-on-year is very disturbing.”
The force is currently drawing up an ambitious plan to move its sensitive data to cloud providers such as Amazon Web Services, Google or Microsoft, Mr Stokes said.
This would allow specialists to harness the tech giants’ massive computing power for analytics. The Met currently uses a London-based data centre, but the sheer volume of images, along with the popularity of high-resolution video, is putting pressure on resources.
With the help of Silicon Valley providers, AI could be trained to detect abusive images “within two-three years”, Mr Stokes said.
The Met’s digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone’s computer or phone. But it has proven problematic when searching for nudity. “Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” Mr Stokes said.
“For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.”
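The desert problem Mr Stokes describes is a classic failure of colour-based nudity filters. The sketch below is hypothetical (it is not the Met’s software) and uses a well-known simple RGB skin-tone rule to show why it happens: desert sand sits inside the same colour range as skin, so a pixel-colour filter cannot tell the two apart.

```python
# Hypothetical illustration of a naive colour-based nudity filter.
# This is NOT the Met's actual software; it uses a widely cited simple
# RGB skin-tone rule to show why desert photos trigger false positives.

def looks_like_skin(r, g, b):
    """Crude per-pixel RGB skin-tone test used by early nudity filters."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def flagged_as_indecent(pixels, threshold=0.4):
    """Flag an image if more than `threshold` of its pixels look like skin."""
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) > threshold

# A uniform desert-sand "screensaver": sand (~RGB 194, 178, 128) passes
# the skin test, so the whole image is falsely flagged.
sand = [(194, 178, 128)] * 100
print(flagged_as_indecent(sand))  # → True: sand reads as skin colour
```

Distinguishing sand from skin requires learned features (texture, shape, context) rather than raw colour, which is why the Met is looking to machine learning rather than refining rules like this one.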
Handing this work over to computers could spare forensics specialists, who spend their careers trawling through such pictures, from psychological strain.
But the mammoth task of moving the Met’s data into the cloud is a legal minefield due to the sensitive nature of the files the force stores.
Police staff are granted consent from the courts to store criminal images, but it is an offence for anyone else – including Amazon, Microsoft or any other cloud provider – to store them. Providers would therefore be taking on considerable legal risk by storing this material.
Storing data in the cloud is a controversial move thanks to a series of high-profile hacks, including the widespread breach of Apple’s iCloud in which several celebrities’ personal photos were stolen and distributed on the web.
But Mr Stokes said that, despite concerns, the likes of Google and Amazon might be best placed to keep police information watertight thanks to their huge profits, which, unlike government departments’ budgets, can be invested in talent, expertise and the most advanced technology.
With those concerns in mind, Mr Stokes said providers have offered some solutions, which are currently being written into a potential IT plan.
“We have been working on the terms and conditions with cloud providers, and we think we have it covered,” he added.
It is just the latest in a series of science-fiction-esque additions to the police force. It emerged in October that robocops may replace British bobbies on the streets.