UK police to use facial recognition cameras across London

Scotland Yard is launching the initiative after a rise in violent crime that has seen knife crime reach a record high

Scotland Yard is rolling out the use of facial recognition cameras in London. Reuters

The UK's largest police force has rolled out facial recognition cameras across London to tackle violent crime.

It comes a day after official figures revealed knife crime was at a record high in the UK capital - with more than 70 knife killings in the last year.

London's knife crisis made international headlines last month with the murder of Omani student Mohammed bin Abdullah Al Araimi, 26, who was stabbed to death by robbers near the Knightsbridge department store Harrods.

Assistant Commissioner Nick Ephgrave said the technology was a "vital" step in tackling the violent crime epidemic.
"This is an important development for the Met and one which is vital in assisting us in bearing down on violence," he said.

"As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard. Prior to deployment we will be engaging with our partners and communities at a local level.
"We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point."

The use of live facial recognition (LFR) technology will begin on Friday.

Police say it will be intelligence-led and deployed to specific locations in London to help tackle serious crime, including serious violence, gun and knife crime and child sexual exploitation, and to help protect the vulnerable.

AC Ephgrave added: "Similar technology is already widely used across the UK, in the private sector. Ours has been trialled by our technology teams for use in an operational policing environment.
"Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic.
"Similarly if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this."

On Thursday, the Office for National Statistics revealed there were 15,080 knife offences in London last year - a two per cent rise on the previous year.

There were 71 knife killings, 98 attempted knife murders, 805 threats to kill with a blade and 166 rapes or sexual assaults by a knife-carrying attacker.
The technology, from NEC, will be deployed at specific locations where officers believe they are "most likely to locate serious offenders".

Each deployment will have a bespoke "watch list", made up of images of wanted individuals, predominantly those wanted for serious and violent offences.
At a deployment, cameras will be focused on a small, targeted area to scan passers-by.

The technology is a standalone system and is not linked to any other imaging network, such as CCTV or body-worn video cameras.
AC Ephgrave said: "We all want to live and work in a city which is safe: the public rightly expect us to use widely available technology to stop criminals.
"Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people's privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance."

The technology was first used in London at the Notting Hill Carnival in 2016 and has been used at least nine times in London.

Last year, after analysing the use of the technology, two academics from the University of Essex raised concerns about how it was being deployed and who was being added to the watch lists.

The research revealed that the cameras were only correct 19 per cent of the time in the trials they attended.

The watch lists are changed depending on the location of the cameras and the researchers questioned how up to date they were.

"Ensuring accurate and up-to-date information from across these different data sources posed a significant challenge," they write.

"Such difficulties made compliance with overall standard of good practice complex."

A number of court cases have been brought against forces using the technology.

Last year the High Court, sitting in Cardiff, ruled the technology's use was lawful.

The judges found that although automated facial recognition amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.