Facebook teams up with London police to stop terrorist live streams

Authorities will develop software to quickly spot and filter violent imagery


Facebook and London police have teamed up to foil the live streaming of terrorist attacks, in the wake of the New Zealand mosque massacre earlier this year.

London’s Metropolitan Police said that under the initiative it would provide Facebook with footage from training exercises by its firearms command unit.

The videos will be captured on body cameras, provided by Facebook, that the firearms command officers wear during exercises.

It will allow the social media company to gather more imagery so its artificial intelligence technology can learn to identify and remove live footage of violent incidents.

The Met Police said its officers regularly train in a wide variety of scenarios including hostage situations, terrorist incidents and operations on land, public transport and water.

The footage will be combined with video Facebook is already using from law enforcement agencies in the US.

Australian white supremacist Brenton Tarrant murdered 51 worshippers at mosques in Christchurch, New Zealand, in March 2019 and streamed the assault on social media. The 17-minute video was shared widely and remained online for another 12 minutes before being taken down.

Facebook and platforms such as YouTube came under intense criticism for initially failing to detect the broadcast and then struggling to take down uploads that spread online.

In May, New Zealand's Prime Minister Jacinda Ardern and other world leaders launched the ‘Christchurch Call to Action’ against online extremism, a campaign that major platforms have joined.

"The video of the attack in Christchurch did not prompt our automatic detection systems because we did not have enough content depicting first-person footage of violent events to effectively train our machine-learning technology," Facebook said.

“The technology Facebook is seeking to create could help identify firearms attacks in their early stages and potentially assist police across the world in their response to such incidents,” said Neil Basu of the Met Police.

Officials fear live streaming terror attacks can inspire sympathisers.

“Technology that automatically stops live streaming of attacks once identified would also significantly help prevent the glorification of such acts and the promotion of the toxic ideologies that drive them,” said Mr Basu.

Facebook said it was updating its policies for dealing with extremism and online hate.

"Some of these changes predate the tragic terrorist attack in Christchurch, New Zealand, but that attack, and the global response to it in the form of the Christchurch Call to Action, has strongly influenced the recent updates to our policies and their enforcement," the social media giant said.

Elsewhere, a group of cross-party MPs, experts and activists have called on UK police and private companies to “immediately” stop using live facial recognition for public surveillance.

Campaigners say it is unnecessarily intrusive and lacks sufficient legal and parliamentary consideration. But authorities say it helps police identify and locate suspects and criminals who could be a threat to the public.

Live facial recognition systems have “the potential for discriminatory impact,” lack “a democratic mandate,” and are incompatible with human rights, said a statement by Big Brother Watch, the civil liberties group.

It was signed by a number of MPs, including the shadow interior minister Diane Abbott, as well as 25 rights, race equality and technology organisations, academics and lawyers.

It follows an investigation by Big Brother Watch, which uncovered a number of previously undisclosed police trials last year in which facial recognition systems were used in heavily populated areas of the UK.