UK faces ‘dystopian’ future with facial-recognition AI cameras turning public spaces into ‘open air prisons’, warn privacy campaigners over new CCTV guidance given to police and councils
- They slammed proposed new CCTV guidance given to police and local councils
- Former CCTV watchdog Tony Porter has branded the latest changes ‘bare bones’
- Campaign groups also blasted the tweak and called for the practice to be scrapped
- Home Office hit back and said it empowered police and maintained public trust
Britain faces a ‘dystopian’ future with facial-recognition AI cameras turning public spaces into ‘open air prisons’, privacy campaigners have warned.
They slammed proposed new CCTV guidance given to police and councils in England and Wales to compare camera footage with a watch-list.
Former CCTV watchdog Tony Porter branded the changes to the Surveillance Camera Code of Practice ‘bare bones’ and said they offered unclear guidance.
Two campaign groups also blasted the tweak – the first in eight years – and called for the practice to be scrapped.
But the Home Office hit back, saying it empowered police and maintained public trust.
The new code would be used by local councils and police forces, and says that any impact on protected groups would be taken into account.
It says any use of the technology must be justified and proportionate, and that unused biometric data would be deleted after an authorisation process.
It adds that the categories of those placed on the watch-list would be published, along with the reasons for using the technology.
Privacy campaigners slammed proposed new CCTV guidance given to police and councils in England and Wales to compare camera footage with a watch-list (file photo)
Mr Porter, the former Surveillance Camera Commissioner, said the proposals were ‘bare bones’.
He told the BBC: ‘I don’t think it provides much guidance to law enforcement, I don’t really think it provides a great deal of guidance to the public as to how the technology will be deployed.’
The chief privacy officer for a facial-recognition supplier said the code would not cover huge organisations such as Transport for London but would cover small councils.
Liberty lawyer Megan Goulding said: ‘One year since our case led the court to agree that this technology violates our rights and threatens our liberty, these guidelines fail to properly account for either the court’s findings or the dangers created by this dystopian surveillance tool.
‘Facial recognition will not make us safer, it will turn public spaces into open-air prisons and entrench patterns of discrimination that already oppress entire communities.’
Campaigners say the use of the technology (file image) is a step too far towards a police state
Six steps behind facial recognition technology
The Metropolitan Police uses facial recognition technology called NeoFace, developed by Japanese IT firm NEC, which matches faces against a so-called watch list of offenders wanted by the police and courts for existing offences.
Cameras scan the faces in their view, measuring the structure of each face to create a digital version that is searched against the watch list.
If a match is detected, an officer on the scene is alerted, who will be able to see the camera image and the watch list image, before deciding whether to stop the individual.
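The matching step the box describes can be sketched in a few lines of code. The Python snippet below is a minimal, hypothetical illustration: each face is reduced to a numerical vector, and an alert fires when a live face lands within a distance threshold of a watch-list entry. The 128-dimension embeddings, the 0.6 threshold and the watch-list names are assumptions for the sketch, not details of NEC’s NeoFace system.

```python
import numpy as np

# Hypothetical illustration: each face is reduced to a fixed-length
# numerical vector ("embedding"); a live face matches a watch-list
# entry if the distance between vectors falls below a threshold.
# The dimensions, names and 0.6 threshold are assumptions for this
# sketch, not details of the real NeoFace system.

EMBEDDING_SIZE = 128
MATCH_THRESHOLD = 0.6

rng = np.random.default_rng(0)

# Stand-in watch list: name -> stored embedding.
watch_list = {
    "suspect_a": rng.normal(size=EMBEDDING_SIZE),
    "suspect_b": rng.normal(size=EMBEDDING_SIZE),
}

def check_face(live_embedding: np.ndarray) -> str | None:
    """Return the watch-list name of the closest match, or None."""
    best_name, best_dist = None, float("inf")
    for name, stored in watch_list.items():
        dist = float(np.linalg.norm(live_embedding - stored))
        if dist < best_dist:
            best_name, best_dist = name, dist
    # Alert an officer only when the closest entry is near enough.
    return best_name if best_dist < MATCH_THRESHOLD else None

# A face close to suspect_a's stored embedding triggers an alert ...
noisy_a = watch_list["suspect_a"] + rng.normal(scale=0.01, size=EMBEDDING_SIZE)
print(check_face(noisy_a))                          # -> "suspect_a"
# ... while an unrelated face does not.
print(check_face(rng.normal(size=EMBEDDING_SIZE)))  # -> None
```

The threshold is the operational dial in a system like this: loosening it produces more alerts like the seven innocent matches at Oxford Circus described below, while tightening it risks missing genuine suspects.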
Big Brother Watch added that the technology should be scrapped, arguing the code legalised ‘invasive surveillance technology’.
Earlier this year, figures showed facial recognition cameras scanned 13,000 people’s faces but only helped make one arrest.
The Met Police used facial recognition technology three times in London last year – but campaigners said it was ‘dangerously inaccurate and a waste of public money’.
During a search for a suspect in Oxford Circus, cameras scanned 8,600 faces against a list of 7,200 suspects.
Eight people were singled out by the technology, but seven of them were innocent, while a woman, 35, was arrested in connection with a serious assault.
In February 2020 the cameras scanned 4,600 faces against a list of 6,000 people who were wanted in Stratford, East London. Not a single suspect was identified.
The technology was used again in Oxford Circus in February, but the operation was stopped due to a technical fault after scanning an unknown number of faces.
Facial recognition software works by measuring the structure of a face, including the distance between features such as the eyes, nose, mouth and jaw.
Police are alerted if a match is found, and officers then decide whether to speak to the potential suspect.
Meanwhile last August the Court of Appeal ruled facial recognition technology deployed by South Wales Police breached privacy rights and data protection laws.
Civil rights campaigner Ed Bridges, 37, brought a legal challenge against South Wales Police arguing their use of automatic facial recognition caused him ‘distress’.
Civil rights campaigner Ed Bridges (pictured), 37, brought a legal challenge against South Wales Police arguing their use of automatic facial recognition caused him ‘distress’
He had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
Three Court of Appeal judges ruled the force’s use of AFR was unlawful, allowing Mr Bridges’ appeal on three out of five grounds he raised in his case.
In the judgment, the judges said there was no clear guidance on where AFR Locate could be used and who could be put on a watch-list.
It ruled ‘too much discretion is currently left to individual police officers’.
In October it was revealed there were around 1,000 AI scanners monitoring social distancing between pedestrians and cyclists, having originally been installed to monitor traffic in the capital.
The Home Office said the proposed changes to the Surveillance Camera Code of Practice empowered police and kept the public safe.
A spokesman said: ‘The Government is committed to empowering the police to use new technology to keep the public safe, whilst maintaining public trust, and we are currently consulting on the Surveillance Camera Code.’
‘In addition, College of Policing have consulted on new guidance for police use of LFR in accordance with the Court of Appeal judgment, which will also be reflected in the update to the code.’
HOW DOES FACIAL RECOGNITION TECHNOLOGY WORK?
Facial recognition software works by matching real-time images to a previous photograph of a person.
Each face has approximately 80 unique nodal points across the eyes, nose, cheeks and mouth which distinguish one person from another.
A digital video camera measures the distance between various points on the human face, such as the width of the nose, depth of the eye sockets, distance between the eyes and shape of the jawline.
A different smart surveillance system (pictured) that can scan 2 billion faces within seconds has been revealed in China. The system connects to millions of CCTV cameras and uses artificial intelligence to pick out targets. The military is working on applying a similar version of this with AI to track people across the country
This produces a unique numerical code that can then be linked with a matching code gleaned from a previous photograph.
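As a rough illustration of that idea, the hypothetical Python sketch below builds a ‘code’ from the pairwise distances between a handful of landmark points and compares it with one gleaned from an earlier photograph. The five landmarks and the tolerance are invented for the example; a real system would use far more nodal points (the article cites approximately 80).

```python
import numpy as np

# Minimal sketch of the "numerical code" idea described above: take
# (x, y) positions of a few facial landmarks, turn the pairwise
# distances between them into a vector, and compare that vector with
# one from a previous photograph. The landmark set and tolerance are
# illustrative assumptions only.

LANDMARKS = ["left_eye", "right_eye", "nose_tip", "mouth", "chin"]

def face_code(points: dict[str, tuple[float, float]]) -> np.ndarray:
    """Build a code from every pairwise landmark distance."""
    coords = np.array([points[name] for name in LANDMARKS])
    dists = [np.linalg.norm(coords[i] - coords[j])
             for i in range(len(coords))
             for j in range(i + 1, len(coords))]
    code = np.array(dists)
    return code / code.max()  # normalise so image scale doesn't matter

def same_person(code_a: np.ndarray, code_b: np.ndarray,
                tolerance: float = 0.05) -> bool:
    """Match if the two codes are within a small tolerance."""
    return bool(np.linalg.norm(code_a - code_b) < tolerance)

# Previous photograph vs. a new image of the same face, slightly shifted.
old = {"left_eye": (30, 40), "right_eye": (70, 40),
       "nose_tip": (50, 60), "mouth": (50, 80), "chin": (50, 100)}
new = {k: (x + 5, y + 5) for k, (x, y) in old.items()}  # same geometry

print(same_person(face_code(old), face_code(new)))  # -> True
```

Because the code is built from distances and normalised, shifting or rescaling the whole face leaves it unchanged, which is why the same person can be matched across different photographs.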
A facial recognition system used by officials in China connects to millions of CCTV cameras and uses artificial intelligence to pick out targets.
Experts believe that facial recognition technology will soon overtake fingerprint technology as the most effective way to identify people.