Public deployment of facial recognition technology is proving controversial in Europe. Activists and technologists have called the technology racially biased and have voiced privacy concerns about its use by governments and law enforcement, and Google and Alphabet CEO Sundar Pichai has backed a temporary ban on it in the European Union. Notwithstanding all this, the Metropolitan Police Service announced today, Friday, 24 January, that it will begin operational use of Live Facial Recognition (LFR) technology.
The technology will be deployed at key locations throughout the city. Using signposted cameras, the system will scan the faces of people passing through its line of sight and alert officers to potential matches with wanted criminals.
According to the Met, “this will help tackle serious crime, including serious violence, gun and knife crime, child sexual exploitation and help protect the vulnerable”.
In a tweet, the Met assured the public that any images obtained that don’t trigger a potential alert are deleted immediately — and that it’s up to officers whether they decide to stop someone based on an alert or not. The technology operates from a standalone system and isn’t linked to any other imaging platforms, such as CCTV or bodycams.
“This is an important development for the Met and one which is vital in assisting us in bearing down on violence. As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard. Prior to deployment, we will be engaging with our partners and communities at a local level,” said Nick Ephgrave, Assistant Commissioner of the Metropolitan Police.
“We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point. Similar technology is already widely used across the UK, in the private sector. Ours has been trialed by our technology teams for use in an operational policing environment.
“Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic.
“Similarly, if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this,” he continued.
Despite the Met’s insistence that the technology can be used for good, however, some critics have lambasted LFR as ineffectual and, in some cases, unlawful. In April 2019, for example, a report from the University of Essex found that the Met’s LFR technology had an inaccuracy rate of 81 percent. The previous year, the technology used by police in South Wales mistakenly identified 2,300 innocent people as potential criminals.
The European facial recognition ban that Sundar Pichai backed was under consideration just last week; it could stand for up to five years while regulators figure out how to prevent the technology from being abused. Meanwhile, privacy campaign group Big Brother Watch, supported by more than 18 UK politicians and 25 additional campaign groups, has called for a halt to adoption, citing concerns about implementation without proper scrutiny.