
Court of Appeal finds police’s use of facial recognition technology “unlawful”

Source: fungaifoto (via Pixabay)

By Jack Robert Stacey

Following a recent appeal heard by the Court of Appeal in London, South Wales Police’s use of Automatic Facial Recognition (AFR) technology has been ruled “unlawful”.

The Court’s ruling comes after a legal challenge brought by Ed Bridges, a Public Affairs Manager at Cardiff University, who argued that the technology had identified and analysed his face without his consent. This was the world’s first legal challenge against police use of facial recognition technology, and was supported by Liberty, a UK-based civil rights advocacy group.

In the remote hearing, the Court said that South Wales Police had taken insufficient steps to provide guidance on the software’s operation and had failed to conduct a thorough investigation into potential biases within the system.

The Court did, however, note that the advantages of an accurate and responsive facial recognition system outweighed the interference experienced by Mr Bridges.

South Wales Police began a Home Office-sponsored trial of its current iteration of AFR technology back in 2017 and “remains completely committed” to improving the software’s overall accuracy, attesting that the technology has aided in the arrests of 61 people for a multitude of offences.

Jeremy Vaughan, a Deputy Chief Constable for South Wales Police and national policing lead for facial recognition, said that:

“This judgement will only strengthen the work which is already underway to ensure that the operational policies we have in place can withstand robust legal challenge and public scrutiny.”

Automatic Facial Recognition (AFR) technology is commonly deployed in highly populated urban areas, where cameras are able to independently identify and distinguish the faces within a sizeable crowd. Any captured images are compared against South Wales Police’s database of approximately 500,000 suspects and persons of interest; provided that there is a significant level of similarity between the images, the software alerts nearby police officers.
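In broad terms, systems of this kind convert each face into a numerical “embedding” and raise an alert when a captured face scores above a similarity threshold against a watchlist entry. The sketch below is purely illustrative — the function names, the toy three-number embeddings, and the 0.8 threshold are assumptions for the example, not details of South Wales Police’s actual system:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(captured, watchlist, threshold=0.8):
    """Return (name, score) pairs for watchlist entries the captured face resembles."""
    alerts = []
    for name, stored in watchlist.items():
        score = cosine_similarity(captured, stored)
        if score >= threshold:
            alerts.append((name, score))
    return alerts

# Hypothetical watchlist of pre-computed embeddings.
watchlist = {
    "person_a": [1.0, 0.0, 0.0],
    "person_b": [0.0, 1.0, 0.0],
}

# A captured face closely resembling person_a triggers a single alert.
alerts = match_against_watchlist([0.9, 0.1, 0.0], watchlist)
```

In a real deployment, the embeddings would come from a deep neural network and the threshold choice directly trades false alerts against missed matches — one reason the Court scrutinised how the system’s accuracy and biases were assessed.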

Crucially, most modern forms of facial recognition technology employ ‘deep learning’ to reduce the length of time it takes for the system to accurately identify human faces. As the system compares captured images against its database over time, the software learns from both its successful and failed matches, gradually improving its accuracy.

This, however, has become a pivotal point of contention between advocates and challengers of surveillance, as this type of ‘deep learning’ requires access to large sets of data, much of which comprises incorrect identifications that may be stored for up to several weeks after capture.

In reference to the Court’s ruling, Megan Goulding, a lawyer for Liberty, asserted that:

“It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”

The human rights advocacy group Liberty, acting on behalf of the interests of Mr Bridges, also established the ‘Resist Facial Recognition’ petition which prompts the UK government to act against “intimidating and intrusive” technologies and has collected almost 50,000 signatures.

Liberty’s petition highlights San Francisco’s historic, city-wide ban on the use of facial recognition and contends that the UK “must follow their lead”.

With many international committees and activist groups calling for a reduced surveillance and greater level of transparency in the use of collected data, the Court’s verdict could significantly alter the UK’s future operation and development of AFR technology.
