
Is Facial Recognition a Violation of our Privacy?

Posted: 3rd July 2019 by Jaya Harrar | Last updated: 4th July 2019

Like any other technological advancement, facial recognition technology (FRT) has its benefits. But while such developments can seem almost too futuristic for 2019, there is usually a caveat: an array of pressing questions still awaiting answers before the technology can be fully adopted and welcomed into society, especially when it is in the hands of authority.

And FRT is no exception. In fact, it had barely made its national debut before someone decided to challenge its use in court. Three major UK police forces (Leicestershire Police, the Metropolitan Police and South Wales Police) have been trialling the technology for a few years now, but not without criticism. Now its use is under scrutiny in court for breaching human rights. But why?

Why are the police using FRT?

FRT sounds like every Crimewatcher’s dream. The technology is supposedly more effective than CCTV because the camera creates a biometric map of each face it captures, generating a unique code that distinguishes that particular face. In short, police can compare these images of passers-by against their watch lists to see if anyone matches. Handy, huh? Merely understanding the basics of how this sophisticated tech works almost makes you feel safer: the tool allows police to scan busy crowds in shopping centres or at concerts to mitigate risk and potential danger.
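For readers curious about what that "biometric map" and "unique code" actually amount to, here is a minimal sketch of the matching step in Python. It is purely illustrative: the systems trialled by UK police forces are proprietary, and the embedding size, similarity threshold and watch list entries below are all hypothetical.

```python
# A toy sketch of the matching step described above, NOT the software used by
# any police force: real systems are proprietary, and the embedding model,
# threshold and watch list entries here are entirely hypothetical.
import numpy as np
from typing import Dict, Optional

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """How alike two face 'codes' (embedding vectors) are, on a -1..1 scale."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_code: np.ndarray,
                            watchlist: Dict[str, np.ndarray],
                            threshold: float = 0.8) -> Optional[str]:
    """Return the watch list identity most similar to the captured face code,
    but only if it clears the similarity threshold; otherwise, no match."""
    best_name, best_score = None, threshold
    for name, stored_code in watchlist.items():
        score = cosine_similarity(face_code, stored_code)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 128-dimensional codes for two watch list entries, plus one
# noisy re-capture of the first face from a live camera feed.
rng = np.random.default_rng(0)
watchlist = {"suspect_A": rng.normal(size=128), "suspect_B": rng.normal(size=128)}
captured = watchlist["suspect_A"] + rng.normal(scale=0.1, size=128)

print(match_against_watchlist(captured, watchlist))  # expected: suspect_A
```

In a real deployment the "codes" would come from a trained face recognition model rather than random numbers, and the choice of threshold is precisely where the trade-off arises between missing suspects and producing the false matches discussed later in this article.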

And many agree. Speaking to the BBC, Chris Phillips, Former Head of the National Counter Terrorism Security Office, said: "If there are hundreds of people walking the streets who should be in prison because there are outstanding warrants for their arrest, or dangerous criminals bent on harming others in public places, the proper use of AFR [automated facial recognition] has a vital policing role."

However, Ed Bridges, an office worker who was simply trying to buy a sandwich for lunch in peace, has questioned this, claiming that the technology is a violation of privacy and a breach of human rights, data protection and equality laws.

Does FRT violate our privacy?

Some would strongly argue so. Take San Francisco: earlier this year the hub of the tech revolution became the first US city to ban the use of FRT by police and other agencies. Why? Some fear FRT is the first phase of pushing the US towards a Big Brother era, enforcing a more oppressive surveillance state.

However, other states have witnessed the advantages of facial recognition during threatening events. In Annapolis, Maryland, when a suspect refused to cooperate with police and would not be identified by his fingerprints, FRT came into play. With Maryland being one of the most aggressive states when it comes to facial recognition, it is no surprise that police opted to use it in their investigation. As Anne Arundel County Police Chief Timothy Altomare said: “We would have been much longer in identifying him and being able to push forward in the investigation without that system."

But after Ed Bridges decided to take legal action in the UK, Megan Goulding, a lawyer from the civil liberties group Liberty, which is supporting Bridges in his claim, stated that using FRT is "just like taking people's DNA or fingerprints, without their knowledge or their consent”.

To scan the suspect’s face in Maryland, the police needed to use their ‘exclusive’ access to “three million state mug shots, seven million state driver’s license photos and an additional 24.9 million mug shots from a national FBI database”, and it is not clear how many other people or agencies have access to the same information.

If we were to push aside the worry that someone could hack or scam their way in to gain access to such data, we must at least take the time to question how we feel about scanners taking our photos when we are merely going for a stroll. When the Met Police trialled the technology in London, not everyone was happy: three arrests were made on one day alone, and one man was fined £90 for a public order offence after insisting on his right to cover his face in order to avoid the cameras. When does it switch from trying to maintain your personal privacy to refusing to cooperate with the authorities? There is a fine line in play here.

And privacy is only one aspect: Liberty states, and we can argue that the aforementioned gentleman would probably agree, that the tech also breaches our freedom of expression. In a statement, the group said:

“Facial recognition technology captures the biometric data of everyone who passes the cameras, violating our right to privacy and undermining our freedom of expression.”

Goulding added to this, saying: “Facial recognition is an inherently intrusive technology that breaches our privacy rights. It risks fundamentally altering our public spaces, forcing us to monitor where we go and who with, seriously undermining our freedom of expression… It is now for police and parliamentarians to face up to the facts: facial recognition represents an inherent risk to our rights, and has no place on our streets.”

Freedom of expression aside (yes, there are still more factors to ponder), Liberty also brings to light another pressing issue:

“The technology also discriminates against women and people of colour – it disproportionately misidentifies those people, making them more likely to be subject to a police stop due to an incorrect match.”

In fact, the BBC reported that Black and minority ethnic people could be falsely identified and face questioning because the police have missed chances to test how well their systems deal with non-white faces.

With the risk of bias against certain ethnicities and races, false matches such as these have the potential to change society and the nature of our public spaces.

Adding to this, Julian Hayes, Partner at BCL Solicitors, tells us that at the 2017 Notting Hill Carnival FRT was wrong 98% of the time, risking misidentification and miscarriages of justice. He said: “To mitigate the risks posed by such tools, it’s important we fully consider the implications of their deployment by the authorities.”
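For context, a figure like "wrong 98% of the time" is usually reported as the proportion of alerts that turn out to be false matches, rather than the system's accuracy on every face it scans. The numbers in the sketch below are hypothetical and serve only to illustrate the arithmetic behind such a statistic; the actual alert counts from the Notting Hill trial are not given here.

```python
# Hypothetical alert counts, purely to illustrate how a "wrong 98% of the time"
# figure is calculated; the real Notting Hill trial numbers are not stated here.
alerts_raised = 100      # faces the system flagged as watch list matches (assumed)
genuine_matches = 2      # flags later confirmed as correct (assumed)

false_alerts = alerts_raised - genuine_matches
false_alert_rate = false_alerts / alerts_raised

print(f"{false_alerts} of {alerts_raised} alerts were false ({false_alert_rate:.0%})")
# -> 98 of 100 alerts were false (98%)
```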

From freedom of expression to discrimination, the question at hand remains: is FRT well regulated? In short, no. Dr Purshouse, a lecturer in law, argues that Parliament should set out rules governing the scope of police powers to deploy FRT surveillance in public spaces, to ensure consistency across police forces. He says: “As it currently stands, police forces trialling FRT are left to come up with divergent, and sometimes troubling, policies and practices for the execution of their FRT operations[1]."

With no legal framework in place, there is little stopping police forces from taking images from the internet or social media accounts to populate their 'watch lists'.

As Julian expands, “Algorithmic policing, with its efficiency and cost-saving potential, is undoubtedly here to stay, and if it prevents crime and assists in apprehending offenders, most people would encourage its use by law enforcement. However, if the criminal justice system is to retain the trust and confidence of society, it’s essential that we appreciate the limitations of AI in its various guises, from FRT to futuristic recidivism forecasting. The privacy implications are significant and the technology is not infallible; it must not be seen as a panacea.”

In essence, policy is the problem. There is a fine line between ensuring security and infringing citizens’ privacy when there is no legislation stating what is and is not an acceptable use of FRT. We can only wait to see how the court rules on the legal challenges to its use, and whether the government sets out policies on FRT and its public use in the years to come.

 

[1] https://www.openaccessgovernment.org/police-facial-recognition-technology/60775/

 

