Facial Recognition Technology

The new data protection regime has created unique concerns about the implementation of cutting-edge technology in surveillance and policing. The High Court recently heard a case dealing with live facial recognition technology used by South Wales Police (SWP).

Ed Bridges of Cardiff (with the assistance of the human rights organisation Liberty) brought a claim against SWP, alleging that his rights had been violated by the force's use of facial recognition technology. He asserted that he had been recorded without his permission at least twice while using public thoroughfares in Cardiff (once while shopping and once while attending a political rally). While the use of facial scanning technology in law enforcement is not new, this case is unusual because it focused on technology that scans human faces in real time, as opposed to matching against static photographs such as those on driving licences or passports. The High Court dismissed the claim, in what will likely be seen as a boon to law enforcement agencies.

Currently, facial recognition technology has a limited presence in UK policing, used only by SWP and the Metropolitan Police Service in London. SWP used it primarily in connection with large events, such as rugby and football matches, pop concerts, and festivals, but had deployed it on at least 50 occasions by the time the High Court's judgment was handed down.

Typically, a CCTV camera scans faces in the crowd and the system extracts facial biometric information from each face in real time. That information is then compared with the biometric information of people on a specific watch list, such as those wanted by the police, compiled from photographs held on a database maintained by SWP.
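
In outline, the matching step can be thought of as comparing a numeric "embedding" of each live face against stored embeddings for everyone on the watch list, and flagging only scores above a threshold. The following is a minimal sketch in Python, not SWP's actual system: the embedding size, threshold, subject names, and random data are all illustrative assumptions.

    import numpy as np

    EMBEDDING_DIM = 128      # illustrative size for a face-embedding vector
    MATCH_THRESHOLD = 0.6    # illustrative cut-off; real systems tune this carefully

    def cosine_similarity(a, b):
        """Similarity between two embeddings, in [-1, 1]."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical watch list: identifier -> stored face embedding,
    # standing in for photographs held on the police database.
    watchlist = {
        "subject_a": np.random.randn(EMBEDDING_DIM),
        "subject_b": np.random.randn(EMBEDDING_DIM),
    }

    def check_face(live_embedding):
        """Compare one live capture against the watch list.
        Returns the best match above the threshold, or None."""
        best_name, best_score = None, MATCH_THRESHOLD
        for name, stored in watchlist.items():
            score = cosine_similarity(live_embedding, stored)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Simulated capture from the camera feed.
    capture = np.random.randn(EMBEDDING_DIM)
    match = check_face(capture)
    if match is None:
        del capture   # mirrors the stated policy: data deleted when there is no match
        print("No match: data deleted")
    else:
        print("Alert: possible match with", match)

The final branch is the design point the court later relied on: where no match is found, the biometric data is discarded immediately rather than retained.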

Among other claims, Mr Bridges asserted that the use of the technology violated the Data Protection Act 2018. Although the incidents in question occurred before the DPA 2018 came into force, both parties asked the court to decide the case as though the Act had been in force at the time of the events.

The court first determined that the use of the technology entailed sensitive processing of members of the public and interfered with privacy rights. The next question was therefore whether SWP's internal processes complied with the Data Protection Act 2018. The processing of personal data by law enforcement must adhere to six data protection principles. First, it must be lawful and fair. Next, sensitive data may only be processed where the data subject has consented or where the processing is strictly necessary for the law enforcement purpose. In each case, the authority must have an appropriate policy document in place, setting out internal policies and procedures for the processing of sensitive data. SWP was also transparent, using appropriate signage each time the technology was deployed.

The court found that there was a lawful basis for the processing of data, furthering SWP's legitimate interests in preventing and detecting crime. And although the processing interfered with the privacy rights of citizens, it was subject to sufficient legal controls, including the DPA and SWP's own published policies. Factors that helped the court reach this conclusion were that, each time the technology was used, it was for a limited time and in a specific location, and all data relating to a person was deleted immediately after processing unless there was a match.

Notably, the technology is still being trialled by SWP, and the court recognised the need to periodically review the circumstances in which recognition software is used. And despite SWP's assurance to the court that all data is deleted 'unless there is a match', the accuracy of the technology has come under fire. One study by the University of Essex found that the Met in London correctly matched just under 20 percent of its reported matches; the rest were erroneous. In other words, roughly four out of five flagged individuals were wrongly identified (and their data stayed on the system).
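
To make that arithmetic concrete, the same calculation can be run in a few lines of Python; the match counts below are hypothetical figures chosen only to be consistent with the "just under 20 percent" accuracy the study reported.

    # Hypothetical counts, consistent with the "just under 20 percent" figure.
    reported_matches = 42
    verified_correct = 8

    accuracy = verified_correct / reported_matches
    print(f"accuracy: {accuracy:.1%}")           # about 19%, just under 20%
    print(f"misidentified: {1 - accuracy:.1%}")  # about 81%, roughly 4 in 5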

The debate on the use of live facial recognition technology will no doubt continue, as the case can be appealed. The technology has already been found to interfere with privacy rights, and is ripe for abuse without strict regulation in place. Indeed, the High Court recognised this when it said "the law seeks to strike a sensible balance between the protection of private rights, on the one hand, and the public interest in harnessing new technologies to aid the detection and prevention of crime, on the other." A spokesperson for the ICO recognised this conflict as well, cautioning the police, and anyone else who wants to use the technology, that "existing data protection law and guidance still apply." But a precedent has been set, and for now, the use of such technology seems set to proliferate among police forces nationwide.
