Facial Recognition and GDPR

By Andrey Koptelov

The advent of facial recognition technology (FRT) has introduced a range of exciting possibilities for enhancing cybersecurity and public safety. However, the EU General Data Protection Regulation (GDPR) demands increased attention to compliance from both governmental and private organizations. When implementing computer vision and facial recognition projects, Itransition closely monitors GDPR compliance requirements regarding the use of FRT. Let’s take a closer look at those requirements.

FRT and the GDPR

The data collected by FRT is classified as biometric data, a special category of personal data whose processing is prohibited by default under the GDPR (Article 9(1)).

Unlike conventional CCTV footage, which is merely recorded and can’t be accessed without permission, FRT involves the automatic processing of biometric data. Such processing is permitted only under specific conditions, including the following:

  • The data subject has given explicit consent to the processing of their personal data (Article 9(2)(a)).
  • The processing is necessary for reasons of substantial public interest (Article 9(2)(g)).

This means that the use of FRT is possible only when a valid legal basis, such as explicit consent, is established in accordance with the GDPR. Companies looking to use FRT should secure definitive legal grounds before implementing the technology.

However, as with many other data privacy regulations, the use of FRT is riddled with intricacies. For example, as FRT went mainstream, some retail companies assumed that displaying a warning sign about the use of FRT at the store entrance was sufficient. It is not: being within a camera’s range does not mean the data subject has “manifestly made public” their biometric data (Article 9(2)(e)), nor does it amount to the explicit consent required by Article 9(2)(a).

How It Shouldn’t Be Done

In 2019, the Swedish Data Protection Authority (DPA) fined a school for violating the GDPR even though consent had been obtained. To save the thousands of hours teachers spend on attendance reporting, the school decided to run a pilot program that used FRT to track the attendance of 22 students. Interestingly enough, it was local media reports that drew the attention of the Swedish authorities. After an investigation, the DPA imposed a $20,700 fine on the school. This story reveals a number of important factors that any business or governmental body looking to implement FRT has to consider.

The first and probably most significant issue in this case is that the school didn’t obtain regulatory approval before starting the trial. Perhaps because parental consent had been secured, the school assumed that regulatory approval was unnecessary, especially for such a small-scale project. However, to stay compliant with the GDPR, organizations must consult the relevant regulatory bodies regardless of the project’s scope, scale, or intent.

Second, the school breached a fundamental principle of data processing under the GDPR: proportionality. The DPA concluded that using FRT to track pupils’ attendance was disproportionate to the purpose. In other words, processing sensitive biometric data for the sake of mere process optimization is unlikely to pass a DPA’s proportionality test.

Although the school reportedly conducted a risk analysis, the DPA deemed the assessment insufficient with regard to personal data protection and to the proportionality between the data collected and the purpose of its use. This highlights the importance of conducting a thorough data protection impact assessment (DPIA, Article 35 of the GDPR) before requesting a consultation with the respective authority.

How It Should Be Done 

Brøndby IF, a Danish football club, has successfully gained approval from its local DPA to use FRT to detect persons who have been banned from attending football matches for violating the club’s rules. The case gained significant media attention because Brøndby IF is the first private company to obtain official approval for the use of FRT in Denmark.

The club secured approval based on Article 9(2)(g) of the GDPR, which allows processing sensitive data for reasons of substantial public interest. Unsurprisingly, the football club’s security chief, Mickel Lauritsen, says that getting approval was a long and cumbersome process. About five years before the approval, the stadium’s security guards were only allowed to use written descriptions of offenders to pick them out of the crowd. After the DPA permitted the use of photos, the club began an almost three-year negotiation involving the DPA, team management, lawyers, and the technology provider Panasonic.

Even after successful deployment, Lauritsen reports that the use of the system is closely monitored to ensure it doesn’t breach the established boundaries. For example, the system is barred from internet connectivity, and photos of the people on the watchlist may be uploaded only on match day and must be deleted right afterward.
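As a rough illustration of how such a retention rule might be enforced operationally, here is a minimal Python sketch. It is a hypothetical example built only from the constraints described above; the directory layout and function names are invented and have nothing to do with Panasonic’s actual implementation.

    import shutil
    from pathlib import Path

    # Hypothetical storage location on an air-gapped host (no internet access).
    WATCHLIST_DIR = Path("/srv/frt/watchlist")

    def load_watchlist(photos: list[Path]) -> None:
        """Upload the banned persons' photos at the start of match day."""
        WATCHLIST_DIR.mkdir(parents=True, exist_ok=True)
        for photo in photos:
            shutil.copy(photo, WATCHLIST_DIR / photo.name)

    def purge_watchlist() -> None:
        """Delete every photo immediately after the match ends."""
        shutil.rmtree(WATCHLIST_DIR, ignore_errors=True)

The point is not the code itself but the policy it encodes: the biometric reference data exists on the system only for the narrow window in which it serves its approved purpose.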

The Ethical Dilemma

Any use of FRT will inevitably be met with varying degrees of skepticism. The main reason is that some principles outlined in the GDPR are vaguely defined and can be interpreted from different angles. In the case of the Danish football club, for example, it is debatable whether its use of FRT truly constitutes substantial public interest.

Jesper Lund, the chairman of the IT-Political Association of Denmark, argued that keeping certain banned persons out of a football match is not proportionate to the invasion of personal privacy. He also argued that FRT is too unreliable and inaccurate to identify persons in large crowds. Big Brother Watch’s 2018 report supports this claim, stating that a staggering 95% of ‘matches’ are false positives.

The actual accuracy of Panasonic’s system can’t currently be confirmed, although Lauritsen states that it successfully identified four persons over a 10-month period. Again, is gathering the biometric data of 14,000 people on match day justifiable for picking out four troublemakers a year? Most importantly, who decides, and by what measure?
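To see how a seemingly accurate system can still produce mostly false matches, consider a back-of-the-envelope calculation in Python. The crowd size of 14,000 comes from the figure cited above; the number of banned persons present, the sensitivity, and the per-face false positive rate are purely illustrative assumptions, not figures from Panasonic’s system.

    # Base-rate arithmetic: why matches in a large crowd are mostly false alarms.
    attendees = 14_000            # crowd size on match day (cited above)
    banned_present = 2            # assumption: banned persons actually in the crowd
    sensitivity = 0.80            # assumption: chance a banned person is flagged
    false_positive_rate = 0.002   # assumption: chance an ordinary fan is flagged

    true_matches = banned_present * sensitivity                         # ~1.6
    false_matches = (attendees - banned_present) * false_positive_rate  # ~28.0

    false_share = false_matches / (true_matches + false_matches)
    print(f"Share of matches that are false positives: {false_share:.0%}")  # ~95%

With these assumptions, roughly 95% of all ‘matches’ are false positives even though only 0.2% of innocent faces trigger an alert: when almost everyone in the crowd is not on the watchlist, false alarms dominate. This base-rate effect is what lies behind figures like the one Big Brother Watch reported.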

There is little doubt about the potential of FRT and other AI-based technologies, and a future where FRT is widespread seems inevitable. For now, it’s critical to understand that there are no right or wrong answers here. When it comes to personal data processing, the conversation between data controllers and legal authorities is paramount. Given FRT’s current immaturity, there is likely a years-long road ahead of figuring out how to use this technology effectively from a legal standpoint.

About the Author

Andrey Koptelov

Andrey Koptelov is an Innovation Analyst at Itransition, a custom software development company headquartered in Denver. With profound experience in IT, he writes about new disruptive technologies and innovations in artificial intelligence and machine learning.
