RegTech Intelligence


EU rules governing artificial intelligence will put compliance obligations on facial recognition RegTech

Rachel Wolcott, Regulatory Intelligence

New European Union rules governing artificial intelligence (AI) will put compliance obligations on automated facial recognition (AFR) used in some regtech applications, particularly client risk screening. UK data privacy and biometrics regulators are also seeking to improve guidance on employee monitoring and surveillance camera operation, to clarify compliance obligations under local data privacy laws. These efforts, in conjunction with existing data privacy laws, could prompt firms to scale back or even abandon facial recognition technology in compliance and workplace applications.

Financial institutions use facial recognition in compliance applications including anti-money laundering (AML) and know-your-customer (KYC) risk screening, communications recordkeeping and video conferencing surveillance, as well as employee monitoring.

These legislative and regulatory efforts will not result in a total ban on facial recognition, but some regulators would like to see private and commercial AFR use banned or conducted under licence only. The technology is liable to serious misuse and has the potential for societal detriment, said Fraser Sampson, the UK’s surveillance camera commissioner.

“There are potentially some developments in this area which are so ethically fraught that, instead of the system waiting for something to be done and then trying to show that it was wrong, you do it the other way and say ‘no, the only circumstances under which you will be able to do this lawfully is if you are licensed to do so’,” Sampson told Regulatory Intelligence.

The AI Act

The European Union’s draft Artificial Intelligence Act (AI Act) will ban most AFR use by law enforcement and will deem regtech applications such as KYC/AML risk screening to be “biometric identification and categorisation of natural persons” products, and therefore high risk. That means these AI-powered compliance tools will themselves be subject to a range of compliance obligations: data and data governance, transparency for users, human oversight, accuracy, robustness and cyber security, as well as traceability and auditability.

Financial institutions use facial recognition for identity verification in client onboarding KYC processes. This use would generally be deemed “limited risk” if an onboarding system checks whether a customer’s submitted picture matches the one used on the identification document (such as a passport photo) they supplied. There would still be some transparency requirements and disclosure obligations under the AI Act.

“With regards to the draft AI Act, there are different levels of risks. There are systems that are restricted, for example, real-time biometric identification, which is prohibited. Then you have the high-risk systems, which would be biometric identification. But if you’re a bank, and you’re conducting KYC processes with regards to verification, then that will not fall under high risk, which is why the banks will stay away from identification. AFR in the banks … is not going to be coming any time soon,” said Jimmy Orucevic, a privacy professional in KPMG’s cyber information management and compliance practice in Zurich.

Some AML/KYC vendors are marketing facial recognition systems which check customers against a private database of “unnamed people of interest” as an alternative to name-based screening. Vendors say this technology allows firms to identify emerging threats posed by unnamed persons of interest not covered by traditional watchlists. This application looks like identification, not verification, which would land it in the high-risk category.

“I don’t see how banks, especially in Switzerland, would ever touch a tool like that, but I haven’t seen it yet. Even under the [General Data Protection Regulation] or under the Swiss Data Protection Federal Act, there will be a lot of obligations coming with implementing such a tool,” Orucevic said.

Members of the European Parliament this week asked for a permanent ban on the automated recognition of individuals in public spaces, noting that citizens should only be monitored when suspected of a crime. MEPs said the “use of private facial recognition databases (such as the Clearview AI system) and predictive policing based on behavioural data” should be forbidden.

Commercial use of such private facial recognition databases could come under further scrutiny. MEPs will consider high-risk categories, types of technology, and their applications as they prepare their position on the AI Act, a European Parliament spokesperson for civil liberties, justice and home affairs told Regulatory Intelligence.

AFR for MiFID recordkeeping


Some firms now deploy Markets in Financial Instruments Directive (MiFID II) recordkeeping tools that capture and store video with the ability to use AI and AFR to search for misconduct. Vendors are also marketing security systems that use AFR to monitor employees’ on-camera conduct and content-sharing, and intervene in real time. This kind of application will likely be designated high-risk once the AI Act is passed.

“The biggest problem is that under the MiFID regulations you need to record every call. That’s why a lot of major banks don’t allow video calls. If you do allow videos, then you need some kind of artificial intelligence to sort the irrelevant videos from the relevant ones. The question is, what are you using there to look for: voice or a picture? If this is using artificial intelligence or facial recognition, you must be doing it right, otherwise you will run into problems with the law,” said Alberto Job, director, information management and compliance, at KPMG in Zurich.

Behavioural monitoring and surveillance aimed at measuring productivity — such as keystroke tracking — is already forbidden in Europe, Job said.

The AI Act is at the beginning of the legislative process, meaning it could be five years before regulations apply.

Control, not compliance

The use of employee surveillance technology, including AFR, grew during the pandemic, with many financial institutions insisting employees working from home were “on camera” throughout their working day. Firms cited compliance and productivity as reasons for heightened surveillance.

Firms are using surveillance technology to control employees working from home, rather than for compliance purposes, said J.S. Nelson, a visiting associate professor at Harvard Business School who specialises in management practices, compliance and surveillance.

“What’s going on is, it’s not just compliance. Compliance I think of as enforcing legal standards. It’s a baseline. What you have here is compliance as a cover for these arguments about the productivity and efficiency of workers. It’s a management argument. That’s what’s really going on. That’s that issue of control. That issue of ‘I feel out of control’, because the very privileged few who are able to work from home, they’ve disappeared from the office,” she said.

Surveillance programmes appeal to managers’ sense of control and being on top of what is happening with employees and their productivity, Nelson said.

UK ICO to enhance employee surveillance guidance

The UK Information Commissioner’s Office (ICO) is reviewing its surveillance guidance after workplace monitoring became more widespread during the pandemic, used not only to check for employee illness but, primarily, to watch employees working from home.

“There’s no way to separate your work life from your personal life, especially when the workplace is your home and you’re being surveilled constantly. It’s really insidious. You have all this pushing of boundaries — which is the place in which you’re surveilled, the times at which you’re surveilled, and the purposes for which you’re surveilled. We have this capacity to get all this data and save it all up, but we haven’t had the conversation that we really need to have about why this stuff is collected, for what purpose, and what the limits of it should be. Then there’s the question of how it’s experienced, because it’s changing the nature of work and changing our experience of the workplace itself,” Nelson said.

The ICO already has some guidance on surveillance camera use, saying employees may not always expect to be monitored via video surveillance systems in their day-to-day roles. Employers should consider any less privacy-intrusive ways to achieve the same result and assess the monitoring’s necessity and proportionality.

Many financial services firms view surveillance of all kinds as part of the terms of employment. “Work here and submit to surveillance for compliance purposes” is the common explanation. That changes in the home environment when AFR is used.

“There are two broad approaches you can take whereby you say everything is in scope except for those areas that you argue ought not to be. Or you can begin by saying, ‘no this is enormously intrusive and therefore the starting point is you can’t do any of this unless each aspect has a compelling and cogent reason behind it’. The approach you take is determined by the imbalance of power. If I’m an employee, how much influence do I have on which things I will consent to readily or at all?” Sampson said.

UK surveillance camera code

The UK government is consulting on new guidance for surveillance camera operation. It will apply to surveillance cameras used by public bodies and law enforcement, including AFR. Nevertheless, private and commercial AFR users should take note to avoid data privacy breaches, and should follow data minimisation principles.

Cameras built into a work laptop and connected to an employer-monitored system are surveillance cameras, Sampson said. Sometimes these cameras link to facial recognition software that requires employees to log in continuously to confirm they are at their desks. Firms operating surveillance camera systems inside employees’ homes risk capturing images of employees’ children, for example, which would constitute a data privacy breach.


Last year the Court of Appeal, in Bridges v South Wales Police, found South Wales Police’s use of facial recognition technology breached privacy rights, data protection laws and equality laws.

There is a read-across for private or commercial AFR users conducting surveillance including employee monitoring, Sampson said.

“Looking at Bridges as a lawyer, my view remains that it is by and large a data protection principles case that emerged in the context of a surveillance camera. Very little of what took place in terms of the argued points in Bridges was solely related to surveillance camera use by the police. Those points were and are applicable to most other settings and are of general application. In the same way, some of the other employee monitoring cases that have come from Strasbourg apply the principles and then say, within the setting of an employment arrangement, this is how they should be interpreted. But there are still general central tenets of people’s rights when they live in a mature and tolerant democracy,” he said.

This article was originally published by Thomson Reuters Regulatory Intelligence on 8 October 2021

