By PJ Di Giammarino, CEO JWG and Chair RegTech Council
- Regulators are being hampered in their risk oversight duties by poor data quality, and over £100m in fines was issued in 2019 for poor reporting
- EU and UK regulators are out in front of global efforts to correct the rocky start on the road to a transparent market
- The FS sector should be elevating the debate about how to value data as capital in order to obtain the resources to manage it properly now
- You are invited to discuss strategies for moving forwards with the BoE, FCA and European Commission on 7 Feb in London
In part one of this analysis we look at the state of regulatory reporting and how RegTech will help resolve a difficult data quality challenge.
A rocky start in Pittsburgh
In September 2009 the G20 leaders’ statement committed global regulators to very complex rules within a very short space of time, without the benefit of new technology.
At the time, JWG commented: “Despite calls from industry trade associations for the G20 to conduct a cumulative impact assessment before agreeing a path forward, the world’s politicians have asked the newly minted Financial Stability Board (FSB) to get on with a regulatory ‘race to the top’, which pits rule makers against each other and the banks, in a bid to build a macro view of systemic risk.”
We called for detailed technical policies and procedure manuals to help the industry create a holistic picture of a market, counterparty, customer or instrument to meet its systemic risk objectives. Having already spent a few years helping the industry interpret regulations like MiFID I and AMLD III, we knew that while this exercise sounded like a technology problem, it was actually an enormous legal, commercial and policy challenge.
We predicted that “The G20 has just set a real data trap for 2011 – who will be to blame when we fail to hit the deadlines with controls that meet the political objectives? Likewise, when the true cost to the real economy of these initiatives is finally revealed, how is the taxpayer going to react?”
Correcting the course across the channel and pond
So what happened by 2011? Not much! As ever, we were a bit ahead of the curve. However, in the past two years we have helped with root cause analysis, cost/benefit models, technical proofs of concept and devising new methods for collaboration.
Two big changes have helped over the past couple of years: 1) we have started to see some serious fines for poor reporting data and 2) regulators and firms have got serious about Digital Regulatory Reporting (DRR) RegTech.
In the UK alone, we saw over £100m in fines last year. The PRA and FCA have dialled up the pressure on data quality. UBS and Goldman Sachs were fined a total of £61.9m in Q1 2019 for data quality issues, while Citi was fined £44m for inadequate systems and controls around reporting, inadequate staffing levels, poor governance and an insufficiently robust approach, which resulted in errors in the bank’s regulatory capital and liquidity returns.
The second big shift has been prompted by a tremendous body of work examining how RegTech can address very detailed information requirements for ambiguous concepts that carry slightly different meanings for each segment of the market.
In Europe, the European Commission’s DG FISMA has published a Fitness Check, the result of more than two years of top-to-bottom review of reporting, and concluded that a common database of granular data, and a repository of interpretations of reporting rules in a format readable by computers, need to be considered. It also noted that post-crisis legislation did not systematically set out the objectives of reporting or assess its impacts. Thankfully, it also concluded that there is scope to improve the consultation process and the level of collaboration with SMEs from the firms, and to allow reasonable timescales.
In December 2019 the Expert Group established by the European Commission in accordance with its 2018 FinTech Action Plan (ROFIEG) released 30 recommendations. Many dealt with reporting issues, including the need for standardisation of legal terminology and classification of actors, services, products and processes; development of human- and machine-readable legal and regulatory language; and a Regulatory Clearing House to centralise the automated dissemination of rules to regulated entities, the collection of information from those entities and the collection of market data.
Perhaps emboldened by the Commission, ESMA (10/19), the AMF (12/19) and the EBA (01/20) have each recently published opinions noting that there is significant room for improvement in data quality. ESMA has also recently announced its vision to become the data hub for EU markets.
The Bank of England’s 01/20 discussion paper, “Transforming data collection from the UK financial sector”, and the “Digital Regulatory Reporting Phase 2 Viability Assessment”, jointly published with the Financial Conduct Authority and participating banks, said the business case for digital reporting and transformed data collection is incomplete and that a clearer picture of the benefits is required. However, the FCA’s data strategy, published on the same day, clearly indicates that DRR is part of the plan moving forwards.
In summary, according to analysis by Thomson Reuters, UK regulators will push ahead with data collection transformation and DRR regardless. Both papers set out next steps, consultations and continued piloting. This engagement puts the UK alongside its international peers including the U.S. Consumer Financial Protection Bureau (CFPB) and Commodity Futures Trading Commission (CFTC), the Monetary Authority of Singapore (MAS), the Hong Kong Monetary Authority (HKMA), the Japan Financial Services Agency (JFSA), and the Philippines Central Bank (BSP).
Preparing the path forward: framing the business case
All these policy efforts, taken together, emphasise the need for a holistic transformation strategy, roadmap and governance arrangements. As with any planned voyage, it now behoves all actors in FS to make sure we know the route of travel and whether we can fund the crossing.
This is not a small issue: information on the market and our prudential returns forms the bedrock of risk management in the industry. Millions of data points need to be better aligned, but changing the way the data is exchanged will cost real money. The cost of 100% data quality is infinite and, as anyone involved in BCBS 239 programmes knows, pursuing it consumes tremendous management overhead. Where is the incentive to commit the resources required to improve data quality?
We have long held that the answer lies beyond the fines. It is not enough to rely on the stick to enforce the right behaviours. What is missing from the current regulatory framework is a new incentive mechanism for firms: a mechanism for evaluating the computational integrity of the data.
We believe the time is right to elevate the debate about how to value ‘data as capital.’ If tens of thousands of actors are to work together, we need a framework to value the information exchange that will lead to the assignment of the appropriate resources to provide high quality information.
Joining in the debate
If you are dealing with reporting or data issues, please let us know if you’d like to register for our exciting programme of events and projects this year.
On 7 February at our RegTech 2.0 conference in London we will have a rich debate between public and private sectors about how to win in the decade ahead.
If you would like to join the more than 50 regulatory, banking and technology institutions already registered, including the Bank of England, European Commission and FCA, please register here: http://regtechconference.co.uk/ or contact Corrina.firstname.lastname@example.org. Entry is free for regulators and people working at financial institutions.