The push for increased transparency following the financial crisis has had a visible impact on the financial services industry. Many regulations have created similar, but slightly different, requirements, particularly across the Atlantic. Increased – but uncoordinated – demand for data and proof of process, in different formats and languages and without proper impact assessments conducted beforehand, has become one of the biggest costs for both regulators and the regulated. With more of this to come, good data management is now essential.
In parallel, advances in big data technologies can be used to support the improvement of data management. One such major development is in-memory data storage – readily available data that supports live queries and advanced analytics to provide the regulator or the regulated with what they need on demand. But if these new technologies are to manage new regulatory requirements, we must first make this data easy to exchange across multiple systems … and for that we need standards.
Data protection
The EU’s General Data Protection Regulation (GDPR) marks an essential move towards the free flow of personal data that bolsters efforts for global data standards, aiming to provide a single set of data protection rules for all 28 EU Member States when processing citizens’ personal data.
But now, standing in the way is the added complexity of Britain's recent vote to leave the EU. Even though GDPR comes into force on 25 May 2018 (before Britain leaves the EU), the UK Information Commissioner's Office has announced that, if UK firms wish to trade with the EU as they did before Brexit, they will have to adhere to data protection standards as strict as GDPR for equivalence reasons.
The worry, of course, is that if the UK leaves the single market, it will lead to further disparities in data privacy standards. One lawyer has said that Brexit is a chance to have an equivalent GDPR, tweaked to be more practical and commercial, akin to a form of UK-EU privacy shield. The EU-US privacy shield is heralded symbolically as a step forward, yet the numerous disagreements that delayed its implementation remind us that this process is imperfect. Privacy is personal and cultural, and countries tend to go about it in their own way. With regulatory data demand set to increase exponentially, the industry cannot afford to be held back by traditional contractual mechanisms for much longer, and the free movement of data under harmonised privacy laws will soon be essential. If the UK were to stay in the EU, it would be able to influence GDPR in a way that kept everyone subject to the same rules. Though this is no longer the case, in this uncertain future it will be important that different regimes do not lose sight of the harmonised and collaborative goal.
Multijurisdictional data standards
A good global data management initiative requires standards for data privacy and also for regulatory data exchange. The industry needs a common reference model so that the right data fields are specified clearly for each regulation and the format for submission is consistent.
With regard to a common format for data submission, community initiatives such as the FIRE common data format are breaking new ground. FIRE defines a common specification for granular regulatory data and provides the technical documentation for all the fields required to undertake a specific business activity. The FIRE format runs hand in hand with HM Treasury's open data initiative and the establishment of an open banking format. While open banking works on the data itself, making it visible to the public, FIRE works on a format for data exchange between regulator and regulated.
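To make the idea of a granular, field-level specification concrete, here is a minimal sketch of how a firm might check a record against a list of required fields before submission. The field names and the record are invented for illustration and are not taken from the actual FIRE schemas:

```python
# Illustrative sketch of validating a granular regulatory data record.
# Field names below are hypothetical examples, not the real FIRE specification.

REQUIRED_FIELDS = {"id", "date", "currency", "balance"}

def missing_fields(record: dict) -> set:
    """Return the required fields absent from a granular data record."""
    return REQUIRED_FIELDS - record.keys()

loan_record = {
    "id": "loan_001",
    "date": "2017-06-30",
    "currency": "GBP",
    "balance": 150000,
}

print(missing_fields(loan_record))          # complete record: nothing missing
print(missing_fields({"id": "loan_002"}))   # incomplete record: gaps reported
```

The point of a published specification is exactly this kind of mechanical check: both regulator and regulated can validate the same file against the same field list before it is exchanged.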
As we have previously argued, the end goal of these cross-border initiatives would be a cloud-based common reference model for regulation that regulators, firms and technologists alike could use to identify obligations in an understandable, universal language. It would allow firms to know what information is needed by each regulation and maintain one golden copy of data for submission on request, eliminating the need for continual reconciliation. The model would specify a common format for the easy exchange of regulatory data, thus providing greater transparency and accessibility to the right data for those who need it.
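The "one golden copy" idea above can be sketched in a few lines: each regulation declares the fields it needs, and every submission is sliced from a single maintained record rather than reconciled from parallel copies. The regulation names and fields here are invented for illustration:

```python
# Hypothetical sketch of a common reference model: each regulation lists the
# fields it requires, and submissions are derived from one golden record.
# Regulation names and fields are illustrative, not a real specification.

REFERENCE_MODEL = {
    "reg_a": ["counterparty", "notional", "currency"],
    "reg_b": ["counterparty", "maturity_date"],
}

GOLDEN_RECORD = {
    "counterparty": "Bank X",
    "notional": 1_000_000,
    "currency": "EUR",
    "maturity_date": "2020-01-01",
}

def submission_for(regulation: str) -> dict:
    """Build a submission by selecting only the fields a regulation requires."""
    return {field: GOLDEN_RECORD[field] for field in REFERENCE_MODEL[regulation]}

print(submission_for("reg_b"))
```

Because every regime draws from the same record, there is nothing to reconcile: a field is corrected once, in one place, and every subsequent submission picks up the change.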
This one’s on the regulator
Many businesses are increasingly supplying solutions that focus on particular pieces of specific regulation, but this sits partly at odds with regulators' demand that firms take good care of their data themselves. There is a balance to be struck between bespoke solutions and in-house data management.
But regulators must play their part too if the drive for innovation and standards is to come to fruition. Indeed, there is still a lack of coordination between regulators when designing and implementing policies, creating inconsistencies further down the line. What is needed is greater leadership from the executive committees of the regulators, concentrated on building an intelligent policy hub for common industry data modelling and the formation of common standards.
We can already see signs of movement here. At least six regulators are increasingly looking to the use of technology in compliance, for example EIOPA and the EBA's publication of data point models containing all the technical specifications necessary for developing IT reporting solutions. Regulators are collaborating too in pioneering initiatives, such as the Frankfurt Group, which aims to coordinate reporting data fields under different regulations to produce a common dictionary.
To make this initiative truly global, we believe regulators should facilitate the establishment of an international taskforce to develop the common reference model and data language for regulation. The taskforce should take a similar approach to the FSB taskforce for climate-related financial disclosure, and first set out to compare all the existing disclosure regimes and decide how they can be harmonised.
Established standard languages such as FIBO (Financial Industry Business Ontology) will play a critical role in facilitating the development of a regulatory dictionary for the financial services industry, and translators will be needed for local and regional language differences in regulatory terminology. Data standards such as ISO 20022, used increasingly for reporting, will also be integral to the common language development. Working together, cross-border barriers can be removed and the technology developed so as to create a harmonised regime across all jurisdictions.
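The translator role described above can be illustrated with a toy lookup table: local terms from different regimes resolve to one canonical dictionary entry. The jurisdictions and term mappings below are invented for illustration and are not drawn from FIBO or ISO 20022:

```python
# Illustrative sketch of a regulatory-terminology translator: local terms
# from different regimes map onto one canonical dictionary term.
# All mappings here are invented examples.

CANONICAL_TERMS = {
    ("uk", "counterparty"): "counterparty",
    ("us", "contra-party"): "counterparty",
    ("de", "gegenpartei"): "counterparty",
}

def to_canonical(jurisdiction: str, term: str) -> str:
    """Translate a local regulatory term to the common-dictionary term.

    Unknown terms pass through unchanged rather than raising an error.
    """
    return CANONICAL_TERMS.get((jurisdiction.lower(), term.lower()), term)

print(to_canonical("US", "Contra-Party"))
```

In practice such a mapping would be maintained as part of the shared dictionary itself, so that every jurisdiction's reporting terminology resolves to the same underlying concept.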
RegTech Capital Markets Conference
Reducing the barriers to effective data standards is essential to any good data management initiative that harnesses new technologies. On 5 July, our panel* will bring experts together to discuss existing harmonisation efforts, the current barriers to multijurisdictional data standards, and the steps needed to establish them.
Panellists confirmed so far include Mike Bennett (Semantics Repository Lead, EDM Council), Howard Guess (Data Architect, Deutsche Bank), Arjun Singh-Muchelle (Senior Advisor, Capital Markets, The Investment Association), David Masters (Director – Head of Operations Regulatory Reporting Production, London, Société Générale) and Hugh Daly (CEO, Message Automation). Be sure to sign up, and read more on the agenda, here.
* Panel: Dismantling data standards barriers
- Why is there an increasing need for better coordination between regulators?
- What efforts to establish global initiatives for harmonisation currently exist?
- What would be the success criteria for a good global data management initiative?