Shining a light down the data management tunnel at FIMA Europe 2015

Getting your head round the main objectives of the current financial reform agenda is a task in itself, but when it comes to the data management requirements that follow hand in hand with it, it's all about the nuts and bolts. Whose responsibility is it to manage this data and keep its quality up? Is it possible to find some order in this regulatory storm – some standard-bearing requirement or model that holds it all together?

Senior data professionals came together for a VIP lunch on 10 November at the Financial Information Management conference (FIMA Europe 2015) to examine these issues. The lunch and conversation were organised by JWG Group, the think-tank leading the collaborative implementation dialogue between regulators and firms, and Trillium Software, a leading provider of data quality solutions for regulatory compliance. Together, the two organisations offered professional insight into the convergence of IT and financial reform. With JWG leading the discussion, a number of key points emerged from the dialogue.

Know thy Chief Data Officer

CDOs neatly bridge the gap between regulatory requirements and business requirements – they interpret both sides of the agenda and manage the data accordingly. Perhaps most importantly, they standardise definitions. The longstanding lack of communication between business departments and regulatory departments can be resolved by the CDO creating data models that are compatible across the business.

But this is a lot of responsibility. Time was taken at the event to debate where the line should be drawn between governance and operations in the CDO's role, and the conclusion was that this will always be specific to the firm in question.

How will legacy play into strategy?

The way firms' data structures have developed over time into what they are today is often overlooked. London is a sprawling metropolis built through historically unplanned expansion, and is thus far harder to navigate than its cousin across the Atlantic with the numerically ordered streets of New York. A firm's legacy and structural development follow a similar pattern, particularly in firms that have been through mergers and acquisitions and now know – all too well – that working different data management structures into their own can be a real headache. On top of the accumulation of new attributes under increasing regulation, it gets harder and harder to be proactive, so data management structures simply expand, under little planned supervision.

How do we solve that? Harness the CDO to spot the similarities and differences between a firm's disparate regulatory programmes – perhaps with the help of an external consultant – and map out and pursue the activities that remain profitable under the new, regulation-hit business model. This was the winning solution according to at least one attendee.

Working towards a master taxonomy

And, on that note, the path towards a standardised, unified data model is blocked by a fallen tree – senior executives. Yet again, though, the audience had an example of how to get around this. To win backing from senior executives for a unified model, one participant set up a sixteen-member executive steering panel around regulatory data projects. This helped to bridge the communication gap and drive standardisation further down the road.

Another participant described how his data department brings technical teams together with the business to standardise definitions, which are then fed into a model builder, ensuring the underlying data model stays consistent throughout. Once the definitions are compiled, it becomes evident just how many are related – the bank in question reduced the number of rules that had to be quoted against a newly launched product to just 1% of the total that actually applied to it.
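
To make this concrete, here is a minimal, hypothetical sketch of the idea – the article does not describe the bank's actual tooling, so all names and structures below are illustrative assumptions. Standardised definitions act as a shared vocabulary, rules are tagged with the definitions they reference, and grouping rules by those shared fields exposes how many of them are related and can be quoted once.

```python
# Hypothetical sketch: a shared data dictionary feeding a model builder.
# All names and rule IDs are illustrative, not taken from any bank's system.
from collections import defaultdict

# Business and technical teams agree a single definition per term.
definitions = {
    "counterparty_id": "Legal Entity Identifier (LEI) of the counterparty",
    "notional": "Notional amount in the reporting currency",
}

# Regulatory rules, each tagged with the standardised fields it references.
rules = [
    {"id": "RULE-001", "fields": ("counterparty_id", "notional")},
    {"id": "RULE-002", "fields": ("counterparty_id", "notional")},
    {"id": "RULE-003", "fields": ("counterparty_id",)},
]

# Group rules by the exact set of fields they use: rules in the same group
# are related, so only one representative needs quoting per group.
groups = defaultdict(list)
for rule in rules:
    groups[frozenset(rule["fields"])].append(rule["id"])

for fields, rule_ids in groups.items():
    print(sorted(fields), "->", rule_ids)
```

In this toy example three rules collapse into two groups; at scale, the same grouping is what can shrink the set of rules quoted against a new product so dramatically.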

And finally, should the onus be on the vendors or the regulators to standardise data?

With participants disagreeing over the role of vendors in standardising data to support the banks, eyes turned to the regulator. Yet it was pointed out that the regulatory storm is not over: further documents will come out, followed by further variables, so the scope for regulators to fully standardise and bridge the gaps between reforms as they are continually published is limited – even with prominent attempts, such as the LEI, to do so.

Whatever happens, individual banks are all unique, so standardised models must also be flexible. Standardisation can only go so far in the face of these issues, and PJ Di Giammarino, CEO of JWG, was keen to call this out: "there isn't a common rulebook. Everyone has to find a common source".

Overall, in a short time, the discussion covered a broad range of issues around finding the right fit for standardisation, responsibility and quality in the realm of data management. The conversation drew to a close with Dave Robinson, Managing Director of Trillium, reminding attendees of the complexity faced by data professionals and vendors, and the politics of what is and is not expected between them. Further dialogue is needed to home in on an organised solution for managing increased regulatory data, but a one-size-fits-all solution can only go so far.

To promote global dialogue on how to deliver regulatory change, JWG posts hundreds of focused articles a year to thousands of subscribers. Get involved and join the mailing list.
