RegTech Intelligence

Risk data strategy and rethink coming in 2013?

Managing, aggregating and maintaining risk data used to be a box-ticking exercise with easily achievable targets. In 2013, landmark new global requirements mean firms will face a big step up.

Over the past few years, regulation in the area of risk management information (MI) was fairly basic. In 2011, the FSA, like its US cousins, called for adequate policies, procedures and systems for risk supervision, with board-approved risk management policies. However, the Basel Committee did little to reinforce the global requirement for risk data. At best, it called for monitoring of firm-wide and individual entity risk by the board, along with “special attention” given to the quality, completeness and accuracy of the data.

The release of the Basel Committee’s draft principles on effective risk data aggregation and risk reporting in June 2012 ushered in a new era. In 2013, a ‘first wave’ of new global risk data ‘commandments’ from regulators will begin to reset the debate with onerous ownership and content requirements, demanding real resources and infrastructure changes. Governance figures at the top of the agenda, with firms needing a back-to-front data aggregation management strategy, granular in detail and vast in its requirements. This includes board-level accountability for risk data, such as management of data risks and risk mitigation measures, extending even across borders to outsourced “data-related” processes. As if that weren’t enough, boards must also “independently validate” this information and establish a hard and fast timeline for when risk data aggregation capabilities and risk reporting practices will be embedded in their own frameworks. Firms will also have to holistically scope the technical and legal limitations that could prevent full risk data aggregation.

Known unknowns

  • Will upgrading risk data standards force long overdue upgrades across firm infrastructure for tracking customer reference data?
  • How will stringent new global risk data aggregation principles affect firms’ data and IT infrastructures?
  • What form will potential penalties take for non-compliance?


Naturally, it isn’t just about ‘who’ owns the data, but about proving that a firm’s technical capabilities and operating model can automatically provide critical risk data at all times and in all stress scenarios. Integrated data taxonomies and singular identifiers for customers, accounts and counterparties, applied consistently throughout the firm, are all part of this tall order. The difficulty of this exercise becomes increasingly apparent as data is expected to be complete, timely and defined consistently through a “dictionary of concepts” across the firm. Clearly, this is not any old data aggregation policy of the past, and firms will need to step up and control their use of the disparate spreadsheets currently used to track the information.
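To make the “singular identifiers” point concrete, the sketch below shows, in a minimal and purely hypothetical Python form (all system names, identifiers and field names are invented for illustration), how a cross-reference table can resolve each source system’s local customer identifier to one canonical firm-wide identifier, so that exposures can be aggregated on an ‘apples to apples’ basis:

```python
# Hypothetical sketch of a firm-wide identifier cross-reference.
# Canonical reference data: one record per counterparty, firm-wide.
CANONICAL = {
    "C-001": {"name": "Acme Holdings", "lei": "EXAMPLE-LEI"},
}

# Cross-reference table: (source system, local ID) -> canonical ID.
XREF = {
    ("trading", "ACME-7"): "C-001",
    ("lending", "00042"): "C-001",
}

def canonical_id(system: str, local_id: str) -> str:
    """Resolve a source-system identifier to the firm-wide canonical ID."""
    try:
        return XREF[(system, local_id)]
    except KeyError:
        # An unmapped identifier is itself a data quality finding.
        raise KeyError(f"Unmapped identifier {local_id!r} in {system!r}")

def aggregate_exposure(records):
    """Sum exposures per canonical counterparty across all source systems."""
    totals = {}
    for rec in records:
        cid = canonical_id(rec["system"], rec["local_id"])
        totals[cid] = totals.get(cid, 0.0) + rec["exposure"]
    return totals

exposures = [
    {"system": "trading", "local_id": "ACME-7", "exposure": 1_500_000.0},
    {"system": "lending", "local_id": "00042", "exposure": 2_000_000.0},
]
print(aggregate_exposure(exposures))  # {'C-001': 3500000.0}
```

The design point is simply that aggregation is only as good as the mapping layer: without a maintained cross-reference, the same counterparty appears as two unrelated exposures.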

When risk data is placed in a larger, holistic reference data conversation, it’s clear this is not going to be an easy task. Reference data forms the core of the risk data discussion, due to its foundational role in risk, finance, business and HR data. In this ‘first wave’, firms will need to be able to reference common identifiers across the institution in order to fully examine them and have an ‘apples to apples’ understanding of the issues. To do this, firms will need to define how their risk information is managed within the context of the entire firm. Firms typically suffer from having little overarching strategy for how all of this information fits together: How does risk data align with finance data? How is HR information or commercial data involved? What does the firm-wide view of data look like from the centre of all of it?

The scope and breadth of the changes required, from establishing singular customer reference data identifiers across a firm to building a composite view of the reference data underlying risk, mean that conversations about data quality and completeness need to be on the agenda for 2013. Firms will need to start thinking about just how well they can answer hard questions about whether they can aggregate and report their risk data to the standards outlined by the BCBS today, and how they are going to step up to these changes tomorrow.
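The “hard questions” about quality and completeness can be framed as measurable checks a self-assessment might run. Below is a minimal, hypothetical Python sketch; the mandatory fields and the one-day staleness threshold are illustrative assumptions of ours, not requirements drawn from the BCBS text:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical self-assessment checks: each risk record should be
# complete (no missing mandatory fields) and timely (recently refreshed).
MANDATORY_FIELDS = ("counterparty_id", "exposure", "as_of")
MAX_STALENESS = timedelta(days=1)  # illustrative threshold

def assess(records, now=None):
    """Count complete and timely records against the total."""
    now = now or datetime.now(timezone.utc)
    complete = timely = 0
    for rec in records:
        if all(rec.get(f) is not None for f in MANDATORY_FIELDS):
            complete += 1
        if rec.get("as_of") and now - rec["as_of"] <= MAX_STALENESS:
            timely += 1
    return {"total": len(records), "complete": complete, "timely": timely}

now = datetime(2013, 1, 2, tzinfo=timezone.utc)
records = [
    {"counterparty_id": "C-001", "exposure": 1e6,
     "as_of": datetime(2013, 1, 1, 12, tzinfo=timezone.utc)},
    {"counterparty_id": None, "exposure": 2e6,           # incomplete
     "as_of": datetime(2012, 12, 1, tzinfo=timezone.utc)},  # stale
]
print(assess(records, now))  # {'total': 2, 'complete': 1, 'timely': 1}
```

Simple metrics like these give a board something concrete to review and trend over time, rather than a yes/no assertion of compliance.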

Penalties for failing to aggregate and report adequately will, predictably, not be pretty. For example, a future ring-fenced institution in the UK under the Vickers Report that fails to satisfy a regulator’s view of adequate “resolvability” with its current risk data aggregation infrastructure and capabilities might face additional capital requirements. Furthermore, a lacklustre ability to produce unified, high quality reference data might even lead to higher liquidity buffers under current FSA liquidity rules.

This is going to be a long, hard grind. Firms should not be lulled into a false sense of security by the Basel Committee’s 2016 deadline. With self-assessments to begin in early 2013, the board and senior management can anticipate difficult discussions with their national regulators.

Themes

  • Firms need to have serious conversations about their risk data strategy and target operating models with their regulators in 2013
  • Firms will need to completely rethink how their current risk data operating models govern, define, aggregate, report and track the quality of risk data
  • Risk data is only one part of the overall reference data picture and will need to be put into a larger ‘data landscape’ before real cross-firm value can be obtained.



