The Fundamental Review of the Trading Book (FRTB) final text was published in January 2016 by the Basel Committee on Banking Supervision (BCBS). The aims of the FRTB standards are to factor market risk more effectively into trading book risk models, and to prevent banks from moving instruments between the trading book and the banking book in order to reduce capital requirements. The final implementation date for banks will be January 2019. Whilst that is (at the time of writing) 32 months away, considerable work will be required of banks in order to comply with the high data standards imposed by FRTB. Such efforts will put pressure on banks, especially when you consider the vastness of what is required of firms under MiFID II in 2018, MAR only one month away and AMLD IV implementation in May 2017.
At the same time as the FRTB publication, new data aggregation standards were introduced under BCBS239. These require banks to put in place strategies to meet principles on data governance, data aggregation and reporting. Banks will need to review their IT infrastructure, governance arrangements, data aggregation policies and procedures, and the quality of the data they gather. The standards adopt a principles-based approach, leaving firms frustrated by the lack of guidance on what constitutes acceptable compliance.
As part of setting the landscape for what is to come under FRTB from a data perspective, JWG’s Customer Data Management Group recently met to discuss risk data regulation (BCBS239, FRTB and Capital Requirements Regulation). What was clear from the session is that much work is required in order to meet the requirements but, as outlined below, BCBS239 compliance will leverage data standards and assist in FRTB compliance.
FRTB sets data standards
FRTB will require banks to gather data for trading desks, and use that data to support and analyse the risk factors that feed into the risk models for those desks and their financial instruments. The quality of the data (its completeness, integrity and accuracy) will determine whether the risk factors are modelable. If they are, the firm can adopt an internal model, which will typically produce lower measures of risk and, therefore, a smaller capital requirement to safeguard against losses. If there is not sufficient data, the risk factors will be deemed non-modelable and firms will have to fall back on the standardised approach developed by the BCBS, resulting in higher levels of capital being set aside.
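To illustrate what modelability means in practice: the final FRTB text requires a risk factor to have at least 24 "real" price observations over the preceding year, with no more than one month between two consecutive observations. A minimal sketch of such a check follows (the function name and the 31-day approximation of "one month" are illustrative assumptions, not taken from the standard's implementation guidance):

```python
from datetime import date, timedelta

def is_modelable(observation_dates):
    """Sketch of the FRTB real-price observability test: at least 24
    observations in the window, with no gap longer than one month
    (approximated here as 31 days) between consecutive observations."""
    dates = sorted(observation_dates)
    if len(dates) < 24:
        return False
    for prev, curr in zip(dates, dates[1:]):
        if (curr - prev).days > 31:
            return False
    return True
```

A desk-level data quality programme would run a check of this kind per risk factor, which is precisely why the completeness and integrity of the underlying price history matter so much.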
Firms will have to achieve a high level of data quality, accuracy, completeness and integrity if they are going to be able to demonstrate to supervisory authorities that their risk factors are modelable. This will push firms to ask questions about their current data governance frameworks and their aggregation and validation capabilities. The data aggregation standards that firms must set to meet FRTB can serve as a starting point towards BCBS239 compliance.
FRTB versus BCBS239
BCBS239 lacks guidance on what competent authorities will deem acceptable compliance with the standards, and firms are struggling to understand what counts as good practice for when the regulator decides to undertake an investigation or ask questions.
Because of the stringent standards for data under FRTB, and the lack of guidance on quality under BCBS239, a cost-benefit approach could be adopted to determine which regulation to focus on. The idea is that, if a firm focuses on FRTB in order to establish its data standards, it could then roll those standards out to other programmes throughout the firm.
But there is a danger that, because FRTB focuses specifically on market risk, other areas of BCBS239, and therefore other areas of the firm, may be ignored.
Governance is the key
BCBS239 sets a high bar for data standards, with several areas potentially requiring considerable change, such as the aggregation and reporting of data and firms’ IT infrastructures, so firms may want to first look at their governance structure.
It may be argued that, by getting the governance structure in place and ensuring that appropriate policies, procedures, systems and controls exist, other challenges, such as aggregation and reporting, become easier because oversight of those processes already exists. Conversely, if firms do not get governance right, aggregation and reporting will be much harder – if not impossible – to achieve. Once a robust governance framework is in place, it should be much easier to set data standards across the firm as a whole and to manage any changes that are required.
Jurisdictional differences: fly in the ointment
Jurisdictional variances are a perennial issue when implementing regulation. Under FRTB, the definition of a trading desk is prescriptive and specific, contrasting with the definition of a trading desk under the Volcker Rule in the Dodd-Frank Act. When setting standards under FRTB, firms will therefore have to reconcile those standards with other regulations.
Firms will have to ask themselves how they want to approach variances – on a regulation-by-regulation, jurisdiction-by-jurisdiction basis, or by setting a ‘gold standard’ approach to act as an umbrella for all jurisdictions, branches and subsidiaries.
Such challenges highlight the need to set standardised definitions, enabling firms and impacted parties to efficiently and cost effectively achieve data regulation compliance.
FRTB additional challenges: infrastructure and data retrieval
FRTB will require firms to move from a Value at Risk (VaR) model, a statistical technique used to measure and quantify the level of financial risk within a firm or investment portfolio over a specific timeframe, to an expected shortfall model, which is more sensitive to the shape of the loss distribution in the tail. As a consequence, firms will have to source greater volumes of more granular data. Banks will have to run multiple full-revaluation scenarios in order to calculate expected shortfall, taking into account different liquidity horizons as well as constrained and unconstrained diversification configurations. This increase in the number of calculations is expected to place significant strain on IT infrastructures and their processing speeds.
Under FRTB, banks will have to retrieve, aggregate, cleanse and consolidate data going back to 2007. That data will enable banks to identify periods of extreme stress, such as the sovereign debt crisis, which will then feed into risk models and calculations. Challenges could arise from legacy systems, business issues resulting from past mergers and acquisitions, and the sheer volume of data that will have to be reconciled and validated. It should also be asked to what extent this approach is practical: since 2007, trading desks may have been restructured, due to either regulation or business strategy, so how appropriate is this exercise for testing those desks and their risk models today?
Raising the bar
FRTB adds to the raising of the bar for data standards under BCBS239. FRTB implementation will test firms’ current data standards as well as those they aim to meet under BCBS239 in the future. The incentive to ensure that FRTB standards are met – or exceeded – is clear: failure will mean higher capital being set aside.
It remains to be seen to what extent FRTB will act as a guide for the less prescriptive BCBS239, and the extent to which firms adopt such a strategy will come down to internal preference rather than a prescribed route to meeting the requirements.
What is certain is that risk, IT, operations and compliance managers will endure sleepless nights whilst they try to meet modelability requirements.