RegTech Intelligence

Regulating computer trading: the path to data standards?

The UK’s Foresight Commission report on high-frequency trading (HFT) has finally heard the industry’s call for clear, shared data standards across the financial system. However, it remains to be seen whether Europe – or the world – has the stomach to realise this vision.

After a series of dramatic computer trading glitches across the globe, most recently in Mumbai and New York, the world of HFT/AT is facing a time of regulatory reckoning.

As the most wide-ranging and incisive study of computer trading to date, the report offers an avowedly ambivalent view of the pros and cons of HFT. As expected, it supports the more immediate (and implementable) protections for automated trading, such as mandatory circuit breakers and minimum tick sizes. With MiFID II almost certain to call for both, and the CFTC due to issue a “concept release” on HFT regulation, the report is sure to shape the final rules in some way.

[accordion][pane title="Known unknowns"]

  • Are the imperatives for change strong enough to mandate meaningful standards?
  • Is Europe set to lead the globe in computer trading regulation?
  • Who will own and maintain computer trading data collection in Europe?


At its heart, the report draws some unexpected conclusions. Much of its core discussion leans towards systemic problems with data, calling on regulators to address the lack of “comprehensive and consistent” computer trading data across diverse trading platforms. In a surprising attack on the reality most practitioners face, the academics denounce today’s trading data as inaccessible, difficult to digest and hard to understand. A major hurdle for these recommendations will be standardising time stamps to create a ‘common gateway’ for market-wide analysis. Solving this, the report argues, should be tackled by a coalition of regulators and academics in an Office of Financial Research (OFR)-style European Financial Data Centre (EFDC), which could ensure the data is concise, comparable and complete.

Clearly, this is a complex, ground-breaking and immensely tall order. Nothing in the regulatory world, not even the cries for identifiers, has addressed data quality and compatibility with this breadth and depth. In the shadow of this report, the actions the industry takes on trade data issues will set the tone, yet the extent to which market participants are willing to commit resources and intellectual property to the exercise is far from clear. Current piecemeal approaches to data quality, divided by regulatory brick walls, are unlikely to cut it. The most recent iteration of the Markets in Financial Instruments Regulation (MiFIR) text shows that ESMA has already aligned with some of the Foresight report’s details – circuit breakers, for example. Additionally, the EC has assigned ESMA the arduous task of collecting large volumes of transaction reference data from trading venues while simultaneously ensuring standards for the “appropriateness” and “quality” of that data. Even with 59 additional staff for 2013 and a budget increase of €7.8 million, this will be no small challenge. Furthermore, given that the MiFIR focus is on transaction reporting to the regulator, the ‘big picture’ of how trade and transaction reports will be reconciled remains unclear.

To realise the ambition of this report, stakeholders in Europe would need to roll up their regulatory sleeves, as a fresh, holistic approach will be required to elevate data standardisation. The reality is that a small core of ESMA supervisors in Paris will not be capable of carrying out the Foresight Commission’s recommendations without active engagement – and hard work – from market participants and their data vendors. Europe has the opportunity to lead the globe in standardisation, but it needs an end-to-end view of the problem at hand. This means clearly identifying a single actor in the EU to ‘own’ the conversation on data quality, including what data is required across sectors and infrastructures and what consistent, high-quality data looks like.

A practical approach would see individual recommendations, such as the need for a standardised timestamp, used as test cases for a new model of market collaboration. If successful, the project would produce clear, mandatory standards serving a variety of purposes across the financial sector, from trading to market abuse detection, credit risk, mapping and systemic risk control. Most importantly, it could hand a top-notch deliverable to a newly minted EFDC to maintain and disseminate.
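The timestamp test case hints at what normalisation would involve in practice. As a minimal sketch – the venue conventions below are hypothetical illustrations, not any venue’s actual format – converting each feed’s native representation to a single UTC reference point is what makes reports from different platforms directly comparable:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical examples of divergent venue conventions: one venue reports
# epoch milliseconds, another an ISO-8601 string with a UTC offset, a third
# a naive local time whose offset must be known out-of-band.

def from_epoch_ms(ms: int) -> datetime:
    """Epoch milliseconds -> timezone-aware UTC datetime."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

def from_iso(ts: str) -> datetime:
    """ISO-8601 string carrying its own offset -> UTC datetime."""
    return datetime.fromisoformat(ts).astimezone(timezone.utc)

def from_naive_local(ts: str, utc_offset_hours: int) -> datetime:
    """Naive venue-local time plus a known offset -> UTC datetime."""
    local = datetime.fromisoformat(ts).replace(
        tzinfo=timezone(timedelta(hours=utc_offset_hours)))
    return local.astimezone(timezone.utc)

# Three reports of the same instant, normalised to one comparable value.
a = from_epoch_ms(1349359200500)                    # epoch milliseconds
b = from_iso("2012-10-04T16:00:00.500+02:00")       # ISO-8601 with offset
c = from_naive_local("2012-10-04T15:00:00.500", 1)  # local time, UTC+1

assert a == b == c  # all three resolve to 2012-10-04 14:00:00.5 UTC
```

The hard part, of course, is not the conversion but agreeing who mandates the reference clock, the required precision and the canonical format – which is precisely the governance gap an EFDC-style body would fill.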

Will we see Europe push forward from theory to best practice? Who knows? The EU should relish the opportunity to leverage the report and establish its leadership in defining the landscape for data standardisation, not only for HFT but across the globe. At the end of the day, a number of more immediate regulatory concerns are likelier to win the precious resources required. However, just as the EU should be surveying the landscape, firms and their suppliers should be monitoring the political forces for change.


[pane title="Themes"]

  • The UK has helped define the debate via the Foresight Commission
  • With MiFID II nearly finalised, HFT regulation will soon be hitting the rulebooks in the EU
  • The call for a new European Financial Data Centre (EFDC) could gain momentum
[/pane][/accordion]



