MiFID II data quality and reconciliation: urgent preparation needed

September 7, 2016 - Editor
Category: MiFID II

It was back in early February when the European Commission confirmed the introduction of MiFID II would be delayed a year, to January 2018. At that time, I recall a general feeling of relief from most financial institutions facing a huge upheaval to be compliant with the new directive.

Now that we’re a few months on, surely this delay means that most firms are well into their MiFID II projects and feeling that warm glow of confidence in being well inside project timeframes? It appears not. A recent flurry of articles, industry analysis and commentary seems to indicate that all is not well in the preparation for the new trade reporting requirements.

Data quality is a key concern

While the majority of attention has been given to the core trade reporting requirements, the data and technology at the heart of the reporting process itself seem to have been largely overlooked. Reporting is the mechanism, but if the data that drives it is corrupt and two parties are unable to reconcile that information, then we are looking at a potential crisis.

A recent OTC trade reporting survey by Catena Technologies (data used with permission) found that 80% of respondents are extremely concerned about the quality of data being reported to regulators.

Over half cited the issue of understanding the trade reporting requirements as the biggest challenge they face, while 38% of respondents highlighted position reconciliation as the main problem.

In terms of automation, over 50% admitted to generating Universal Trade Identifiers (UTIs) using a manual process and more than 90% had yet to fully automate reconciliation processes across all asset classes.
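
To make the automation gap concrete, here is a minimal Python sketch of deterministic UTI generation, assuming the widely used convention of prefixing the generating party’s 20-character LEI to a normalised internal trade reference, capped at 52 alphanumeric characters. The function, the length limit and the sample values are illustrative assumptions, not anything mandated by a specific rulebook.

```python
import re

UTI_MAX_LEN = 52  # assumed market convention for maximum UTI length

def generate_uti(generator_lei: str, trade_ref: str) -> str:
    """Build a deterministic UTI: generating party's LEI + trade reference.

    Normalising both inputs to uppercase alphanumerics means the same
    trade always yields the same identifier -- the repeatability that a
    manual, spreadsheet-based process struggles to guarantee.
    """
    lei = generator_lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{20}", lei):
        raise ValueError("Generator LEI must be 20 alphanumeric characters")

    suffix = re.sub(r"[^A-Z0-9]", "", trade_ref.upper())
    if not suffix:
        raise ValueError("Trade reference contains no usable characters")

    uti = lei + suffix
    if len(uti) > UTI_MAX_LEN:
        raise ValueError(f"UTI exceeds {UTI_MAX_LEN} characters")
    return uti

# Illustrative values only (a sample LEI, a made-up trade reference):
# generate_uti("5493001KJTIIGC8Y1R12", "IRS-2016-000123")
# -> "5493001KJTIIGC8Y1R12IRS2016000123"
```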

Issues with ISINs and LEIs

Not only does there appear to be a lack of real understanding, clarity and automated data reconciliation; there is also confusion over agreed data standards.

At this year’s FIA International Derivatives Expo, a panel of experts warned about the dangers of underestimating data requirements under MiFID II and the need for more clarification on standardised data coding structures. Specifically, there is concern about the overreliance on ISIN and LEI structures as the commonly agreed data formats. 

“We need to look at the attributes of ISINs and LEIs and see what works well,” commented Rory McLaren, SVP Regulatory Services at Deutsche Börse. “It is important to remember that none of the current options were created specifically for MiFID II so ideally an alternative is needed […] It would be time-consuming but would lead to greater relevance in this key area.”
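
It is worth noting that both structures already carry built-in integrity checks, which is part of why the industry leans on them as common formats. As a rough illustration, the Python sketch below validates an ISIN’s Luhn check digit (ISO 6166) and an LEI’s ISO 7064 mod 97-10 check digits. This is purely a format check; it says nothing about whether an identifier is actually registered, and the sample values are illustrative.

```python
def is_valid_isin(isin: str) -> bool:
    """Check an ISIN's structure and Luhn check digit (ISO 6166)."""
    if (len(isin) != 12 or not isin[:2].isalpha()
            or not isin.isalnum() or not isin[11].isdigit()):
        return False
    # Expand letters to two digits (A=10 ... Z=35), then run Luhn.
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    total = 0
    for i, ch in enumerate(reversed(digits)):
        n = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            n = n * 2 - 9 if n > 4 else n * 2
        total += n
    return total % 10 == 0

def is_valid_lei(lei: str) -> bool:
    """Check an LEI's ISO 7064 mod 97-10 check digits (ISO 17442)."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    return int("".join(str(int(c, 36)) for c in lei.upper())) % 97 == 1

# print(is_valid_isin("US0378331005"))         # True (a well-known ISIN)
# print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # True (a sample LEI)
```

Of course, a syntactically valid identifier does not address the panel’s point: neither structure was designed to describe MiFID II instruments.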

In OTC markets, the migration to the newly formed organised trading facilities (OTFs) will increase pressure on these venues to make ISINs available to the relevant authorities and to ESMA itself.

The situation was summed up neatly by Rob Barnes, FIA’s Director of Regulation: “If firms have taken their foot off the gas, now is the time to step back on it and ensure that they are ready. Under MiFID II regulators are unlikely to be generous toward firms that miss their deadlines.”

So what lies behind this confusion and lack of standardisation?

Data silos and legacy systems

The vast majority of data that needs to drive the whole reporting process traditionally resides in multiple back office systems. Irrespective of asset class, the bulk of these systems are old, disparate and only connected to the rest of the firm by an eye-watering number of interfaces built up over many, many years.

Attempts by the industry to remove silos and look for a common cross-asset platform have failed in the main, while investment in post-trade technology has been low for the past 20 years.

Much of the budget spend is on costly upgrades, under the “run the bank” banner of ongoing maintenance for this ageing technology. It is ironic that some commentators blame this situation on the pre-2008 era of light-touch regulation that dominated the 1990s and early-to-mid 2000s.

To compound this, the state of industry reconciliation platforms is not much better. As in a number of other sectors, the competitive landscape broadly consists of a couple of legacy players fighting it out, tethering the industry to multiparty utility models. Just about as far away from “agile” as you can imagine.

Manual processes and workarounds

So two key areas of regulatory adherence reside in old and stretched platforms, typically built for one purpose and then overextended to keep pace with ever-changing requirements. In the reconciliation space this tends to result in spreadsheets and manual work being used to fill the inevitable gaps.

While a firm’s central reconciliation unit can be overstretched by a backlog of changes, upgrades and maintenance, the default stopgap is to employ people working with endless spreadsheet user-developed applications (UDAs). As headcount grows, this work is often outsourced to an offshore centre. The cost base grows in proportion to the lack of data controls.
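
By way of contrast with the spreadsheet approach, the sketch below shows the core of an automated pairing-and-matching run in Python: reports from the two sides are keyed on UTI and the economic fields are diffed. The field names are invented for illustration; real MiFIR reports carry dozens of fields.

```python
from collections import defaultdict

# Field names are invented for illustration only.
KEY = "uti"
MATCH_FIELDS = ("isin", "quantity", "price", "trade_date")

def reconcile(our_reports, their_reports):
    """Pair two sides' trade reports on UTI and diff the economic fields.

    Returns the UTIs missing from either side plus field-level breaks --
    the same checks a spreadsheet UDA performs by hand, but repeatable
    and auditable.
    """
    ours = {r[KEY]: r for r in our_reports}
    theirs = {r[KEY]: r for r in their_reports}

    breaks = defaultdict(list)
    for uti in ours.keys() & theirs.keys():
        for f in MATCH_FIELDS:
            if ours[uti].get(f) != theirs[uti].get(f):
                breaks[uti].append((f, ours[uti].get(f), theirs[uti].get(f)))

    return {
        "unmatched_ours": sorted(ours.keys() - theirs.keys()),
        "unmatched_theirs": sorted(theirs.keys() - ours.keys()),
        "breaks": dict(breaks),
    }
```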

In the old world of light-touch regulatory oversight, this legacy system/spreadsheet hybrid was able to survive, and it is still the norm for many firms large and small, buy and sell side, and even the trade repositories themselves. Up until around eight years ago, the biggest penalty handed to firms using this model would have been an audit black mark and a “must improve next year” admonishment.

Times change, however, and since the global financial crisis there have been an increasing number of substantial fines handed out to firms and, in the majority of cases, the root cause has been poor quality, unreconciled data. Presumably the advent of MiFID II will give regulators even more scope to take a hard line with non-compliance.

MiFID II: a compelling case for change

In many firms MiFID II is being tackled with furrowed brows, enormous budgets and huge teams working on lengthy projects. But this does not need to be the case.

The last few years have seen a boom in fintech companies, many of them addressing these legacy challenges. Central to the majority of these new entrants is the notion of agile projects, where timeframes and overall costs are cut by an order of magnitude.

Intelligent, secure, audited tools that can be used immediately by subject matter experts are the order of the day. Results are delivered in hours, not weeks, bypassing the traditional and lengthy reliance on central IT departments.

It is often said that there needs to be a significant event for someone to change the way they think and adopt a new approach. For the life of me I can’t imagine a more compelling event than January 2018 to ensure not only preparedness, but also the establishment of a solid and secure data reconciliation platform to drive the new trade reporting demands of MiFID II. The adage of “rubbish in, rubbish out” remains a very real danger.

Time is of the essence

So there’s now under 18 months to go. If we factor in six months to allow for multiple system UAT programmes, that leaves firms with less than a year to be compliant with MiFID II trade reporting demands.

From the statistics above, it seems we are staring down the barrel of a crisis. Given the overall health of banks in general, the last thing the sector needs is huge fines, both in terms of the financial penalties themselves and the damage to reputation and confidence.

The billions of dollars’ worth of investment in fintech has produced some real-world solutions to real-world problems. Technology booms can create hype, but we now see many proven solutions being used in the post-trade space.

With the deadline looming, rapid adoption is now crucial. Even the FCA is seriously considering a cloud-based solution for storage of the new data fields; a year ago, this would have been unheard of.

As the late British politician Tony Benn once said: “It’s the same each time with progress. First they ignore you, then they say you’re mad, then dangerous, then there’s a pause and then you can’t find anyone who disagrees with you.”


This article was first published in edition 7 of Rocket, our magazine. Rocket editions are available to download here, and you can save an up-to-date address in your profile to indicate your interest in receiving a printed copy of the magazine. Copies are also available to purchase and subscribe to via the shop.

To save your address into your profile:

  1. Visit the home page
  2. Click Account (in the middle of the row of black buttons)
  3. Click Edit Profile (in the row of buttons at the top)
  4. Click Reader (top right)
  5. There you can see your profile, with a box for your address – complete it accurately, and click Save
