Data Integrity is Solved by DLT, Right? Wrong!

June 17, 2019 - Editor
Category: Technology

In June 2019, The OTC Space met with Frank Glock from Gresham Technology, which recently announced a £1.2M contract with a leading CCP to provide improved reconciliation and control tools, to talk about data integrity, distributed ledgers and regulators. Frank’s views on new technology challenge the orthodoxy of “move everything to a ledger”. We covered data formats, systems integration, ledgers, central infrastructure and the unavoidable complexity of systems.

Bill: What are the underlying reasons why data integrity breaks within firms?

Frank: The fundamental connections between banks, asset managers, intermediaries and the wider supply chain, as well as the connections between systems within a firm, leave businesses at risk of breaks caused by data integrity issues. These include regulatory breaches, misreporting, corruption, fraud and errors. Because large institutions usually operate scores, if not hundreds, of systems and end-user controls within their businesses, as well as needing to connect to other third-party services and providers, it’s rare that a consistent, transparent view of their data is possible or that the necessary data integrity procedures are in place.

The volume, complexity and speed of data being processed by firms have exploded in recent years, which has undoubtedly disrupted business-as-usual in the financial sector. Multiple data sources, combined with complex, federated system environments and millions of algorithmic-trading transactions and confirmations, create a significant volume of unstructured and unwieldy data sets for a firm to stay on top of.

Bill: In the long term, what strategy should firms adopt to avoid data integrity problems?

Frank: A robust data integrity strategy demands we rise to a higher standard and reinvent the way data control and reconciliations are viewed within a bank or buy-side firm.

Of course, the traditional steps of a reconciliation model (harnessing the data, performing matching, identifying and resolving exceptions, compiling reports) are still present, and indeed necessary. But in today’s environment, simply running this process to a static end result is insufficient.

An effective strategy requires a platform that processes data more quickly and consistently, with connectivity to a wider array of systems, processes and data formats across streaming as well as persisted data stores. It needs to do this while caching underlying computational elements, ensuring data lineage, and providing built-in analytics to allow personnel to flexibly slice and dice the results.
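Purely as an illustration of those stages, and not a description of Clareti or any vendor’s product, the sketch below walks a pair of hypothetical trade feeds through the classic harness-match-except-report loop. The field names, tolerance and audit entries are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    trade_id: str
    notional: float
    currency: str

def reconcile(internal, external, tolerance=0.01):
    """Toy reconciliation: match by trade_id, then compare economics within a tolerance."""
    ext_by_id = {t.trade_id: t for t in external}
    exceptions, audit = [], []
    for t in internal:
        other = ext_by_id.pop(t.trade_id, None)
        if other is None:
            exceptions.append(("missing_in_external", t.trade_id))
        elif t.currency != other.currency or abs(t.notional - other.notional) > tolerance:
            exceptions.append(("field_break", t.trade_id))
        else:
            audit.append(("matched", t.trade_id))
    # Anything left on the external side was never booked internally.
    exceptions += [("missing_in_internal", t.trade_id) for t in ext_by_id.values()]
    return exceptions, audit

internal = [Trade("T1", 1_000_000.0, "USD"), Trade("T2", 250_000.0, "EUR")]
external = [Trade("T1", 1_000_000.0, "USD"), Trade("T3", 50_000.0, "GBP")]
breaks, audit = reconcile(internal, external)
print(breaks)  # [('missing_in_external', 'T2'), ('missing_in_internal', 'T3')]
print(audit)   # [('matched', 'T1')]
```

Even a toy version like this makes the point above concrete: the exception list and the audit trail only add value if they are produced continuously and fed back to people who can act on them, rather than compiled as a static end-of-process report.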

A platform needs to fit within the internal enterprise data management (EDM) frameworks many banks have now developed and help meet global data governance standards like the BCBS 239 Principles. Agility is absolutely key, as the demand to change controls quickly is now the operational norm. And any strategy must involve the exploitation of emerging technology such as automation, artificial intelligence and distributed ledgers.

Bill: What are the barriers to moving data around within corporate systems?

Frank: Vast volumes of data and the complexity of the formats are the primary challenges to achieving data control and integrity. Each system in a complex chain (of tens to hundreds) performs its own task but the business, and regulators, need to ensure that the operational data is controlled across the enterprise. This is extremely complex as most capital markets data architectures combine banking systems, bespoke and off-the-shelf trading systems, spreadsheets, data lakes, streaming transactions and so on.

Bill: What makes systems integration so hard?

Frank: The sheer volume of data housed within large organisations is the first challenge. This data can be extremely disparate in terms of where it comes from, its format, how it is recorded and stored, and what it is used for. Systems have also changed over the years, so organisations commonly have multiple data tracking processes, which may not be easily integrated.

The siloed approach within financial institutions also brings challenges. Each department, desk or process will often have developed unique legacy systems, with varying ways to quantify and classify data, all of which have a serious impact on a firm’s ability to integrate its data.

Bill: Which regulations require delivery of high-quality data?

Frank: High-quality data underpins many of the regulations in our industry. MiFID II, EMIR, the Senior Managers Regime, PSD2, IFRS 9, FRTB, CECL and others all require improved data management and integrity, along with changes to dedicated data subsets.

The BCBS 239 principles (“Principles for effective risk data aggregation and risk reporting”) were designed to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices in order to enhance risk management and decision making; they set a minimum standard.

Overall, regulators are insisting on more detailed, accurate and timely reporting. Transaction reporting is an obvious example with significant data implications, as firms are required to provide standardised data to external sources.

Bill: What is the view of regulators on data integrity?

Frank: The FCA has data integrity as a clear focus, as do regulators around the world. FCA Chief Executive Andrew Bailey has said that the major changes resulting from MiFID II will improve “market integrity in wholesale markets”. One consequence, he says, is the “strengthening of the transaction reporting…regimes” and that the FCA “will closely monitor how well firms are complying with these new requirements”.

MiFID II

The FCA intends to focus on market cleanliness, leveraging the more granular transaction reports that MiFID II requires. Another priority is to look closely at the operation of the primary and secondary markets, where transaction reporting data, if it is of high quality, will be useful in its analysis.

Market integrity

MiFID II’s expanded scope allows the FCA to use “indicators such as market cleanliness data and the number of suspicious transaction and order reports to assess the potential harm from market abuse which damages confidence and participation in the market.”

RegTech developments

The FCA is driving sandboxes and tech sprints to develop new technology solutions, such as advanced analytics and artificial intelligence, as part of the supervisory process to detect areas of non-compliance.

Bill: How do modern data formats like XML, ISO20022, FpML and FIX contribute to better data integrity?

Frank: Standards help, but many legacy systems are hard-wired into proprietary or outdated formats. Replacing existing systems is simply too large a cost and operational challenge, which is why we developed Clareti to sit above legacy software.
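To make the format problem concrete, here is a minimal, hypothetical sketch of normalising two representations of the same order, a FIX-style tag=value string and a simplified XML fragment, into one canonical record. The XML layout is invented for the example (real FpML and ISO 20022 schemas are far richer), and nothing here reflects how Clareti is built.

```python
import xml.etree.ElementTree as ET

# A FIX-style tag=value message ("|" stands in for the SOH delimiter used on the wire)
# and a simplified XML fragment describing the same order.
fix_msg = "55=VOD.L|54=1|38=1000|44=102.5"
xml_msg = ("<order><instrument>VOD.L</instrument><side>BUY</side>"
           "<quantity>1000</quantity><price>102.5</price></order>")

def from_fix(msg):
    # Standard FIX tags: 55 = Symbol, 54 = Side (1 = Buy), 38 = OrderQty, 44 = Price.
    fields = dict(pair.split("=", 1) for pair in msg.split("|"))
    return {
        "instrument": fields["55"],
        "side": "BUY" if fields["54"] == "1" else "SELL",
        "quantity": int(fields["38"]),
        "price": float(fields["44"]),
    }

def from_xml(msg):
    root = ET.fromstring(msg)
    return {
        "instrument": root.findtext("instrument"),
        "side": root.findtext("side"),
        "quantity": int(root.findtext("quantity")),
        "price": float(root.findtext("price")),
    }

# Same order, one canonical shape: this is the layer a control platform needs
# before any matching or integrity checking can happen.
assert from_fix(fix_msg) == from_xml(xml_msg)
```

The hard part in practice is not the parsing itself but maintaining hundreds of such mappings, with lineage, as the upstream systems change.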

Bill: How might a distributed ledger solve data integrity challenges?

Frank: It won’t.

Data integrity can be measured across many axes, but a common approach is to determine whether the data is timely, accurate, complete and consistent. A distributed ledger can certainly solve the consistency axis if all systems share the ledger, but it does close to nothing for the other axes.

Our research shows that data is failing integrity tests across each of these independent axes. An investment in a distributed ledger to solve the integrity problem is a poor investment.
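As a rough illustration of those four axes (the field names and thresholds are assumptions, nothing vendor-specific), the checks below are deliberately independent of one another; putting every system on a shared ledger would force only the consistency check to pass.

```python
from datetime import datetime, timedelta, timezone

REQUIRED = {"trade_id", "notional", "currency", "timestamp"}

def timely(record, now, max_age=timedelta(minutes=15)):
    """Timeliness: the record arrived within the expected window."""
    return now - record["timestamp"] <= max_age

def accurate(record):
    """Accuracy (toy rule): values fall within plausible bounds."""
    return record["notional"] > 0 and len(record["currency"]) == 3

def complete(record):
    """Completeness: every mandatory field is present and populated."""
    return REQUIRED.issubset(record) and all(record[f] is not None for f in REQUIRED)

def consistent(record, other):
    """Consistency: two systems report the same economics for the trade."""
    return (record["notional"], record["currency"]) == (other["notional"], other["currency"])

now = datetime.now(timezone.utc)
ours = {"trade_id": "T1", "notional": 1_000_000.0, "currency": "USD", "timestamp": now}
theirs = {"trade_id": "T1", "notional": 1_000_000.0, "currency": "USD", "timestamp": now}

# A shared ledger forces `consistent` to hold; the other three checks can still fail.
print(timely(ours, now), accurate(ours), complete(ours), consistent(ours, theirs))
```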

Bill: Do central platforms like the DTCC TIW or LCH SwapAgent eliminate data integrity issues by being a golden source?

Frank: No, for the reasons given above. At best, they will solve the consistency element.

Bill: What is the Clareti platform?

Frank: Gresham’s Clareti platform overcomes these challenges by bringing control to the most complex data processing environments. Clareti allows controlled automation to vastly speed up data flowing through complex systems by leveraging advanced, rule-based matching algorithms. Control is evidenced and explicit, enabling every step in the data flow to be fully audited by regulators, internal compliance and operational management. Exceptions are raised more quickly than with legacy systems and spreadsheets, allowing individuals to take immediate action to remedy them.

All of this is accessed via a visual dashboard used to identify trends, monitor data integrity, control operations and track risks.
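For readers unfamiliar with rule-based matching, the hypothetical sketch below shows the general idea: an ordered list of rules, from strict to looser, with the winning rule recorded as evidence for audit. It is not Gresham’s algorithm; the rules, fields and tolerance are invented for the example.

```python
# Rules are tried in order; the first one that pairs the item wins,
# and its name is kept as audit evidence for why the match was accepted.
RULES = [
    ("exact", lambda a, b: a["ref"] == b["ref"] and a["amount"] == b["amount"]),
    ("ref_and_tolerance", lambda a, b: a["ref"] == b["ref"] and abs(a["amount"] - b["amount"]) <= 0.05),
    ("amount_and_date", lambda a, b: a["amount"] == b["amount"] and a["date"] == b["date"]),
]

def match(item, candidates):
    """Return (candidate, rule_name) for the first rule that pairs the item, else (None, None)."""
    for name, rule in RULES:
        for cand in candidates:
            if rule(item, cand):
                return cand, name
    return None, None

ledger_a = {"ref": "INV-42", "amount": 100.00, "date": "2019-06-01"}
ledger_b = [{"ref": "INV-42", "amount": 100.03, "date": "2019-06-01"}]

matched, evidence = match(ledger_a, ledger_b)
print(evidence)  # "ref_and_tolerance": the audit trail shows which rule justified the match
```

Recording which rule fired is what turns automated matching into something a regulator or internal compliance team can actually audit.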

