New Studies on an Equities Distributed Ledger Address Volume Concerns

October 17, 2018 - Editor
Category: Technology

Exchange traded products such as futures and equities can show extreme volumes of transactions each day – could a distributed ledger support that level of throughput, given concerns over the complex consensus model for ledgers? Two new studies carry out detailed simulations to find the boundaries of ledger performance and point the way for future technology transformation in the capital markets.

In a traditional systems architecture, a relational database stores the resulting trades from an exchange. Firms like Oracle and IBM have had plenty of time to figure out how to service high transaction volumes without data loss. DTCC and other back-end processing firms such as CCPs are considering whether the activity of an exchange is better distributed across an equities ledger than handled by current methods.

The perception of ledgers is that adding transactions is slow, since a consensus process is needed for entities on the ledger to agree to each new transaction. On coin-based ledgers the update rate ranges from as low as one transaction per second to the dizzying heights of ten transactions per second, not sufficient to support a high volume exchange market. More background on the performance of many coin-based ledgers can be found here.

Two new studies have attempted to pave the way for migrating an exchange onto a ledger, one by DTCC and the other by GFT. Working with Digital Asset and R3, the studies created simulated market environments with entities representing the typical mix of an exchange, trading parties, and broker dealers. To give the DTCC study legitimacy, the team created a test environment with up to 170 nodes on the ledger network to represent the many parties involved, and used post-trade events to expand the scope of the business process.

Accenture and DTCC

In a 19-week study by Accenture with DTCC, using R3 and Digital Asset, the environment was designed to simulate the capture of matched equities trades from exchange DLT nodes, novation of those trades with DTCC acting as the central counterparty (CCP) to maintain trading anonymity on the ledger, creation of netted obligations, and settlement of the trades. The test environment for this study was set up in the cloud.

In the Accenture / DTCC study, the aim was to validate whether a ledger could support 100 million trades per day, the current peak daily volume in the US equity market supported by DTCC. The tests by Accenture with R3 and Digital Asset achieved a throughput of 6,300 trades per second for five continuous hours, giving a total volume of ~113 million transactions, well above the target level. DTCC noted that the study provided a starting point and only tested basic functionality. Additional work will be necessary for DTCC to determine if DLT can meet the resiliency, security, operational needs and regulatory requirements of its existing clearance and settlement system.
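
As a quick sanity check, the throughput figures quoted above can be verified with simple arithmetic:

```python
# Back-of-envelope check of the Accenture / DTCC figures quoted above.
TRADES_PER_SECOND = 6_300
TEST_HOURS = 5
DAILY_TARGET = 100_000_000  # peak daily US equity volume, per the study

total_trades = TRADES_PER_SECOND * 60 * 60 * TEST_HOURS
print(f"{total_trades:,} trades in {TEST_HOURS} hours")  # 113,400,000 trades in 5 hours
print(total_trades > DAILY_TARGET)  # True -- above the 100M/day target
```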

“This project answered key questions and built serious confidence in blockchain’s ability to drive large scale transformation,” said David Treat, Managing Director, Global Blockchain Lead, Accenture. “The close collaboration with the DTCC and our alliance partners, Digital Asset and R3, enables us to push DLT performance to new levels against real world requirements and conditions.”

“DTCC has been actively involved in DLT projects for over 3 years and during that time, we have seen technology platforms continue to mature, but concerns have loomed around the scalability of DLT,” stated Rob Palatnick, Managing Director of IT Architecture at DTCC. “This study is a natural next step in our efforts to advance the use of DLT, and we look forward to continuing to work collaboratively with the industry to identify new opportunities to use the technology to enhance the post-trade process.”

GFT and Digital Asset

In the GFT study with Digital Asset, they deliberately avoided short cuts such as injecting large numbers of simplistic trades or skirting data privacy, to keep the simulation realistic. The test network comprised several nodes and software components:

• Trade injectors, which simulate the input of brokers by sending trades to the exchange.
• Exchanges, which write these matched trades to the ledger.
• The clearinghouse, which coordinates all the clearing and settlement activity.
• Clearinghouse subordinate nodes, which confirm and novate the trades and perform a subset of the netting process. There is one dedicated clearinghouse node for each exchange.

The GFT / DA simulation included processes such as:

• Injecting trade confirmations from exchanges and registering them
• Novating the trades
• Netting them into DvPs
• Settlement
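
To illustrate the netting step above: multilateral netting collapses each party’s gross trades into a single delivery-versus-payment (DvP) obligation per security against the CCP. A minimal sketch with invented sample data, not a representation of the DA Platform’s actual API:

```python
from collections import defaultdict

# Each trade: (buyer, seller, security, quantity, cash). Hypothetical sample data.
trades = [
    ("BrokerA", "BrokerB", "XYZ", 100, 1_000.0),
    ("BrokerB", "BrokerA", "XYZ", 40, 410.0),
    ("BrokerA", "BrokerC", "XYZ", 25, 250.0),
]

# Net each party's position per security: positive shares = net receiver of stock.
positions = defaultdict(lambda: [0, 0.0])  # (party, security) -> [shares, cash]
for buyer, seller, sec, qty, cash in trades:
    positions[(buyer, sec)][0] += qty   # buyer receives shares...
    positions[(buyer, sec)][1] -= cash  # ...and pays cash
    positions[(seller, sec)][0] -= qty
    positions[(seller, sec)][1] += cash

# Each party ends with one net DvP obligation per security against the CCP.
for (party, sec), (shares, cash) in sorted(positions.items()):
    side = "deliver to" if shares < 0 else "receive from"
    print(f"{party} {side} CCP: {abs(shares)} {sec}, cash {cash:+.2f}")
```

Note the zero-sum property: the netted share and cash positions across all parties cancel out, which is what lets the CCP stand in the middle of every obligation.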

As well as measuring the throughput of trade registration, DA measured the time taken for the rest of the process: netting, novation, and settlement. At 27,000 transactions per second, it took 2 minutes to subsequently process netting and settlement across all participants. In total, 250 million trades were processed through the registration, novation, netting, and settlement transaction lifecycle.
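
Reading those figures together (the implied registration time is an inference from the quoted numbers, not a figure the study reported):

```python
# Figures quoted in the GFT / Digital Asset study.
REGISTRATION_TPS = 27_000
TOTAL_TRADES = 250_000_000
NETTING_SETTLEMENT_MINUTES = 2

# Inference: at the quoted rate, registering 250M trades takes roughly 2.6 hours.
reg_hours = TOTAL_TRADES / REGISTRATION_TPS / 3600
print(f"implied registration time: ~{reg_hours:.1f} hours")  # ~2.6 hours
print(f"netting + settlement: {NETTING_SETTLEMENT_MINUTES} minutes across all participants")
```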

Blythe Masters, CEO of Digital Asset, said: “GFT’s performance test demonstrates the throughput capacity of the DA Platform can meet the demands of major markets. We are delighted that GFT’s findings have validated that the highest standards of integrity and privacy do not sacrifice performance. We are very optimistic about what this outcome will mean for the industry at large.”

David Collins, Head of GFT’s Atlantic region, concluded: “GFT’s goal is to continually drive innovation throughout the financial services sector, and DLT is a key part of this strategy. We believe the rigor of the test scenario’s construction and the level of performance achieved demonstrate that the DA Platform can handle peak trade volumes seen in US equities markets and scale to satisfy the performance requirements of large-scale financial institutions. The results of our performance tests will generate new opportunities for our respective firms and we look forward to working closely with Digital Asset to accelerate the adoption of what we believe is a completely transformative capability for the business.”


Background on both studies can be found here:

• DTCC study here
• GFT study press release here
• GFT study explanation here

Why does this matter?

The capital markets provide an extreme test for ledger technology, especially with high volumes and a complex network of participants. Equities and futures are the highest volume environments on the planet; if a ledger can support those markets, it can support any other. The perception that ledgers are slow is reversed by these studies, although implementing this technology in production is yet to come.

Axoni and DTCC are in the late stages of testing their replacement for the DTCC Trade Information Warehouse. Axoni is providing the underlying ledger technology for the TIW, which processes all (or most) of the world’s credit default swap market. Digital Asset is working with the Australian Stock Exchange to replace the technology platform for equities. This study leads directly into that project, which has some time to run before going live. If the DTCC TIW and ASX Equities projects break the ground for real-world processing of CDS and equity markets, it opens the door to applying ledger solutions to other OTC markets and the potential for a revolution in how the back office handles these markets.
