Blockchain – Several more nails in the post-trade coffin
Although it seems like decades, the Blockchain saga is little less than two years old. The first (golden) period was typified by massive hype, some justified but some completely outrageous. After a short lull, during which a raft of Proofs of Concept (POCs) were announced, serious institutions, such as central banks and settlement agencies, started to take the claims/hype seriously and have begun to test the assertions made by Blockchain proponents against reality.
And it is not looking good for Blockchain, also known as distributed ledger technology (DLT), and in particular the supposedly stand-out use case of ‘post trade’ processing.
Recent Analyses of Blockchain
In the third quarter of 2017, a number of reports were produced by expert bodies that hammered a few nails into the coffin of some of the wilder claims made for the benefits of Blockchain. First, the bankers’ banker, the Bank for International Settlements (BIS), published a detailed analysis of ‘central bank crypto-currencies’ which concluded that, unless anonymity is a prerequisite (other than for money-laundering, of course), satisfactory solutions already exist. Around the same time, the European Central Bank produced a comprehensive analysis of the “potential impact of DLTs on securities post-trading”, hammering another nail into the Blockchain coffin, concluding that
“It is important to note that a number of elements of a theoretically DLT-enabled financial market have to be properly designed and put together before DLT adoption can be considered a realistic possibility in the securities settlement space”.
In short, these reports found that the technology behind DLT just isn’t there yet.
Nor are these experts alone. The reports, and more nails for the Blockchain coffin, keep coming.
Appendix A summarizes the results of Proofs of Concept (POCs) from SWIFT and the Bank of Canada's Project Jasper, and surveys by the Rand Corporation on behalf of the British Standards Institute (BSI) and a global benchmarking study by researchers at the University of Cambridge.
As documented in Appendix A, a lot of serious thought and money has gone into these projects and studies and they were based on genuine (and expensive) efforts to test the viability of Blockchain solutions. The results may be summarized as follows:
- With existing DLT systems[1], it is indeed possible to build complex proofs of concepts for real and important use-cases[2];
- However, these POCs prove to be useful only when traditional Blockchain constraints are relaxed, such as no central control[3] and the need to ‘partition’ DLTs for confidentiality purposes[4];
- The studies all concluded that DLT technology is immature and would not replace existing operational systems, not least because of the problems engendered by relaxing the constraints, which create new ‘data resilience’ problems;
- No tangible cost savings were found nor claimed by the studies, except for unquantified savings by reducing/eliminating ‘reconciliation’ costs elsewhere in the financial 'ecosystem';
- Not all participants in financial markets are equal, with major financial institutions having already solved all of the problems (and more) that are being addressed by Blockchain – to quote the SWIFT conclusion, “one size will not fit all”!
Collectively, and from different perspectives, these studies and POCs hammer several rather large nails into the Blockchain coffin.
Of course, anyone who has read Dracula knows that it is hard to keep a good cadaver down. And just when one thought that the hype was dead and buried, along comes Deloitte with a report that, although excellent in many ways, resurrects some very dodgy figures from a report by Goldman Sachs at the height of the “Peak of Inflated Expectations” hype cycle. This report claimed that “the banking sector could achieve a 10% headcount reduction and a 30% decrease in transaction monitoring with the use of Blockchain technology” and thus could produce overall operational savings across the industry of $2.5 billion (although 'how' was not explained).
Although no hard numbers were produced to back up these estimates, such savings are not to be sneezed at. So how would the savings be made?
“Employing a distributed database system like Blockchain across organizations can substantially reduce the need for manual reconciliation, thus driving considerable savings across organizations.”
So again, reconciliation is the magic bullet that will make Blockchain work?
Reconciliation Savings?
In the absence of any other identifiable cost savings, ALL of the studies documented fell back on the default meme that cost savings will come from reduced ‘reconciliation’ costs. This was summarized by Scott Hendry[5], in describing the Bank of Canada’s Jasper project, who noted that the “Proof of Work Ethereum system unlikely to be more cost effective than current system” but concluded
“Most cost savings unlikely to be in core system itself; most savings likely to come from bank reconciliation efforts no longer required”
So, it all comes down to this!
After thousands of white papers, dozens of proofs of concept and innumerable blog posts about DLT in financial services, all of the hype boils down to the belief that billions of dollars will be released across the industry by automating reconciliations and thus firing thousands of staff. In short, the argument is, “we cannot save any money but we are positive you can”.
Sorry to disappoint, but that is just another fairy tale.
First, anyone who has ever worked in actually automating banking processes will know that:
- In modern banking operations, the actual number of staff involved in reconciliation efforts is relatively small, usually a tiny highly-skilled group using sophisticated reconciliation/trade matching software to track down errors;
- Automation of reconciliation processes has been around since the 1980s[6], and a quick look at the excellent Bob’s Guide to financial technology, shows over 40 mature product offerings in this space; and
- If bank management believed that 10% of all staff costs could be cut by merely implementing a flat file database, it would have been done years ago.
The savings claimed are a fantasy, but what is more interesting is that the problem of reconciliation is not a database problem that is solvable by DLT, but is a data quality one.
Reconciliation is needed because humans are involved and humans make mistakes; its job is to identify those mistakes, whether made in the organization or by a customer/counterparty.
For example, imagine you are paying a utility bill online but enter an incorrect transaction reference (say, by transposing a few digits). In most modern operations, a correct bill payment will be ‘matched’ automatically with no human intervention, but your mistaken payment will typically pop up on a reconciliation clerk’s screen as ‘unmatched’.
However, with modern reconciliation/matching software[7], a small list of possible ‘hits[8]’ will usually be provided (on-screen), and a clerk will, nine times out of ten, hit the ‘OK’ button. In perhaps 1 in 100 or 1,000 cases, the clerk may have to delve deeper and query internal databases and/or start a conversation (online or by telephone) with a customer.
When resolved, automatically or manually, the records will be matched and updated, with commentary.
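To make the matching point concrete, here is a minimal Python sketch of how an engine might rank candidate ‘hits’ for an unmatched payment using the totality of the data. The records, field names and scoring are entirely illustrative; real matching products are vastly more sophisticated:

```python
# Hypothetical open invoices awaiting payment; fields are illustrative,
# not drawn from any real reconciliation product.
expected = [
    {"ref": "INV-20171031", "amount": 142.50, "payer": "J. Smith"},
    {"ref": "INV-20171102", "amount": 98.00,  "payer": "A. Jones"},
]

def candidate_matches(payment, open_items):
    """Rank open items against an unmatched payment by scoring the
    totality of the data: amount, payer identity, and a crude
    'transposed digits' check on the reference."""
    hits = []
    for item in open_items:
        score = 0
        if item["amount"] == payment["amount"]:
            score += 2                      # exact amount match is strong
        if item["payer"] == payment["payer"]:
            score += 2
        # same characters in a different order suggests transposition
        if sorted(item["ref"]) == sorted(payment["ref"]):
            score += 1
        if score:
            hits.append((score, item["ref"]))
    return [ref for score, ref in sorted(hits, reverse=True)]

# A payment with two reference digits transposed still surfaces the
# right invoice as the top candidate for the clerk's 'OK' button.
payment = {"ref": "INV-20171013", "amount": 142.50, "payer": "J. Smith"}
print(candidate_matches(payment, expected))  # → ['INV-20171031']
```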
Note that not only does DLT not aid this process (the data is wrong before it is stored), it actually makes the situation worse, because the much-vaunted ‘immutability’ of DLT means that mistakes must be retained on the database forever (not even marked as ‘wrong’). In looking at the DLT database, it will not be possible to identify any wrong data without trawling through the entire dataset, and even then it will be problematic.
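The immutability problem can be illustrated with a toy append-only ledger in Python, assuming (purely for illustration) a compensating-entry approach: the wrong record can never be edited or removed, only reversed and replaced by further entries, so it remains on the ledger forever:

```python
# Minimal sketch of an append-only ledger; structure and names are
# illustrative only, not any real DLT implementation.
class AppendOnlyLedger:
    def __init__(self):
        self._entries = []

    def append(self, ref, amount, note=""):
        self._entries.append({"seq": len(self._entries),
                              "ref": ref, "amount": amount, "note": note})

    def balance(self, ref):
        return sum(e["amount"] for e in self._entries if e["ref"] == ref)

    def correct(self, seq, new_amount):
        """The wrong entry stays on the ledger forever; all we can do
        is append a reversal plus a replacement referencing it."""
        wrong = self._entries[seq]
        self.append(wrong["ref"], -wrong["amount"], f"reversal of seq {seq}")
        self.append(wrong["ref"], new_amount, f"replaces seq {seq}")

ledger = AppendOnlyLedger()
ledger.append("ACC-1", 1000.0)      # mistaken amount; should be 100.0
ledger.correct(0, 100.0)
print(ledger.balance("ACC-1"))      # → 100.0, yet the mistake persists
print(len(ledger._entries))         # → 3 entries, including the error
```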
So how can mistakes be reduced - the operational risk problem? By standardization and quality control!
Some futurists insist that DLT will somehow facilitate standardization in the financial sector. It certainly won’t harm efforts to standardize, but the Finance sector is already awash with data standards[9], and it is, again, not data storage that is the problem but rather the complex application systems that must be adapted to capture and process the necessary data. And improving data quality is a Six Sigma-like process, not a database implementation.
The SWIFT report highlighted the problem of “data resilience”, or what happens if/when something goes wrong in processing by one or more of the Blockchain participants,
“Since data is physically segregated to ensure data confidentiality, one node can no longer recover from any node. Instead, it will need to rely either on some local resiliency setup within its own institution, on its counterparty’s nodes or on a central service”
In Blockchain speak, this means there is no “golden copy” and, in order to recover from problems, backup copies must be maintained, and updated! One wonders why it took a proof of concept to identify this pretty big hole, since it is blatantly obvious if one thinks, for more than five minutes, about the problems inherent in operating complex ‘production systems’.
This highlights the systemic risks in Blockchain adoption. Far from being more resilient than a centralized solution, a DLT system will in fact, because of the ‘network effect’, only be as resilient as its weakest link(s). If a weak link pollutes the DLT database, it may first be hard to detect, requiring peer-to-peer (rather than central) reconciliation, and then extremely difficult to correct (if it can be corrected at all, because of ‘immutability’).
By its nature Blockchain is a ‘tightly coupled’ system, or one where all components must work in very close collaboration, all of the time. A US expert on nuclear accidents, Charles Perrow[10], highlights the serious problems that such complex, tightly coupled systems can create whenever an error occurs and “non-linear failures can cascade in an unpredictable way because components are closely interlinked”. The systemic operational risks inherent in a tightly coupled Blockchain are enormous and have, as yet, not been identified or quantified.
Australian Stock Exchange (ASX)
The Goldman Sachs report referenced the Australian Stock Exchange (ASX) as one of a small number of exchanges that, in mid-2016, was “trialing Blockchain technology for post-trade services for cash equities”. Eighteen months later, the exchange is still trialing the technology[11] to replace its aging CHESS settlement system, and has announced that a final decision will be made in March 2018, almost two years after the trial began and almost a year after the original decision date. The analysis noted that “ASX estimates that a full implementation of Blockchain in Australian equities post-trade could save the wider industry A$4-5bn in total costs”, although no detailed definitive numbers have been produced to substantiate such a claim.
The CEO has recently downplayed the savings to “hundreds of millions of dollars in costs from the back offices of brokers”. But this may have been in response to an industry consultation which was somewhat less than enthusiastic, with custodians, for example, not seeing “any significant benefit and cost savings in some of the new functions highlighted – with the exception of ISO 20022 messaging”. And neither custodians nor systems providers saw many, if any, benefits given the potentially significant implementation costs.
Since the ASX CHESS system is probably one of the most critical pieces of financial infrastructure in the Australian financial system, it is important that the financial markets infrastructure (FMI) regulator, the Australian Securities & Investments Commission (ASIC), is given sufficiently detailed information to assess costs and benefits to investors and, importantly, the risks of implementation, before rather than after any decision is made.
Given the doubts that have been expressed by several recent studies as to the immaturity of the technology and the palpable lack of detail on cost and benefits, any decision to proceed with the ASX Chess replacement must do so on very firm ground, and with an understanding of the systemic risks involved.
Appendix A – Recent reports on Blockchain
This appendix summarizes reports on Blockchain published in the second half of 2017.
A.1 SWIFT Nostro/Vostro Proof of Concept
Normally any mention of the SWIFT organization and Blockchain in the same sentence has the Blockchain twitter-sphere in a frenzy, but a recent (interim) report by SWIFT was pretty much ignored. The report asked the question
“Can Distributed Ledger Technology (DLT) – commonly known as Blockchain technology – help financial institutions optimise the liquidity of their Nostro accounts and reduce the significant operational costs associated with reconciliation?”
Before giving an answer, it is worth considering the problem being tackled.
The Nostro/Vostro problem is, in fact, the uber use-case for DLT and one would be tempted to conclude that if DLT cannot support this particular use-case, its usefulness elsewhere in finance is likely to be severely limited.
A Nostro account (from the Latin for ‘ours’) is a (standard) bank account held by one bank in another, usually foreign, bank. For example, Barclays will have a US dollar (USD) Nostro account with one or more US banks. Likewise, JPMorgan will have an AUD account with an Australian bank. So, banks will, depending on the overseas business they conduct, have dozens, maybe hundreds, of Nostro accounts, around the world, mainly for Foreign Exchange transfers. Likewise, a ‘Vostro’ account is the term given to ‘your’ account with us.
There is no centralization in the FX market: some FX contracts are settled through the CLS Group, but payments themselves are made on a peer-to-peer basis, and the numbers can be huge, so keeping track of payments made and real-time balances held in Nostro accounts is very important.
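As a rough illustration of what ‘keeping track’ involves, here is a minimal Python sketch of a running Nostro balance book keyed by correspondent bank and currency. The class, the bank names and the figures are hypothetical, not drawn from the POC:

```python
from collections import defaultdict

# Illustrative sketch only: running Nostro balances per
# (correspondent bank, currency) as payment confirmations arrive.
class NostroBook:
    def __init__(self):
        self._balances = defaultdict(float)

    def post(self, correspondent, currency, amount):
        """Apply a credit (positive) or debit (negative) confirmation."""
        self._balances[(correspondent, currency)] += amount

    def balance(self, correspondent, currency):
        return self._balances[(correspondent, currency)]

book = NostroBook()
book.post("US-BANK-A", "USD", 5_000_000.0)   # funding credit
book.post("US-BANK-A", "USD", -1_250_000.0)  # outgoing FX settlement
book.post("AU-BANK-B", "AUD", 750_000.0)
print(book.balance("US-BANK-A", "USD"))      # → 3750000.0
```

The hard part in practice is not this arithmetic but feeding it with timely, correct confirmations from dozens of counterparties, which is exactly where the POC's real-time visibility claim applies.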
Custom made for a DLT solution? Well, it turns out not to be so easy after all!
In order to test the applicability of using DLT to tackle this important problem, a very serious POC project was created with participation from SWIFT, the R3 consortium and eventually 33 (yes 33) international banks.
Unlike many other POCs, this project had a head start. For example, the data standards (ISO 20022/SWIFT) and participant identifiers (BIC/SWIFT) were already in place and used by, and familiar to, the participants. The network used was not the Internet, but SWIFT’s own high-speed, extremely secure network, which also provided the security through the “SWIFT controlled Certification Authority”. No Proof of Work consensus was used, but rather a much simpler (communications-focused) Practical Byzantine Fault Tolerance (PBFT) protocol.
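For readers unfamiliar with PBFT, the quorum arithmetic behind it can be sketched in a few lines of Python: with n nodes it tolerates f faulty ones provided n ≥ 3f + 1, and an operation commits once 2f + 1 matching votes are collected. This is a toy illustration of the arithmetic only, not the SWIFT POC's actual implementation or the full three-phase protocol:

```python
# PBFT tolerates f Byzantine nodes out of n >= 3f + 1 total nodes,
# and commits an operation on 2f + 1 matching votes.
def max_faulty(n):
    """Largest f such that n >= 3f + 1."""
    return (n - 1) // 3

def quorum(n):
    """Matching votes needed to commit: 2f + 1."""
    return 2 * max_faulty(n) + 1

def commits(votes, n):
    """Have the collected matching votes reached the commit quorum?"""
    return len(votes) >= quorum(n)

# With 4 nodes, one Byzantine node can be tolerated and 3 matching
# votes are required to commit.
print(max_faulty(4), quorum(4))          # → 1 3
print(commits({"n1", "n2", "n3"}, 4))    # → True
print(commits({"n1", "n2"}, 4))          # → False
```

The attraction over Proof of Work is obvious: no energy-hungry mining, deterministic finality, and far higher throughput, at the cost of requiring a known, permissioned set of participants, which SWIFT's network already provides.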
First, the good news: the (preliminary) conclusion was that, as defined by the functional requirements, the DLT-based Nostro solution was adequate and, importantly,
“For liquidity, the DLT based solution[12] provides real-time visibility across accounts to both the account owner and the account servicer on account’s entries status, derived expected and available account balances and timed data”
In short, another Proof of Concept worked.
Then came the analysis. After noting that the largest banks have already implemented real-time automated data capture from the different high-value payments systems to which they are connected, the report observed:
“The added value from a real-time enabled DLT solution as a liquidity user for these banks would therefore be rather limited unless they don’t have a centralised and aggregated view or are not feeding their internal entities in real time [emphasis added]”.
The report noted that for 'mid-tier' banks that have not yet implemented a real-time liquidity monitoring solution, “a DLT based solution could provide these banks with a cost effective real-time liquidity and reconciliation capability”.
In short, not needed for large banks (as the functionality is already largely automated) but something might be useful for smaller banks – maybe helping with ‘reconciliation’?
But it gets worse! The (interim) report unequivocally concludes “A ‘one size fits all’ DLT solution will not work”, because it makes no sense for the big players to change a model that already works better, and smaller firms probably couldn’t afford the disruption or get the benefits. The report hammered a pretty big nail into the DLT coffin:
“The underlying technology should not be a factor in determining if the solution is fit for purpose. Additionally from a service consumption point of view, any final solution would be presented through the usual messaging or API interface with the underlying DLT technology upon which it is based being completely transparent to the consumer”.
At best the SWIFT Nostro POC showed that further experimentation just might yield some areas of potential usefulness for Blockchain for some smaller market players. However, the jury remains out even for that and certainly the results cast severe doubt on the ability of Blockchain to help solve a much more complex problem, that of 'post trade' processing for securities.
However, it should be noted that the report was 'interim' and the final report is due to be produced in December. But it is going to take someone finding a pretty magical missing key to unlock the problems identified. Expect a few more nails to be hammered into this particular coffin when the final report is published.
A.2 Project Jasper – Bank of Canada
Project Jasper is a Bank of Canada project to “build and test an experimental wholesale interbank payment system, using distributed ledger technology (DLT)”. That is, Jasper is a central bank Proof of Concept (POC), in a series of “phases”, of the potential uses of Blockchain in payments systems.
In September 2017, Payments Canada, the operator of the Canadian Large Value Transfer System (LVTS), produced a report on Phase 2 of Jasper describing in detail the project to test the use of DLT for “Domestic Interbank Payments Settlement”. The POC, based on Corda, used a “notary” function as a consensus mechanism[13].
In discussing the progress of Jasper in May 2017, Carolyn A. Wilkins, the senior deputy governor of the Bank of Canada, posed the question - "could DLT underpin an entire wholesale payment system"?
Her answer was – “maybe one day, but there remain many hurdles to overcome”.
On the upside, the project showed that a DLT system could meet “some, but not all, of the core international principles for financial market infrastructure” and also that payments could be netted in an “innovative liquidity savings mechanism”. On the downside, the report identified some significant hurdles.
In a summary of the project in June 2017, the project team leaders concluded that “for critical financial market infrastructures, such as wholesale payment systems, current versions of DLT may not provide an overall net benefit relative to current centralized systems” because the current LVTS system is “a very low-cost system to operate and supports efficient use of collateral pledged by participants[14]”.
In short, the current system works well and there are no/few savings to be made by embracing DLT. However, the report claimed (yet again) that
“If a DLT-based settlement system is able to reduce back-office reconciliation efforts, significantly more cost savings could be realized across all of the participants, which would reduce the overall cost of operation”.
In October 2017, the Bank of Canada announced Phase 3 of the Project Jasper initiative which is to “experiment[15] with an integrated securities and payment settlement platform based on distributed ledger technology (DLT)”. The results of this particular Proof of Concept are due to be published in May 2018 and it will be interesting to see if their conclusions differ from those of the ECB noted above.
A.3 Rand Study
In October 2017, the Rand Corporation produced a report into research, funded by the British Standards Institute (BSI), into the “challenges, opportunities, and the prospects for standards” in DLT/Blockchain. The researchers were neither glass half-full nor half-empty guys/girls, but laid out both opportunities and challenges. The opportunities were the usual: ‘trust’, ‘immutability’, ‘security’ (through cryptography) but also noted enabling “new economic and business models” and “new sources of revenue”. And, without supplying any evidence, claimed “providing efficiency gains (including cost savings) for businesses and end-users”.
The report also highlighted the challenges, in particular “the potential high costs of initial implementation, perceived risks associated with early adoption of DLT/Blockchain, and possibility of disrupting existing practices may pose significant challenges to businesses” and “in the absence of widespread DLT/Blockchain adoption, the broader economic impact of the technology in the medium and long term is difficult to determine”.
In short, DLT/Blockchain is risky and the costs/benefits, if they exist at all, are dodgy.
The final summary (Table 6 on page 65) is enlightening. While the ‘challenges’ relate to DLT, and to technology change in general, the opportunities do not mention DLT/Blockchain, nor should they as they apply equally to any use of technology, innovative or not[16]. For example, there is sufficient evidence that non-DLT technology, such as the Internet has, for example, enabled “new economic and business models” and there is nothing unique about DLT’s ability to enable “management of digital identity through public key cryptography”.
In other words, the report does not give any comparison of benefits, costs and risks of DLT against existing technologies, and in fact we have to take on trust that there will be some cost savings, if only for ‘reconciliation’. If not exactly a nail in the DLT coffin, the report left the claims for the technology on life support.
A.4 Global Blockchain Benchmarking Study
In September 2017, a group of researchers from Cambridge University reported on a comprehensive study that attempted to benchmark Blockchain on a global basis by surveying Fintech start-ups, large public corporations and central banks. The researchers concluded that
“DLT as a whole is still lacking maturity and, in many cases, remains undeployed and unadopted. Issues related to scalability, privacy and confidentiality are slowing down technical advancement, whilst regulatory uncertainties and legal risks are looming large. The DLT landscape is fluid, highly fragmented, contested, and complex”.
Not surprisingly, the researchers found some differences between start-ups and established players. For example, central banks considered ‘immature technology’ to be the main inhibitor to widespread adoption of DLT, whereas start-ups ranked the immaturity of the technology only the third biggest challenge, but a challenge nonetheless.
Here we are looking at costs, benefits and risks, and the participants were sceptical about the potential benefits:
“However, some study participants explicitly state that they need to better assess whether using DLT provides more benefits compared to alternative technologies. This requires extensive costs/benefits calculations that are – as with any new technology – difficult to identify and quantify. Some [Public corporations] have already publicly expressed that the costs of using a distributed ledger for their envisaged use case would outweigh the benefits. As DLT is all about acceptable trade-offs, it may be hard to justify in some cases why a DLT-based system may be preferable to a more centralised architecture [emphasis added]”.
In short, DLT technology is so immature that it is difficult to do any reasonable cost/benefit analysis. However, like the other reports, the Cambridge study concluded, without giving any evidence, that any cost/benefits of DLTs will likely come from “their automated reconciliation mechanisms, their transparent nature, and their resilience”.
[1] Such as Hyperledger and Corda.
[2] However, this should not be surprising, as it is only computer coding after all.
[3] For example, the Bank of Canada Project Jasper, used a central 'notary' function.
[4] As in the SWIFT POC.
[5] Senior Special Director, Financial Technology (FinTech) in the Funds Management and Banking Department (FBD) of the Bank of Canada.
[6] In fact, the concept is so long in the tooth that SWIFT has just retired its heavily used ACCORD matching system.
[7] For example, see software (note: not a recommendation, just an example).
[8] Modern matching systems use sophisticated techniques to match unmatched transactions based on the totality of the data in the payment, such as the customer identity, the amount(s) and/or the payment date.
[9] For example, SWIFT (ISO 20022), FpML, FIX, LEI (ISO 17442:2012) and many others.
[10] See Perrow, C (1999) Normal Accidents: Living with High-Risk Technologies, Princeton University Press.
[11] The technology is based on software provided by Digital Asset, a US supplier of Blockchain consultancy and software.
[12] The POC was based on Hyperledger Fabric v1.0.1.
[13] Note that while DLT does, because it is distributed, increase the resilience of the data store, the report noted that whenever essential centralizing functions, such as notary and netting, are added, the resilience is little/no different from a central solution. Win on the swings, lose on the roundabouts (English proverb)?
[14] The study noted that “it will be challenging for any DLT-based system to process payments more efficiently than the LVTS”.
[15] That is, another Proof of Concept (POC).
[16] It should be noted that claims for ‘smart contracts’ should be compared against the wide-spread use of ‘database triggers’ today.