Earlier this year, I was asked to moderate a panel at a conference hosted by the International Swaps and Derivatives Association on the evergreen topic of clearinghouse resilience. I agreed, partly out of curiosity, given that this was a beat I once covered extensively, earlier in my career, as a reporter on our sister title, Risk.net. Mostly, I was interested to see how the conversation had progressed while my focus shifted to technology over the past few years. Not much, it turns out.
The topic had circled back, despite innumerable hours spent talking about it around 2015 and 2016, because of what happened at Nasdaq Clearing earlier this year. For those who don’t follow this rather arcane corner of the derivatives market, a single trader’s positions fell prey to rampant volatility in the power markets, and when said trader was unable to meet the margin call, it blew a hole in the default fund to the tune of more than $100 million, which member futures commission merchants (FCMs) were forced to cover.
The good old issues of membership criteria, transparency over margin methodologies, and skin-in-the-game duly resurfaced. And it seems that despite regulatory guidance on these topics from bodies such as the International Organization of Securities Commissions, a tug-of-war still exists between the clearinghouses and the FCMs. The more things change, the more they stay the same, to quote Alphonse Karr.
While default-management procedures are important to discuss—and, as the incident at Nasdaq Clearing shows, even more important to get right—open questions remain for both FCMs and central counterparty clearinghouses (CCPs) around non-default losses. These are losses a CCP can suffer that do not arise from the risk management procedures associated with handling derivatives trades. They include, but are not limited to, investment losses incurred when investing collateral—CCPs are extensive users of the overnight repo market, and are limited by law in how much they can place with commercial banks—and operational risks. Chief among those risks are systems and IT failures, and cybersecurity.
An extensive outline of what these non-default losses (NDLs) constitute—and the problems associated with them—can be found in an excellent 2017 paper from the Federal Reserve Bank of Chicago, written by Rebecca Lewis and John McPartland. But the key problem that bears discussion is this: While the industry is focused on having to pony up for default losses, as occurred at Nasdaq Clearing, NDLs are potentially even more severe, because in these instances, CCPs won’t have access to the default funds or the waterfall designed to protect the entity during default scenarios. Nor will the CCP have access to more controversial tools, such as variation and initial margin haircuts, that could be employed in extremis during defaults.
As LCH’s chief risk officer, Dennis McLaughlin, told me a few years ago, these losses are “more pernicious, and much more difficult to manage—and they could bring down the CCP far more easily—than a default loss.”
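For readers less familiar with the mechanics, the loss allocation that NDLs bypass can be sketched in a few lines of Python. The layer names and figures below are illustrative assumptions, not any CCP’s actual rulebook; the point is simply that a default loss is absorbed layer by layer, while a non-default loss has no such waterfall to draw on.

```python
# Illustrative sketch of a CCP default waterfall. Layer names, ordering,
# and amounts are hypothetical examples, not any real CCP's structure.

def allocate_default_loss(loss, waterfall):
    """Apply `loss` to ordered waterfall layers; return amounts drawn per layer."""
    drawn = {}
    for layer, capacity in waterfall:
        take = min(loss, capacity)
        drawn[layer] = take
        loss -= take
        if loss == 0:
            break
    # Anything left over would require further measures (e.g., member
    # assessments or, controversially, margin haircuts).
    drawn["uncovered"] = loss
    return drawn

waterfall = [
    ("defaulter_initial_margin", 50),
    ("defaulter_default_fund_contribution", 20),
    ("ccp_skin_in_the_game", 10),
    ("mutualized_default_fund", 200),  # non-defaulting FCMs' contributions
]

# A 185-unit default loss exhausts the defaulter's resources and the CCP's
# skin-in-the-game, then draws 105 from the mutualized default fund.
print(allocate_default_loss(185, waterfall))
```

For an NDL—an investment loss or a cyber incident, say—none of these layers is contractually available, which is why the allocation of such losses matters so much.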
Most CCPs now have sections in their handbooks that address where the burden of responsibility will fall when it comes to NDLs. Some say they will be covered by the CCP; others say FCMs should share in the cost of stabilizing the clearinghouse, as it’s in everyone’s best interest to do so. Yet it’s still somewhat alarming—particularly in an era where cybersecurity is the top concern for most industries, and finance is routinely listed as the most vulnerable of these—that only a fraction of the attention given to well-developed and tested default-management processes is applied to how NDLs are handled.
Given the systemic importance of CCPs in the post-crisis trading environment, and their installation as the risk managers of trillion-dollar markets by regulators, there should be more focus on how the costs associated with NDLs are allocated. Deeper questions should then be asked about whether this should stop with the CCPs, or be extended into analysis of the service providers they utilize, particularly as technology provision remains concentrated—in some areas, at least—in a handful of cloud and systems vendors.
This isn’t saying anything revolutionary, but the crux of the matter is that the incident at Nasdaq Clearing proved that the system works, even if it raised a number of pertinent questions about how current rules need to be fine-tuned. The real test, however, likely won’t be whether cover-one or cover-two resiliency is sufficient. It’ll be what happens when the real cyber attack—the big one—finally comes, and how the industry copes with its infrastructure being taken offline.