Financial InterGroup president Allan Grody presents a case for creating a common reference data utility to better cope with the costs of processing legal entity identifier information
Following its late September workshop in Basel, Switzerland to discuss a global identification system being considered to address systemic risk issues, the G-20 Financial Stability Board on October 15 called for "a global legal entity identifier system which uniquely identifies parties to financial transactions with an appropriate governance structure representing public interest." The US Treasury's Office of Financial Research (OFR), chartered under the Dodd-Frank Act, has already been active in this arena, issuing plans for legal entity identifier (LEI) standards.
US agencies, the International Organization of Securities Commissions (Iosco) and the Bank for International Settlements (BIS) see a global LEI system as a prerequisite to analyzing systemic risk, one that will allow regulators to see what they need in order to judge or prevent events that could bring on another financial crisis. To do this, it is understood they need data transparency and the ability to aggregate data consistently.
Without unique, unambiguous and universal computer-usable identifiers, the global financial industry inevitably gets multiple versions of identical identifying information. The impact is predictable—transactions that need to match for payment and settlement, and transactions conducted by the same counterparty in the same products that need to be aggregated into positions for risk analysis, do not match, nor do they get aggregated properly. Because systemically important financial institutions are global and transcend sovereign governments' reach, local regulators' rules and even regional compacts, regulatory oversight is neither timely nor comprehensive.
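The aggregation failure described above can be sketched in a few lines. This is an illustrative example with hypothetical data: the counterparty name variants, the notional amounts and the LEI-style code are all invented for the purpose of showing how grouping by inconsistent local names fragments a single exposure, while grouping by one shared identifier consolidates it.

```python
# Sketch with hypothetical data: one counterparty recorded under three
# local naming conventions. Aggregating by name splits a single exposure
# into three fragments; aggregating by a shared identifier does not.
from collections import defaultdict

trades = [
    {"counterparty": "ACME Holdings Ltd",     "lei": "5493001KJTIIGC8Y1R12", "notional": 10_000_000},
    {"counterparty": "Acme Holdings Limited", "lei": "5493001KJTIIGC8Y1R12", "notional": 5_000_000},
    {"counterparty": "ACME HLDGS",            "lei": "5493001KJTIIGC8Y1R12", "notional": 7_500_000},
]

def aggregate(trades, key):
    """Sum notional exposure, grouped by the given record field."""
    totals = defaultdict(int)
    for t in trades:
        totals[t[key]] += t["notional"]
    return dict(totals)

by_name = aggregate(trades, "counterparty")  # three fragments of one exposure
by_lei = aggregate(trades, "lei")            # one consolidated position

print(len(by_name))  # 3 -- the exposure is fragmented across name variants
print(by_lei)        # {'5493001KJTIIGC8Y1R12': 22500000}
```

A risk system grouping by free-text names sees three small exposures; grouping by the common identifier reveals one position of 22.5 million, which is the whole point of a unique, universal entity code.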
We now realize we have no way of "seeing" the same counterparty's risk exposure across the different financial firms from which it obtains loans, with which it enters into swaps contracts, or at which its risk exposure limits are set. In the US, the OFR is empowered to standardize the types and formats of data to be collected from financial firms. The data being requested will find its way into a newly created data center overseen, and perhaps run, by the OFR. The data will contain an unprecedented level of granular information, including information on positions, transactions, valuation methods, cashflows and the identities of counterparties. This level of granularity is required to make the necessary calculations for analyzing systemic risk. Such data had previously been available only periodically, to on-site examiners of individual financial institutions.
Data this granular and comprehensive has never before been requested by, or concentrated within, a single government financial oversight agency, let alone at this scale and frequency. A myriad of global economic, market and company-specific data will also have to be sourced from hundreds of data vendors and government sources. Policies for computing systemic risk exposures will need to be set; for example, policies on the tolerances for the amount of systemic risk that should be allowed. Dynamic scenarios must be stress-tested against the collected data for catastrophic events associated with everything imaginable, from oil spills to weather to war. Volatility, liquidity, capital and leverage gauges must be calibrated and also stress-tested around these scenarios. The OFR will need a variety of analytical tools, yet to be developed, to sift through unprecedented quantities of data pouring in from financial institutions.
What is important to the financial industry is that the division of labor between government and industry is appropriately set in this endeavor. Here, it is obvious that regulators can make a significant contribution by mandating a global standard. They can also create their own tools for analyzing systemic risk. However, given that the government has expressed its belief that much cost and risk will be removed from the infrastructure of financial institutions because of this global standard, it is up to the industry to leverage such global regulatory compulsion for its own benefit, if the government's claim is indeed true.
No government can extract the cost savings or reduce the risk in financial institutions promised by the global identification standard. Only the financial institutions affected can do this: the largest, globally active ones, the "too big to fail" ones, the systemically important financial institutions (Sifis), as they are called in the Dodd-Frank Act. Where are these firms in stepping up to claim the benefit the government insists is there? Is it there? My colleagues and I have estimated that each financial institution stands to save $1 billion annually. How? By sharing the costs and mutualizing the risk in a common reference data utility, a central counterparty for data management.
The idea expressed by some that a common standard will eliminate silos of duplicated reference data, and thus save costs, is valid up to a point, but it does not hold up against the reality of enterprise solutions being foisted on silo governance structures. It sounds as if it should work, but it doesn't.
Without creating a common utility shared by all the largest financial institutions, they will be spending more, not less. They will be paying a government assessment to operate the US government's data center, and perhaps multiple governments' data centers. They will still pay to source data from multiple vendors who will continue to source information from paper documents (with the obvious inherent risk of human error). Financial institutions will still have different end-of-day valuations for the same product, and they will only know about errors in reference data when they try to pay for what they bought and it fails to settle properly.
They will be better able to aggregate data across multiple business silos. However, each business silo will adopt the standard at its own pace. Taken together with the cost of reconciling legacy numbering conventions against the global standard, costs will increase for the foreseeable future.
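The reconciliation burden described above can be illustrated with a small sketch. All identifiers and amounts here are hypothetical: a cross-reference table maps a firm's legacy counterparty codes to a global identifier, and any position whose legacy code has no mapping yet must fall back to manual reconciliation, which is exactly where the transition costs accumulate.

```python
# Sketch with hypothetical identifiers: reconciling legacy numbering
# conventions against a global standard via a cross-reference table.
legacy_to_lei = {
    "INTERNAL-CPTY-9981": "5493001KJTIIGC8Y1R12",  # invented mapping
    "DESK-CODE-021345":   "5493001KJTIIGC8Y1R12",  # same entity, other silo
}

positions = [
    {"id": "DESK-CODE-021345",   "exposure": 4_000_000},
    {"id": "INTERNAL-CPTY-9981", "exposure": 6_000_000},
    {"id": "INTERNAL-CPTY-0042", "exposure": 1_000_000},  # not yet mapped
]

mapped, unmapped = {}, []
for p in positions:
    lei = legacy_to_lei.get(p["id"])
    if lei is None:
        unmapped.append(p["id"])  # residual legacy code: manual reconciliation
    else:
        mapped[lei] = mapped.get(lei, 0) + p["exposure"]

print(mapped)    # exposure consolidated under the global identifier
print(unmapped)  # legacy codes still awaiting reconciliation
```

Until every silo's legacy codes are mapped, the firm runs both regimes in parallel, paying for the cross-reference table, the unmapped residue and the old conventions all at once.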
The alternative of a common industry platform for reference data, distributed over an internet-like infrastructure, can achieve significant cost savings for each firm over the short term and a dramatic lowering of costs for the entire industry. After all, the reference data platforms in each firm represent duplicated costs; install a common utility platform and the savings are obvious. Remember when each firm had its own vault, before central securities depositories were created? Only the biggest vaults survived, and some are now used as restaurants.
Allan Grody is president of Financial InterGroup Advisors, which has submitted proposals for implementing a global identification system as outlined in this analysis to the OFR, Securities and Exchange Commission and the Commodity Futures Trading Commission.