Max Bowie: When Competition Fails, Data Quality and Cost Suffer
Market data is one of the largest single costs for any trading firm, behind staff and real estate, and the prices that data sources charge for it are constantly under scrutiny, with competition between vendors keeping data quality high and costs low, or at least reasonable.
One of the most important areas is the pricing of over-the-counter (OTC) instruments, where no reference exchange price exists. Here, firms have traditionally depended on trade prices from brokers; on evaluated prices derived from various inputs on a daily or intraday (but not real-time) basis; or on services that provide prices based on contributions from other dealers or buy-side firms.
But contributed prices can be manipulated, as evidenced by the investigation by the UK's Financial Services Authority (FSA) and the US Commodity Futures Trading Commission (CFTC) into Barclays' misconduct over Libor rate submissions, which culminated in massive fines and the resignations of the bank's chief executive and chief operating officer.
Tell-Tale Signs
It is possible to spot tell-tale signs of price manipulation. Officials at Fincad, the Vancouver, BC-based derivatives valuation and risk management software vendor, say the firm spotted a divergence between Libor-based swap pricing and overnight index swap (OIS)-based pricing that began around the start of the financial crisis, and which resulted in Libor swaps being priced too low and failing to account for embedded risk. That finding prompted the vendor to incorporate an OIS-based pricing curve into its Insight Solutions derivatives valuation and hedge accounting platform.
But properly maintained benchmarks are an important tool in over-the-counter markets, and will become increasingly so for asset classes that regulators cannot shoehorn onto exchange-like venues, where participants will start to demand levels of transparency similar to those of other marketplaces. Tullett Prebon Information's recently launched benchmark oil curves, for example, aim to provide transparency into commodity derivatives.
Once the authorities have pored over Libor, it should emerge more transparent and stronger than benchmarks that have not been subject to the same scrutiny. In the meantime, consumers may resent paying for data that is now being called into question. Of course, consumers usually resent data fees, especially for "public" benchmark or index data, and at a time when budgets aren't increasing in line with the costs of data services, forcing unpopular cuts, that frustration is understandable. Still, the old argument that market data fees are "like charging to look at the price list" no longer reflects the extent to which content has broadened, especially in the OTC markets.
But there are signs that data sources are softening their stance and proactively introducing policies that acknowledge costs cannot simply continue to increase. Among the things that especially irk users are constraints on how data can be used, such as non-display fees or derived data fees. Both are addressed in a new Global Data License Agreement covering data from all of NYSE Euronext's marketplaces, which will be rolled out next year and which is designed to simplify policies and reduce costs for end-users. This license agreement, modeled in part on one rolled out by the London Stock Exchange last year, has won praise from user groups for proposing more flexible policies, and for the process by which NYSE has consulted end-users along the way, rather than simply announcing the changes with a 30- or 90-day notice period to make them less painful.
Adoption
With luck, the adoption of these policies will lead to more competition between data sources, forcing them to use cost and quality as differentiators. Firms seem willing to pay more for guaranteed quality, so long as there are opportunities to reduce spend in other areas. By addressing quality issues and easing cost pressures, end-users can turn their attention to growth—and to purchasing more data to support new business—rather than focusing on cuts. Only then can the data industry once again be a driver of growth.
Copyright Infopro Digital Limited. All rights reserved.