Max Bowie: Wherefore Art Thou, Transparency?
The notion of bringing transparency to market data isn’t new. In fact, it has been the impetus for user groups and public forums over the years. However, it is gaining renewed attention in the current market: data providers, seeing economic conditions improve, are looking to raise prices, while end-user firms remain cautious and cost-sensitive and aren’t really increasing their data budgets. To better manage those costs, firms are implementing “fairer” cost allocation programs to show business areas exactly what they pay for content and technology.
For example, speakers at Inside Market Data’s recent European Financial Information Summit cited the need for a transparent process around how data is managed, and for transparent cost models that allocate costs fairly to business lines and make business users aware of the costs they incur: not only the price of an application or service, but also the cost of shared resources, such as the networks and hardware required to run and access it, and how heavily staff in one department use a service compared to others. This not only makes end-users more aware of the costs they create for the business, but also makes them more proactive about managing those costs and reduces frustration with opaque recharges, speakers said.
On one hand, there is price transparency, meaning how a source prices its data: users frequently lament the lack of standardized, apples-to-apples pricing for similar datasets across different vendors and exchanges, and the opacity of how providers arrive at the value of their data and translate it into the fees they charge, an issue that also applies to over-the-counter (OTC) broker data. For example, upstart trading venues traditionally provide market data free of charge to win business, then introduce fees once they gain a certain market share. However, unlike the consolidated tape model, which adjusts each exchange’s share of revenues based on resulting trading activity, exchanges aren’t known for reducing their fees if their market share slips, even though their data is arguably less representative of the market and hence less valuable.
On the other hand, there is the issue of transparency around how providers allow firms to use the data, and more specifically, the lack of any standards or harmony among the terms and policies that describe permitted use. Speakers at EFIS bemoaned the irony that firms pay for applications and services to support the growth of their business, yet some of these services come with licensing terms that are “revenue-driven, not transparency-driven” and can constrain attempts to grow that business.
It may shock those outside the world of market data that there is no such thing as a standard contract for essentially the same data service from different markets or vendors, or that there is little regulatory scrutiny of transparency around data costs and policies. Though US exchanges must obtain Securities and Exchange Commission approval for any new services that introduce new fees, the process is generally viewed as a rubber stamp. So it falls largely to end-users, who bear the brunt of interpreting and managing a multitude of different contracts, to cajole exchanges into some level of harmonization. However, relevant examples of industry cooperation exist, such as the FIX Protocol: instead of each market operating a different order-routing protocol that required traders to use a separate interface for each exchange, FIX provided a standard that replaced the cost of using and maintaining multiple proprietary protocols. Similarly, standardized contracts and terms could lower legal fees by providing an industry-adopted template, and allow emerging markets exchanges to offer their data on terms already familiar to potential clients in new markets, while easier-to-understand contracts would surely reduce the amount of accidental under-reporting.
With bodies such as the World Federation of Exchanges (WFE) becoming more active on standards around issues like cybersecurity and information protection, perhaps the WFE could also turn its attention to standardizing contracts and policies, reducing the need for end-users or individual exchanges to carry the bulk of the burden.