As in life, the data industry is full of contradictions. We want more data but want to pay less; we want every tick but don’t want to have to handle massive volumes; we insist on the fastest feeds, whether or not our infrastructures are fast enough to exploit them; and we complain about vendor lock-in, but often balk at the cost of supporting “open” standards.
On the one hand, budgets at end-user firms are still under intense pressure, with firms reducing costs across the board, from spend on content and vendor services to data professionals themselves. On the other, industry spend on market data actually rose by more than six percent last year to more than $25 billion, according to research from Burton-Taylor International Consulting—although the increase can partly be attributed to the stronger dollar and price rises among vendors and exchanges.
When it comes to price rises, nobody likes to admit they are doing it—at least not without good cause, especially in dire economic times—but everyone loves to boast when these hikes translate to higher revenues and margins. For example, though Nasdaq OMX’s fourth-quarter revenues fell from Q3, the exchange was still up year-on-year, and market data revenues—which contributed more than 21 percent of overall revenues in Q4—rose by five percent quarter-on-quarter and 10 percent year-on-year to $87 million.
One area where firms have been willing to spend in recent years—in contrast to their reluctance in other areas—is reducing latency for highly profitable automated trading strategies, and on supporting technologies, from low-latency data fabrics to latency measurement services, such as TS-Associates’ “ultra-high bandwidth” TipOff appliances, which the vendor released last week. But at the same time, research from GreySpark Partners suggests that data latency is no longer the biggest problem, and that firms’ own decision-making systems now contribute twice as much latency as the entire external and internal roundtrip data cycle.
Meanwhile, research and consulting firm Tabb Group last week declared that the latency of industry-wide consolidated tape feeds—already displaced by low-latency direct exchange feeds for trading purposes—can no longer satisfy best execution requirements. Yet consolidated tape utilities the Options Price Reporting Authority and the Nasdaq-administered UTP Operating Committee are both investing in bandwidth increases to support delivery of more data to clients—with OPRA volumes set to double this year.
And size isn’t just an issue for datafeeds, but also for the marketplaces that generate them: Whereas in recent years we’ve seen a wave of mega-mergers among exchanges (as well as vendors and banks), we are now starting to see such deals scuppered by regulators. Deutsche Börse’s ill-fated proposed takeover of NYSE Euronext was blocked last week by the European Commission, which believed the deal would reduce competition and create a “quasi-monopoly” in some asset classes—prompting the exchanges to call off merger discussions.
But big as these deals are—and the financial penalties when they fall apart—perhaps the biggest contradiction is the infamously insular data giant Bloomberg announcing plans last week to freely license its data API (BLPAPI), as part of its broader Open Market Data Initiative, following in the footsteps of organizations such as the Collaborative Software Initiative with its MDAL Market Data Abstraction Layer, and NYSE Technologies, which recently open-sourced its MAMA middleware (IMD, Oct. 31, 2011). I’m not sure whether this move constitutes a true open-sourcing of the API, or how broad the potential applications could be. But since Bloomberg wasn’t able to respond to our questions about the move by press time, we’d like to hear what you think. So call, write or tweet, and tell us how you would use an open-source BLPAPI.