Though data latency attracts a lot of attention because of its necessity for algorithmic—not just high-frequency—trading, it isn’t the only game in town. And because it is bounded by physical limits—i.e., the speed of light, at least until someone finds a faster means of transmitting data—it has a limited shelf life for delivering competitive advantage compared to inputs that might yield more value over the long term.
Meantime, the low-latency marketplace continues to grow—by 1.5 percent in 2012 and 4.5 percent over the next three years, according to Tabb Group, which places current sell-side spend on data distribution technologies at $3.6 billion.
And beyond the most liquid, exchange-traded asset classes already straining the limits of latency, the over-the-counter markets have a long way to go before they exhaust the potential for latency reduction. But firms are already applying low-latency technologies in these markets, and will surely expand them to “low-frequency” asset classes as the dynamics of those instruments change due to shifts toward centrally cleared venues and as investors seek assets with higher potential returns.
This could prompt institutional traders to desert unprofitable equity markets completely for OTC assets, contributing to the rapid evolution of those markets and increased data demand, but having the reverse effect on exchanges, which would need to leverage other business models to maintain revenues, such as increasing their focus on derivatives trading, clearing and—as BT’s Chris Pickles suggests in this issue’s Open Platform—being a neutral “messaging hub” between markets and participants.
This would free up equity markets to fulfill what some argue is their true role—enabling companies to raise capital, rather than being barometers of short-term volatility—and increase their appeal to long-term investors concerned about being outpaced by high-frequency traders.
With a different makeup of participants, exchanges may also have to provide more data free of charge for lower-end investors—not an appealing prospect, since data revenues grew in Q1 while overall exchange revenues fell. However, they could offset any losses by leveraging their central position as aggregators of liquidity and information to capture more data, translate it into new types of datasets and signals, and charge a premium for them. Demand is already growing for exchange-like data on OTC asset classes: witness the Datavision Streaming tick-by-tick data service for OTC credit instruments launched last week by credit specialist CMA, based on prices from market participants; Benchmark Solutions’ streaming market-driven pricing; and even transaction cost analysis for markets like currencies—such as that launched last week by agency broker ITG—which could provide an additional input for decision support.
And factors currently used to assess risk could be applied to create new trading indicators. For example, risk and portfolio analysis tools provider Axioma last week presented its quarterly risk review, revealing lower risk and volatility levels across global markets—except China—in Q1 than in the previous quarter.
One way to reduce risk is to diversify by minimizing correlation: a “diverse” portfolio of stocks that behave similarly is not really diverse at all. Meanwhile, futures prices may not accurately reflect their underlyings because of factors priced into the future—oil futures, for example, include the cost of transportation—so Axioma creates synthetic prices, based on the other factors affecting an asset, that more accurately reflect its value. Though these models are designed to support long-term decisions rather than to track intraday price movements, why couldn’t they be used in the future to create real-time synthetic prices that expose market inefficiencies?
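The correlation point above can be made concrete: one crude gauge of how diverse a portfolio really is—as opposed to how many names it holds—is the average pairwise correlation of its assets’ returns. The sketch below is purely illustrative and is not Axioma’s methodology; the function name and the simulated return data are assumptions for the example.

```python
import numpy as np

def avg_pairwise_correlation(returns: np.ndarray) -> float:
    """Average off-diagonal correlation of a returns matrix.

    returns: shape (n_periods, n_assets), one column per asset.
    A value near 1.0 means the "diverse" portfolio moves in lockstep.
    """
    corr = np.corrcoef(returns, rowvar=False)   # (n_assets, n_assets) matrix
    n = corr.shape[0]
    off_diag = corr[~np.eye(n, dtype=bool)]     # drop the 1.0s on the diagonal
    return float(off_diag.mean())

# Hypothetical example: three assets, two of which track each other closely.
rng = np.random.default_rng(42)
base = rng.normal(0.0, 0.01, 250)               # ~one year of daily returns
returns = np.column_stack([
    base,                                       # asset A
    base + rng.normal(0.0, 0.001, 250),         # asset B: nearly a clone of A
    rng.normal(0.0, 0.01, 250),                 # asset C: independent
])
print(avg_pairwise_correlation(returns))
```

Holding assets A and B together adds little diversification despite being two positions, and the elevated average correlation flags that; swapping one of them for another independent asset would pull the figure toward zero.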
So, will low latency become less important over time? No, because it becomes the benchmark rather than the cutting edge, and because the same high-performance technologies will be crucial for calculating valuable new data inputs quickly enough to meet that benchmark.
More from Inside Market Data