Max Bowie, editor, Inside Market Data

Opening Cross: The Longevity of Latency as Smarts Outpaces Speed


Though data latency attracts a lot of attention because it is essential to algorithmic—not just high-frequency—trading, it isn’t the only game in town. And because latency is bounded by physical limits—i.e., the speed of light, at least until we find a way to transmit data by some faster means—it has a limited shelf life as a source of competitive advantage, compared with inputs that might yield more value over the long term.

Meantime, the low-latency marketplace continues to grow—by 1.5 percent in 2012 and 4.5 percent over the next three years, according to Tabb Group, which places current sell-side spend on data distribution technologies at $3.6 billion.

And beyond the most liquid, exchange-traded asset classes already straining the limits of latency, the over-the-counter markets have a long way to go before they exhaust the potential for latency reduction. But firms are already applying low-latency technologies in these markets, and will surely expand them to “low-frequency” asset classes as the dynamics of those instruments change due to shifts toward centrally-cleared venues and as investors seek assets with higher potential returns.

This could prompt institutional traders to desert unprofitable equity markets completely for OTC assets, contributing to the rapid evolution of those markets and increased data demand. It would have the reverse effect on exchanges, which would need to lean on other business models to maintain revenues—such as increasing their focus on derivatives trading and clearing, or, as BT’s Chris Pickles suggests in this issue’s Open Platform, acting as a neutral “messaging hub” between markets and participants.

This would free up equity markets to fulfill what some argue is their true role—enabling companies to raise capital, rather than being barometers of short-term volatility—and increase their appeal to long-term investors concerned about being outpaced by high-frequency traders.

With a different makeup of participants, exchanges may also have to provide more data free of charge for lower-end investors—not an appealing prospect, as data revenues grew in Q1 while overall exchange revenues fell. However, they could offset any losses by leveraging their central position as aggregators of liquidity and information to capture more data, translate it into new types of datasets and signals, and charge a premium for them. Demand is growing for exchange-like data on OTC asset classes. Examples include Datavision Streaming, a tick-by-tick data service for OTC credit instruments launched last week by credit specialist CMA based on prices from market participants; Benchmark Solutions’ streaming market-driven pricing; and transaction cost analysis for markets like currencies—such as the service launched last week by agency broker ITG—which could provide an additional input for decision support.
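At its core, transaction cost analysis compares realized execution prices against a benchmark. A minimal sketch of one common TCA metric, implementation shortfall against the arrival price—the names and numbers here are illustrative, and this is not a description of ITG’s methodology:

```python
# Sketch of one common TCA metric: implementation shortfall, i.e. the
# slippage of realized fill prices versus the price when the order arrived.
# Illustrative only; real TCA services use venue timestamps, multiple
# benchmarks (arrival, VWAP, close) and market-impact models.

def implementation_shortfall_bps(arrival_price, fills, side="buy"):
    """Average slippage of fills vs. the arrival price, in basis points.

    fills: list of (price, quantity) tuples.
    side:  "buy" (paying above arrival is a cost) or "sell" (the reverse).
    """
    total_qty = sum(qty for _, qty in fills)
    avg_px = sum(px * qty for px, qty in fills) / total_qty
    sign = 1 if side == "buy" else -1
    return sign * (avg_px - arrival_price) / arrival_price * 10_000

# A buy order arriving at 100.00, executed in three slices:
cost = implementation_shortfall_bps(
    100.00, [(100.02, 300), (100.05, 500), (100.01, 200)]
)
print(f"{cost:.1f} bps")  # ~3.3 bps of slippage
```

The same arithmetic applies to currencies as readily as to equities, which is why TCA can extend to OTC markets once reliable benchmark prices exist.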

And factors currently used to assess risk could be applied to create new trading indicators. For example, risk and portfolio analysis tools provider Axioma last week presented its quarterly risk review, revealing lower risk and volatility levels across global markets—except China—in Q1 than in the previous quarter.

One way to reduce risk is to diversify by minimizing correlation—a “diverse” portfolio of stocks that all behave similarly is not really diverse at all. Meanwhile, futures prices may not accurately reflect their underlyings because of factors priced into the future—oil futures, for example, include the cost of transportation—so Axioma creates synthetic prices, based on the other factors affecting an asset, that more accurately reflect its value. Though these models are designed to support long-term decisions rather than to track intraday price movements, why couldn’t they be used to create real-time synthetic prices that expose market inefficiencies in future?
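The correlation point can be made concrete with the standard portfolio-variance formula. A toy sketch—this is not Axioma’s model, and the volatility and correlation figures are assumptions for illustration:

```python
import math

# Toy illustration of why low correlation, rather than simply holding many
# names, is what actually reduces portfolio risk. For an equal-weight
# portfolio of n assets, each with volatility sigma and pairwise
# correlation rho, portfolio variance is:
#     sigma**2 / n + ((n - 1) / n) * rho * sigma**2
# so as n grows, risk floors out at rho * sigma**2 and never falls below it.

def equal_weight_portfolio_vol(n, sigma, rho):
    var = sigma ** 2 / n + ((n - 1) / n) * rho * sigma ** 2
    return math.sqrt(var)

sigma = 0.30  # assume each stock has 30% annualized volatility
for rho in (0.0, 0.9):
    vol = equal_weight_portfolio_vol(50, sigma, rho)
    print(f"rho={rho}: portfolio vol {vol:.1%}")
```

With fifty uncorrelated stocks, portfolio volatility falls to roughly 4 percent; with fifty highly correlated ones it stays near 28 percent—diverse in name only.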

So, will low latency become less important over time? No, because it becomes the benchmark rather than the cutting edge, and because the same high-performance technologies will be crucial to calculating valuable new data inputs in a timely enough manner to meet that benchmark.
