Datafeeds special report


July 2015 -- sponsored by S&P Capital IQ

The Need for Feeds: More than Just Speed

Barely a decade ago, traders began eschewing traditional consolidated datafeeds in favor of direct feeds from exchanges, in their pursuit of lower latency. The markets were becoming faster, and everyone had to keep pace to remain competitive. At first, these latency gains were fairly easy and inexpensive to achieve. But after plucking all the low-hanging fruit, firms found that further gains came at a much higher price. The chase eventually became one of diminishing returns, and some firms are now exiting the race rather than continuing to pour money into it.

The markets did speed up, but only a small portion of the capital markets overall did, meaning those expensive low-latency infrastructures served a very limited purpose. And with firms seeking to federate data as widely as possible across their enterprise for use in new areas, such as Big Data analytics, that small amount of low-latency data may not have sufficient uses elsewhere.

In effect, firms are looking to achieve the economies of scale that consolidators offer by centralizing data acquisition and delivery, while also being able to access broader datasets that offer them the ability to investigate and address new business opportunities. "It is increasingly hard for firms to develop and sustain a competitive advantage with speed alone.... Instead, firms differentiate their strategies in other ways, with diverse, high-quality data and analytics," says Brian Cassin, managing director at S&P Capital IQ. "The focus is more on putting together a complex strategy intermingling more data to make better decisions. Consolidated feeds make data consumption easier, offering high performance and bringing diverse content together into one delivery mechanism."

In addition, Alex Tabb, partner at Tabb Group, says firms are looking to eliminate complexity, which translates directly to costs. This means not only reducing the number of standalone, specialist data architectures (for low-latency data or otherwise), but also streamlining the number of relationships that a firm must maintain in order to obtain the data it needs. In this instance, a single consolidator can eliminate the need to work directly with multiple vendors, along with the costs inherent in maintaining those relationships.

In an era of Big Data, chasing every new data input is not an efficient use of firms' time. Firms make money from analyzing that data to create unique trading strategies, not from acquiring it. So, one might argue, leave the trading to the traders, and leave the consolidating to the consolidators.
