July 2015 -- sponsored by S&P Capital IQ
The Need for Feeds: More than Just Speed
Barely a decade ago, traders began eschewing traditional consolidated datafeeds in favor of direct feeds from exchanges in their pursuit of lower latency. The markets were becoming faster, and everyone had to keep pace to remain competitive. At first, these latency gains were fairly easy and inexpensive to achieve. But after plucking all the low-hanging fruit, firms found that further gains came at a much higher price. For many, the chase became a pursuit of diminishing returns, and some firms are now exiting the race rather than continuing to pour money into it.
The markets did speed up, but only a small portion of the capital markets overall did, meaning those expensive low-latency infrastructures served a very limited purpose. And with firms seeking to federate data as widely as possible across the enterprise for use in new areas, such as Big Data analytics, that small amount of low-latency data may have few uses elsewhere.
In effect, firms are looking to achieve the economies of scale that consolidators offer by centralizing data acquisition and delivery, while also being able to access broader datasets that offer them the ability to investigate and address new business opportunities. "It is increasingly hard for firms to develop and sustain a competitive advantage with speed alone.... Instead, firms differentiate their strategies in other ways, with diverse, high-quality data and analytics," says Brian Cassin, managing director at S&P Capital IQ. "The focus is more on putting together a complex strategy intermingling more data to make better decisions. Consolidated feeds make data consumption easier, offering high performance and bringing diverse content together into one delivery mechanism."
In addition, Alex Tabb, partner at Tabb Group, says firms are looking to eliminate complexity, which translates directly to costs. This means not only reducing the number of standalone, specialist data architectures (for low-latency data or otherwise), but also streamlining the number of relationships that a firm must maintain in order to obtain the data it needs. In this instance, a single consolidator can eliminate the need to work directly with multiple vendors, along with the costs inherent in maintaining those relationships.
In an era of Big Data, chasing every new data input is not an efficient use of firms' time. Firms make money from analyzing that data to create unique trading strategies, not from acquiring it. So, one might argue, leave the trading to the traders, and leave the consolidating to the consolidators.