Datafeeds special report
The Need for Feeds: More than Just Speed
Barely a decade ago, traders began eschewing traditional consolidated datafeeds in favor of direct feeds from exchanges in their pursuit of lower latency. The markets were becoming faster, and everyone had to keep pace to remain competitive. At first, these latency gains were fairly easy and inexpensive to achieve. But after plucking the low-hanging fruit, firms found that further gains came at a much higher price. For many, the effort became a pursuit of diminishing returns, and some firms are now exiting the race rather than continuing to pour money into it.
The markets did speed up, but only in a small portion of the capital markets overall, meaning that those expensive low-latency infrastructures served a very limited purpose. And as firms seek to federate data as widely as possible across the enterprise for use in new areas, such as Big Data analytics, that small amount of low-latency data may have few uses elsewhere.
In effect, firms are looking to achieve the economies of scale that consolidators offer by centralizing data acquisition and delivery, while also being able to access broader datasets that offer them the ability to investigate and address new business opportunities. "It is increasingly hard for firms to develop and sustain a competitive advantage with speed alone.... Instead, firms differentiate their strategies in other ways, with diverse, high-quality data and analytics," says Brian Cassin, managing director at S&P Capital IQ. "The focus is more on putting together a complex strategy intermingling more data to make better decisions. Consolidated feeds make data consumption easier, offering high performance and bringing diverse content together into one delivery mechanism."
In addition, Alex Tabb, partner at Tabb Group, says firms are looking to eliminate complexity, which translates directly to costs. This means not only reducing the number of standalone, specialist data architectures (for low-latency data or otherwise), but also streamlining the number of relationships that a firm must maintain in order to obtain the data it needs. In this instance, a single consolidator can eliminate the need to work directly with multiple vendors, along with the costs inherent in maintaining those relationships.
In an era of Big Data, chasing every new data input is not an efficient use of firms' time. Firms make money from analyzing that data to create unique trading strategies; not from acquiring data. So, one might argue, leave the trading to the traders, and leave the consolidating to the consolidators.
Copyright Infopro Digital Limited. All rights reserved.