Greater competition among benchmark data providers, along with a willingness to look at other industries' data management practices, could spur innovation in financial reference data management
Benchmark and index data services are becoming much more competitive. Nowhere is this more evident than in the reaction to Vanguard Group dropping MSCI as its index provider over high costs.
Research by consultancy Cutter Associates, surveying its 200 member firms, found that before Vanguard's move, announced on October 2, firms would grumble about the cost of benchmark data but would grudgingly keep paying whatever was asked. Vanguard's action could give the industry the backbone to demand more competitive pricing from benchmark data providers.
In the providers' defense, however, the complexity of benchmark and index data has vastly increased, and will continue to do so, as reported in our account of an exclusive WatersTechnology industry discussion. Data users are demanding rationalization of benchmark data: checking what the benchmarks themselves are based on, and verifying that benchmarks operate within the correct risk limits. The proliferating number of securities a benchmark can contain (as many as 13,000, as reported in our account) means processing benchmark data is becoming more complex, as is correctly populating index data in users' data management systems. To top it off, users want all this more sophisticated data daily, or even intraday.
In effect, firms, especially the buy-side and fixed-income management firms that took part in the WatersTechnology discussion, have thrown down the gauntlet to data providers. Benchmark and index data providers will have to figure out how to deliver much higher data quality, much faster, at the same or lower cost.
Competition in the benchmark data field has just grown considerably stiffer. It will become harder for MSCI to rest on its laurels and simply call itself "the gold standard" if its users follow Vanguard's lead and defect to competitors that provide data of equal or greater accuracy and quality at lower cost.
With that in mind, it never hurts for the financial services industry to look to other industries for practices that could hold useful lessons. The ways social media data can be mined to answer questions across a range of fields could be applied to reference data management, as related at the European Financial Information Summit earlier this fall. This month, we take a direct look at how the retail, telecommunications and energy industries manage data, and how their approaches can serve as models for standardizing data and avoiding separate data silos.
The common theme linking benchmark data's growth and the search for data management ideas in other industries is that the challenges of greater complexity and competition ought to spur more reference data innovation. I invite your thoughts at [email protected] or via our LinkedIn discussion group.