Centralization Moves to the Foreground
Without collection and coordination of data, high quality is difficult to achieve
In April, data quality emerged as the theme of greatest concern in our coverage. This month, data centralization is in the spotlight, though coordination of data might be a more accurate description. Certainly, centralizing and coordinating data ought to go hand in hand with raising data quality if either effort is to be meaningful or successful.
We start with news of the European Securities and Markets Authority's (ESMA) plans to centralize instrument and trading data to, of course, improve data quality, and the Depository Trust & Clearing Corporation's (DTCC) plans to align swaps data to better support analysis of that data. Then, in our interview with Fidelity ActionsXchange executive Will Dolan, he describes how the corporate actions service brings together inputs from multiple data providers. Centralization runs through all of these industry developments.
In Europe, ESMA's Trade Repositories Project, planned for the second half of 2016, promises a single access point to data in those repositories, and its Instrument Reference Data project, expected to be complete in 2017, ought to yield a central facility for both instrument and trading data. Such a central repository will also make regulatory compliance easier, observes Sapient's Cian O'Braonain. "Creating a centralized repository gives each of the different regulatory jurisdictions a similar toolset, a similar capability to be able to analyze and aggregate data," he says.
For both Europe and the US, DTCC's Marisol Collazo calls for a global standard for trade identification that will "resonate" for all jurisdictions. Such a standard has to include a definition of data quality across firms, repositories and regulators, and be supported by data aggregation efforts. DTCC's alignment of swaps data is meant to produce better analysis to feed into that data aggregation.
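To make the aggregation point concrete, here is a minimal, hypothetical sketch in Python. The repository names, field names and figures are invented for illustration and do not reflect DTCC's or ESMA's actual schemas; it simply shows why a shared trade identifier matters, letting an aggregator deduplicate trades reported to more than one repository before totaling exposures.

    # Illustrative sketch only: hypothetical repositories and fields, not any
    # regulator's or trade repository's real schema.
    from collections import defaultdict

    # Two hypothetical repositories holding swaps records keyed by a shared
    # unique trade identifier ("uti"); UTI-0001 is reported to both.
    repo_a = [
        {"uti": "UTI-0001", "asset_class": "IR_SWAP", "notional_usd": 100_000_000},
        {"uti": "UTI-0002", "asset_class": "CDS", "notional_usd": 25_000_000},
    ]
    repo_b = [
        {"uti": "UTI-0001", "asset_class": "IR_SWAP", "notional_usd": 100_000_000},
        {"uti": "UTI-0003", "asset_class": "FX_SWAP", "notional_usd": 40_000_000},
    ]

    def aggregate(*repositories):
        """Deduplicate records by shared identifier, then total notional per asset class."""
        seen = {}
        for repo in repositories:
            for record in repo:
                # The common identifier is what makes cross-repository deduplication possible.
                seen.setdefault(record["uti"], record)
        totals = defaultdict(int)
        for record in seen.values():
            totals[record["asset_class"]] += record["notional_usd"]
        return dict(totals)

    print(aggregate(repo_a, repo_b))
    # {'IR_SWAP': 100000000, 'CDS': 25000000, 'FX_SWAP': 40000000}

Without a common identifier, the duplicated interest-rate swap would be double-counted, which is the kind of distortion a shared identification standard and agreed quality definition are meant to prevent.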
Centralization even surfaces in the data quality conversations reported in "Steering the Drivers of Data Quality," in which Goldman Sachs' Gururaj Krishnan emphasizes that having "exactly one source" for the data to be managed is central to the firm's data strategy. Alta Strategic consultant Dennis Gonzalez adds that coordinating data sources and data operations units is essential to delivering data quality.
Collazo takes up another issue, the integration of Canadian trade reporting with its equivalents in other global markets, in "Imperfect Harmony": a global, coordinated, centralized and aggregated data set would also be the foundation for accomplishing that goal. To harmonize regulations for OTC derivatives worldwide, barriers put in place precisely to keep data sets separate have to be removed. Two types of data sets, national or regional sets for market surveillance and smaller global subsets for systemic risk oversight, are kept apart under data protection laws. So harmonization of the data sets has become a factor in the cross-border functioning of the markets as well.
With so many areas of data management touched by the need to centralize data, this effort ought to take on as much importance and urgency as the quest for higher data quality.