Data Quality As Ideal and Driver
Raising data quality is both the purpose of changes in data governance, data linkage and price discovery, and an incentive to improve those functions
Although it may not seem like the main focus of this month's features, data quality is the proverbial "elephant in the room" throughout our coverage of data governance, data linkages and price discovery in benchmark data management.
The very formation of data governance plans should be done to direct resources and efforts to improving data quality, as Brian Sobolak of Northern Trust relates in "Turning To Governance For Direction." In effect, data governance plans should be coordinated with data quality goals. Data governance should point data consumers "where to find data in the best, most trustable and highest-quality format," adds Roberto Maranca of GE Capital.
Data provenance and data sourcing, as described in this story, are simply other descriptions of a trait that affects data quality. CUSIP Global Services' Scott Preiss says: "Knowing with certainty where data is originated, always being able to have an audit trail and link to the primary source documentation... is a key component of data governance." Thomson Reuters' John Eliseo advocates the idea that a singular data system may still collect data from numerous sources, which the company kept in mind when removing barriers between content systems as part of a remodeling of data sets seven years ago.
Just as data provenance is an important part of data governance, increasing transparency can affect price discovery for benchmark data, as seen in Joanna Wright's feature "Benchmark Upheaval." MarketAxess' Jim Rucker, speaking about concerns with Europe's MiFID II/MiFIR regulation, says the issue is the rule introducing "a level of disclosure that would harm the price formation process." As a result, market indices, or benchmarks, "are concerned with making sure that the calibration of transparency is appropriate."
The mandate to increase transparency in European fixed income may be "disruptive in the short term," as Charles River Development's Karl Kutschke says, but "will help the industry evolve." Transparency in US fixed income has "not been detrimental to the bond markets here," adds J.R. Rieger of S&P Dow Jones Indices.
Again, in "The Meaning Behind the Data," a story about how data is linked and evaluated through methods such as APIs and semantics, the end goal, however unspoken, is improving the quality of the data. These efforts and methods show that, overall, the industry is moving from "parochial and sectoral information to semantically rich and systematically consistent information," says State Street's David Blaszkowsky. In effect, the industry is getting higher-quality information.
As with data governance, data sourcing is also important to linking data, particularly with monitoring who published the data, how the data is identified and how it is produced, as Bloomberg's Matthew Rawlings says. The provenance of data is the first step to building trust in that data, he explains. If high data quality is actually the inspiration for all of this -- data governance plans, data linkages and transparent sourcing -- then these features provide insight on just how organizations can reach that ideal.
Copyright Infopro Digital Limited. All rights reserved.