Data Quality As Ideal and Driver

Raising data quality is both the purpose of changes to data governance, linkage and discovery, and an incentive to improve those functions


Although it may not seem like the main focus of this month's features, data quality looms as the proverbial "elephant in the room" throughout our coverage of data governance, data linkages and price discovery in benchmark data management.

Data governance plans should be formed to direct resources and efforts toward improving data quality, as Brian Sobolak of Northern Trust relates in "Turning To Governance For Direction." In effect, data governance plans should be coordinated with data quality goals. Data governance should point data consumers "where to find data in the best, most trustable and highest-quality format," adds Roberto Maranca of GE Capital.

Data provenance and data sourcing, as described in this story, are simply other names for a trait that affects data quality. CUSIP Global Services' Scott Preiss says: "Knowing with certainty where data is originated, always being able to have an audit trail and link to the primary source documentation... is a key component of data governance." Thomson Reuters' John Eliseo notes that a single data system may still collect data from numerous sources, which the company kept in mind when it removed barriers between content systems as part of a remodeling of its data sets seven years ago.

Just as data provenance is an important part of data governance, increasing transparency can affect price discovery for benchmark data, as seen in Joanna Wright's feature "Benchmark Upheaval." MarketAxess' Jim Rucker, speaking about concerns with Europe's MiFID II/MiFIR regulation, says the issue is the rule introducing "a level of disclosure that would harm the price formation process." As a result, market indices (benchmarks) "are concerned with making sure that the calibration of transparency is appropriate."

The mandate to increase transparency in European fixed income may be "disruptive in the short term," as Charles River Development's Karl Kutschke says, but "will help the industry evolve." Transparency in US fixed income has "not been detrimental to the bond markets here," adds J.R. Rieger of S&P Dow Jones Indices.

Similarly, in "The Meaning Behind the Data," a story about how data is linked and evaluated through methods such as APIs and semantics, the end goal, however unspoken, is improving the quality of the data. These efforts and methods show that, overall, the industry is moving from "parochial and sectoral information to semantically rich and systematically consistent information," said State Street's David Blaszkowsky. In effect, the industry is getting higher-quality information.

As with data governance, data sourcing is also important to linking data, particularly in monitoring who published the data, how the data is identified and how it is produced, as Bloomberg's Matthew Rawlings says. The provenance of data is the first step to building trust in that data, he explains. If high data quality is indeed the inspiration for all of this (data governance plans, data linkages and transparent sourcing), then these features offer insight into just how organizations can reach that ideal.
