Seeking the Path to Data Quality

Inside Reference Data regularly covers ongoing developments in standards and regulation. This month, among other areas, we delve into data quality—the reason for and end goal of all the standards being developed and debated.
Nicholas Hamilton, who has joined Inside Reference Data as a reporter based in London, speaks with Llew Nagle, head of consumer service management for reference data at Deutsche Bank. Nagle tells us that with ISO 15022, ISO 20022 and XBRL competing for acceptance, the industry cannot see standards as a catch-all solution for data communication issues. More descriptive codes, including country codes and currency codes, are necessary, he says.
As Bill Meenaghan, a product manager at Omgeo, relates in this month's Industry Warehouse, data accuracy, in this case in settlement instructions, can improve a firm's ability to comply with market standards and best practices. That draws a direct line from data quality to regulatory compliance, and thus, possibly, to more reliable and trustworthy markets.
This may be easier said than done, of course. Data quality can also depend on compatibility, as Adam Honoré, research director of the institutional securities practice at Aite Group, pointed out in a recent conference call. "Everybody has their own keys. Some are using CUSIPs or ISINs. Some have proprietary keys such as RIC or BIC codes," he says. "I don't think there are incentives from the key data suppliers to have a consolidated solution. It's a tough road."
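The fragmentation Honoré describes is, at bottom, a cross-referencing problem: the same instrument carries a different key in each system, and consolidation means mapping among them. A toy sketch of such a cross-reference, in Python (the table and lookup function are invented for illustration; the identifiers shown are the commonly cited ISIN, CUSIP and RIC values for two well-known stocks):

```python
# Toy security-master cross-reference: one record per instrument,
# keyed under several identifier schemes at once.
XREF = [
    {"isin": "US0378331005", "cusip": "037833100", "ric": "AAPL.O"},
    {"isin": "US5949181045", "cusip": "594918104", "ric": "MSFT.O"},
]

def find_security(scheme: str, key: str):
    """Return the full record for a key under any scheme, or None."""
    return next((rec for rec in XREF if rec.get(scheme) == key), None)

# The same instrument resolves regardless of which key a system holds.
assert find_security("cusip", "037833100") == find_security("ric", "AAPL.O")
```

In practice, of course, the hard part is not the lookup but building and maintaining the mapping table itself, which is exactly where Honoré sees the incentives missing.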
While at Sibos, I heard from Meenaghan's colleague Tony Freeman, director of industry relations at Omgeo. As recounted in an online "Editor's View" following the conference, Freeman questioned why the financial industry has been unable to converge on a single set of standards, as other industries have. Those managing data are contending with how to identify it and where it all goes. With the various code types Honoré cites floating around, and the messaging standards Nagle describes competing for acceptance, it will be a minor miracle if anything gets done at all.
If universal compatibility is key to achieving quality data, accuracy may be only half the battle. Enterprise data management (EDM) is said to be a faster and more accurate means of managing data than the legacy strategies still in use at many firms, as described in the special report attached to this issue. The EDM Council, which pursues identifier standards, classification schemes and contractual definitions for financial securities, went to Basel last month to lobby the G20 to accept the US definition of the legal entity identifier (LEI). If markets worldwide can agree on this identification standard, can they follow the same path to data quality?
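Check digits are one concrete way an identifier standard builds quality in: the LEI design pairs an 18-character alphanumeric base with two ISO 7064 mod 97-10 check digits, so a receiving system can catch transcription errors without a registry lookup. A minimal validation sketch in Python (the sample base identifier is invented; only the check-digit arithmetic follows the standard):

```python
def _as_number(s: str) -> int:
    # Letters map to two-digit values (A=10 ... Z=35); digits map to themselves.
    return int("".join(str(int(ch, 36)) for ch in s))

def lei_check_digits(base18: str) -> str:
    """Compute the two ISO 7064 mod 97-10 check digits for an 18-char base."""
    return f"{98 - _as_number(base18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """A well-formed LEI is 20 alphanumerics that reduce to 1 mod 97."""
    return len(lei) == 20 and lei.isalnum() and _as_number(lei) % 97 == 1

# Build a syntactically valid LEI from a made-up base and verify it.
base = "529900ABCDEF12345X"  # hypothetical 18-character base
lei = base + lei_check_digits(base)
assert is_valid_lei(lei)
```

A transposed or mistyped character changes the residue mod 97, so the validation fails and the bad identifier is caught at the door rather than deep inside a downstream system.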
Copyright Infopro Digital Limited. All rights reserved.