Quest For Data Quality

Stories in this issue of Inside Reference Data cover the different means used to pursue higher data quality, including partnerships between service providers and clients, enterprise-strength tools that bridge gaps in data processes and uses, and improved cooperation between IT and business professionals.
Interactive Data CEO Stephen Daffron tells us that being transparent with clients about the company's quality controls and data sourcing promotes better understanding of the issues when pricing or other data is incorrect.
Daffron also identifies big data, with its increasing size, granularity and lack of structure, as an opposing force to "cost direction," which is driving costs down by moving reference data sourcing, cleansing and delivery from central systems to cloud computing resources.
Big data, of course, is a broad term, as noted recently in the online-only column "Big Data Terminology." It can refer to issues such as integrating or centralizing data, or to technology resources and the scalability of data systems. These facets of big data are really about the pursuit of higher-quality data; they are the means used to improve quality, consistency and value.
Enterprise-strength data management tools are available to handle "big data" and to connect data governance with other data functions, including quality measurement, access privileging, control and usage, notes CIBC's Peter McGuinness in "Choosing Tools and Setting Models." Just as Daffron points to cost concerns, so do McGuinness and RBC's Patricia Huff in this story. Firms may choose to go in-house if they cannot make a business case for buying outside providers' tools, or if they can more readily build appropriate systems on their own. Anything firms consider buying has to be justified in terms that business executives can understand, McGuinness says.
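By way of illustration, a minimal sketch of the kind of quality measurement such tools perform follows, expressed as simple completeness and validity checks whose results read as percentages a business audience can grasp at a glance. The field names, rules and sample records are hypothetical, not drawn from the story.

# Hypothetical sketch: scoring a small reference data set against
# simple completeness and validity rules and reporting pass rates.
records = [
    {"isin": "US0378331005", "price": 170.12, "currency": "USD"},
    {"isin": "US0378331005", "price": None,   "currency": "USD"},
    {"isin": "",             "price": 23.40,  "currency": "usd"},
]

rules = {
    "isin_present": lambda r: bool(r["isin"]),
    "price_present": lambda r: r["price"] is not None,
    "currency_is_iso": lambda r: r["currency"].isupper() and len(r["currency"]) == 3,
}

for name, rule in rules.items():
    passed = sum(1 for r in records if rule(r))
    print(f"{name}: {passed / len(records):.0%} of records pass")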
A question remains: if neither data managers nor business operations managers are likely to walk away satisfied from a negotiation on how to proceed with a data quality effort, as Huff suggests, is meaningful improvement in data quality really possible?
Correcting discrepancies, which Interactive Data approaches with transparency, is also a challenge in the corporate actions space, as Nicholas Hamilton reports. SmartStream's Adam Cottingham says discrepancies can arise when values vary between sources or additional values turn up in custodians' files. The correct data then has to be retrieved from the issuer itself, which can be hard to coordinate given how many parties are involved in a corporate action and how much data it generates.
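A minimal sketch of that reconciliation step might look like the following: one corporate action field, here a dividend rate, is compared across sources and flagged for confirmation with the issuer when the sources disagree. The source names and values are purely illustrative, not taken from SmartStream or the article.

# Hypothetical sketch: flagging a corporate action for issuer
# confirmation when reported values diverge across sources.
from collections import Counter

reported_rates = {
    "vendor_feed": "0.25",
    "custodian_a": "0.25",
    "custodian_b": "0.255",  # an additional value turning up in a custodian file
}

distinct = Counter(reported_rates.values())

if len(distinct) == 1:
    print(f"All sources agree on a dividend rate of {next(iter(distinct))}")
else:
    print("Discrepancy detected; confirm with the issuer:")
    for source, rate in reported_rates.items():
        print(f"  {source}: {rate}")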
And getting the right data together plays a part in complying with regulation, as RBS's David Sharratt reminds us in "Facing Up to the New Regulatory World." The variety of products, asset classes and systems used in multiple markets by a global firm such as RBS means internal, external and outsourced systems and processes all must be managed and marshalled in service to data, Sharratt says. With multiple players all having a stake, even when a choice is made to stick with internal systems, as RBC's Huff related, it becomes evident that pursuing higher-quality data can be a tall order, one that requires widespread support and participation, both among functional units within a firm and in partnership with service providers.
A closing note: With this issue, deputy editor Nicholas Hamilton has completed his work with Inside Reference Data. We will miss him and we wish him well on his next endeavor.