Optimal Integration: Adding Reference Data Utilities to Your Data Management Workflow
Adam Cottingham outlines the key steps to integrating a reference data utility to optimize your data management strategy

Financial firms are under pressure to balance operational costs against tightening business yields as the market environment grows more conservative. Increasingly, data management is seen as an area ripe for innovation. By improving current reference data systems and processes through mutualized data processing, firms can reduce inefficiencies, cut costs and optimize operations. But how should firms approach using a reference data utility, and how can such a service best be integrated into current systems?
Creating an enterprise-wide data governance strategy is a critical business activity and a solid first step towards gaining access to quality reference data. Many firms need to reduce their reliance on costly proprietary infrastructure that does not address the data problems currently plaguing the industry, such as the growing need for accurate and timely data to support business and trading execution, regulatory compliance and straight-through processing across the trade lifecycle. Today, firms are seeking an industry-wide version of the truth, and reference data utilities can provide this much-needed access.
Generate ‘Mutualization’
By offering a more cost-effective, less resource-intensive way to achieve this, reference data utilities are fast becoming crucial to streamlining internal data management systems and processes. Utilities provide access to data processing that has been pushed upstream to apply proactive validation and enrichment routines based on input from the industry. The utility uses client feedback, usage patterns and research to achieve agreed levels of quality, timeliness and integrity before packaging and distributing data for consumption. Since firms currently collect and manage these same datasets individually, a reference data utility eliminates this duplication of effort by providing all users with access to an industry version of the truth before the data impacts downstream processing. This "mutualization" not only cuts costs, but also improves data quality and timeliness for all users. Importantly, for those who wish to maintain control, the data and access can be tailored to meet firms' specific operational and infrastructural needs, incorporating any necessary regulatory obligations.
A utility enables consumers to cooperate and set standards for data validation checks related to vendor variables, instrument specifications and asset class-specific standards. Enrichment routines are commonly established to supplement data related to product-specific and issue-level attributes, along with rule-based derivations for dynamic variables. This allows firms to prevent poor-quality data from polluting the transaction cycle. The result is a positive network effect: as more firms contribute, the shared dataset becomes more accurate, giving all participants access to high-quality, usable data that represents an industry truth.
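To make these mechanics concrete, the sketch below shows what a simple validation-and-enrichment pass might look like in Python. It is purely illustrative: the record schema, the rules and the thresholds are hypothetical simplifications, not the specification of any actual utility.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Illustrative only: a toy version of the upstream validation and
# enrichment routines described above. The schema and rules are
# hypothetical, not any vendor's or utility's actual specification.

@dataclass
class SecurityRecord:
    isin: str
    asset_class: str                        # e.g. "bond", "equity"
    coupon: Optional[float] = None          # required for bonds
    maturity: Optional[str] = None          # ISO date, required for bonds
    country_of_issue: Optional[str] = None  # derived during enrichment
    errors: List[str] = field(default_factory=list)

def validate(rec: SecurityRecord) -> SecurityRecord:
    """Vendor-agnostic and asset class-specific validation checks."""
    if len(rec.isin) != 12 or not rec.isin[:2].isalpha():
        rec.errors.append("malformed ISIN")
    if rec.asset_class == "bond":
        if rec.coupon is None:
            rec.errors.append("bond missing coupon")
        if rec.maturity is None:
            rec.errors.append("bond missing maturity")
    return rec

def enrich(rec: SecurityRecord) -> SecurityRecord:
    """Rule-based derivation of supplementary, issue-level attributes."""
    if rec.country_of_issue is None and not rec.errors:
        # The first two characters of an ISIN are the ISO country code.
        rec.country_of_issue = rec.isin[:2]
    return rec

def process(batch: List[SecurityRecord]) -> Tuple[list, list]:
    """Quarantine bad records so they never pollute downstream systems."""
    cleaned = [enrich(validate(r)) for r in batch]
    good = [r for r in cleaned if not r.errors]
    quarantined = [r for r in cleaned if r.errors]
    return good, quarantined

good, bad = process([
    SecurityRecord("US0378331005", "equity"),
    SecurityRecord("XS123", "bond"),  # malformed and missing attributes
])
print(f"{len(good)} clean, {len(bad)} quarantined: {bad[0].errors}")
```

In a real utility, these checks would be agreed collectively with data consumers and applied before distribution, with quarantined records fed back into the remediation loop rather than passed downstream.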
Create Business Buy-in
The main, and often toughest, hurdle to integrating reference data utility services into firms' current data management systems is typically gaining buy-in from management. This involves agreeing on a strategy, deciding on the best method of execution, and finding an efficient mechanism for disseminating and integrating changes to the data environment. The data consumer is the key stakeholder here, and primarily needs a timely, accurate and complete view of the market. Data management professionals must communicate the benefits of optimizing current reference data infrastructure with a utility that can meet data consumers' needs.
Enable Client Control
While many firms want to address reference data issues and inconsistencies in the market and reduce internal legacy system costs, losing control remains a major concern. Firms have specific needs and require different levels of service, which utilities can take into account. A firm might want access to a pipeline of data, or it could be looking to outsource its systems entirely. Alternatively, it might prefer to adopt the utility's standards and integrate them into current systems. Tailored packages, governed by custom service-level agreements that evolve with market changes and regulations, allow clients to maintain control of their data management strategy. These agreements need to be results-based to measure the success of the initiative, rather than simply stating the inputs, as typically happens with traditional managed services.
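To illustrate the distinction, the hypothetical terms below contrast outcome-oriented SLA metrics with the input-oriented terms common in traditional managed services. All figures are invented for illustration.

```python
# Hypothetical illustration: a results-based SLA specifies measurable
# outcomes, while input-based terms merely describe resources supplied.
results_based_sla = {
    "completeness": "99.5% of mandatory attributes populated per record",
    "timeliness": "new issues available within 1 hour of announcement",
    "accuracy": "fewer than 5 confirmed errors per 100,000 records",
}
input_based_terms = {  # what traditional managed services tend to state
    "staffing": "12 data analysts assigned",
    "coverage_hours": "follow-the-sun support",
}
```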
For reference data, the ability to access an agreed, industry-wide version of the truth is an attractive proposition. Traditional sourcing methods are incomplete and out of sync with how the market behaves within the transactional process. By moving data sourcing and management upstream to a utility, financial firms can reduce the costs and mitigate the risks of running proprietary processes on internal legacy systems. Reference data utilities provide a cost-effective and robust solution that also allows firms to remain in control and to respond flexibly to changing market conditions.