Optimal Integration: Adding Reference Data Utilities to Your Data Management Workflow
Adam Cottingham outlines the key steps to integrating a reference data utility to optimize your data management strategy
Financial firms are under pressure to balance operational costs against tightening business yields as the market environment becomes more conservative. Increasingly, data management is being seen as an area suitable for innovation. By improving current reference data systems and processes through the use of mutualized data processing, it is possible to reduce inefficiencies, cut costs and optimize operations. But how should firms approach using a reference data utility, and how can such a service best be integrated into current systems?
Creating an enterprise-wide data governance strategy is a critical business activity and a solid first step towards gaining access to quality reference data. Many firms need to find ways to reduce their reliance on costly proprietary infrastructure that does not address the data problems currently plaguing the industry, such as the growing need for accurate and timely data to support business and trading execution, regulatory compliance and straight-through processing across the trade lifecycle. Today, firms are seeking an industry-wide version of the truth, and reference data utilities can provide this much-needed access.
Generate ‘Mutualization’
By offering a more cost-effective, less resource-intensive way to achieve this, reference data utilities are fast becoming crucial to streamlining internal data management systems and processes. Utilities provide access to data processing that has been pushed upstream to apply proactive validation and enrichment routines based on input from the industry. The utility uses client feedback, usage patterns and research to achieve agreed levels of quality, timeliness and integrity before packaging and distributing data for consumption. Since firms currently collect and manage these same datasets individually, a reference data utility eliminates this duplication of effort by providing all users with access to an industry version of the truth before the data impacts downstream processing. This “mutualization” not only cuts costs, but also improves data quality and timeliness for all users. Importantly, for those who wish to maintain control, the data and access can be tailored to meet firms’ specific operational and infrastructural needs, incorporating any necessary regulatory obligations.
A utility enables consumers to cooperate and set standards for data validation checks related to vendor variables, instrument specifications and asset class-specific standards. Enrichment routines are commonly established to supplement data related to product-specific and issue-level attributes, along with rule-based derivations for dynamic variables. This allows firms to prevent poor-quality data from polluting the transaction cycle. The utility harnesses a positive network effect around accurate data, giving firms access to high-quality, usable data that represents an industry truth.
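The article does not prescribe how such checks are built, but a minimal sketch of what utility-style validation and rule-based enrichment might look like is shown below. The record fields, rules and thresholds (such as the `isin` field, `maturity_years` and the short-dated flag) are hypothetical examples rather than attributes of any particular utility or vendor feed.

```python
# A minimal, hypothetical sketch of utility-style validation and rule-based
# enrichment applied to an instrument reference data record before it is
# packaged for downstream consumption. Field names, rules and thresholds
# are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

REQUIRED_FIELDS = ["isin", "issuer", "asset_class", "currency"]


@dataclass
class ValidationResult:
    passed: bool
    errors: List[str] = field(default_factory=list)


def validate(record: dict) -> ValidationResult:
    """Apply agreed validation checks, including asset class-specific ones."""
    errors = [f"missing field: {name}" for name in REQUIRED_FIELDS if not record.get(name)]
    if record.get("asset_class") == "bond" and record.get("coupon") is None:
        errors.append("bond record missing coupon")
    return ValidationResult(passed=not errors, errors=errors)


def enrich(record: dict) -> dict:
    """Supplement issue-level attributes and derive dynamic variables by rule."""
    enriched = dict(record)
    if enriched.get("maturity_years") is not None:
        # Example rule-based derivation of a dynamic variable.
        enriched["is_short_dated"] = enriched["maturity_years"] <= 1
    return enriched


def process(record: dict) -> Optional[dict]:
    """Validate first, so poor-quality data never enters the transaction cycle."""
    result = validate(record)
    if not result.passed:
        # In a utility model, failures would be fed back to vendors and clients
        # for remediation rather than silently dropped.
        return None
    return enrich(record)
```

The key design point the sketch illustrates is ordering: validation gates the record before enrichment, so only data meeting the agreed standards is enriched and distributed downstream.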
Create Business Buy-in
The main, and often toughest, hurdle to integrating reference data utility services into firms’ current data management systems is typically gaining buy-in from management. This involves deciding on the strategy to pursue, the best method of execution, and an efficient mechanism for disseminating and integrating changes to the data environment. The data consumer is the key stakeholder here, and primarily needs a timely, accurate and complete view of the market. Data management professionals must communicate the benefits of optimizing current reference data infrastructure with a utility that can meet data consumers’ needs.
Enable Client Control
While many firms want to address reference data issues and inconsistencies in the market and reduce internal legacy system costs, losing control remains a major concern. Firms have specific needs and require different levels of service, which utilities can take into account. A firm might want access to a pipeline of data, or it could be searching for a way to completely outsource its systems. Alternatively, it might prefer to apply the utility’s standards and integrate them into current systems. This allows clients to maintain control of their data management strategy by adopting tailored packages that evolve with market changes and regulations through the use of custom service-level agreements. These need to be results-based to measure the success of the initiative, rather than simply stating the inputs, as typically happens with traditional managed services.
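To make the distinction concrete, the fragment below is a hypothetical illustration of a results-based service-level agreement versus an input-based one. The metric names and thresholds are invented for the example and are not taken from any actual utility contract.

```python
# Hypothetical illustration of a results-based SLA (measurable outcomes)
# versus an input-based SLA (resources supplied). Metric names and
# thresholds are invented for the example.
results_based_sla = {
    "completeness_pct_min": 99.5,       # share of required attributes populated
    "accuracy_pct_min": 99.9,           # records matching the agreed industry version of the truth
    "delivery_deadline_utc": "06:00",   # data available before the trading day starts
    "exception_resolution_hours": 4,    # time to remediate a reported data break
}

input_based_sla = {
    "analyst_headcount": 12,            # commits resources, but says nothing about outcomes
    "vendor_feeds_consumed": 5,
    "support_hours": "08:00-18:00",
}
```

The results-based version can be measured against the initiative's goals each reporting period, whereas the input-based version only records what was supplied.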
For reference data, the ability to access an agreed, industry-wide version of the truth is an attractive proposition. Traditional sourcing methods are incomplete and out of sync with how the market behaves within the transactional process. By moving data sourcing and management upstream to a utility, financial firms can reduce the costs and mitigate the risks arising from running proprietary processes on internal legacy systems. Reference data utilities provide a cost-effective and robust solution that also allows firms to remain in control and to be flexible in response to changing market conditions.