Data Utilities special report
Learning to Share
A few years back, when I visited a bank to spend some time with its data analysts, I was struck by a clever web link from the address field in the counterparty data management system to the website of Royal Mail, the UK postal organization. When analysts clicked the link, the address was automatically converted to the UK standard format and verified. The feature let the analysts tap into data already aggregated and normalized by Royal Mail. The reason I got so excited was that this was my first experience of a utility approach: postal organizations can be seen as acting as utilities for address information.
When it comes to addresses, it is standard practice to rely on the postal service to hold the correct information. Few would find it necessary to source that information elsewhere when it is maintained by one reliable organization serving a large number of customers. This is exactly what is needed in the reference data market more broadly. The data quality problem is increasingly viewed as too big for any one organization to fix on its own, and a recent WatersTechnology survey found that more firms would now prefer to leverage a shared service than to do everything internally.
Firms are increasingly realizing there is no need for each of them to duplicate the same work. It is time to take the next step, and that means replicating concepts already proven in other industries, such as postal services. Nor is this only about address information: various types of securities reference data and counterparty data can be taken from a utility, sharing costs across many organizations and improving quality.
Today, it seems obvious that Royal Mail is the only organization that needs to maintain UK addresses, and that firms can link into its database to keep their own information correct. At some point in the future, it will probably seem just as obvious that certain types of reference data should be processed by a single provider, rather than aggregated, normalized and enriched by every financial organization separately.
In this special report, Inside Reference Data has gathered the latest research on the topic and advice from industry experts on moving to a utility model.
Click here to download the PDF
If you would like us to create a custom special report for your firm on a key topic area to suit your business needs, please contact Jo Webb on +44 (0)20 7316 9474 or jo.webb@incisivemedia.com
Copyright Infopro Digital Limited. All rights reserved.