Data Utilities special report
Learning to Share
When I visited a bank a few years back to spend time with its data analysts, I remember being excited to see a clever web link from the address field in the counterparty data management system to the website of Royal Mail, the UK postal service. When analysts clicked the link, the address was automatically converted to the standard UK format and verified. This feature let the analysts tap into data already aggregated and normalized by Royal Mail. The reason I was so excited was that this was my first experience of a utility approach: postal organizations can be seen as acting as utilities for address information.
When it comes to finding addresses, it is standard practice to rely on the postal service to hold the correct information. Few would find it necessary to source this data elsewhere when it is maintained by one reliable organization serving a large number of customers. This is exactly what is needed in the reference data market more broadly. Data quality is increasingly viewed as too big a problem for any one organization to fix on its own, and a recent WatersTechnology survey found that more firms would now prefer to leverage a shared service than to do everything internally.
Firms are increasingly realizing there is no need for each of them to duplicate the same work. It is time to take the next step, and that step means replicating concepts already proven in other industries, such as postal services. Nor is it only about address information: various types of securities reference data and counterparty data can be taken from a utility, sharing costs across many organizations and improving quality.
Today it seems obvious that Royal Mail is the only organization that needs to maintain address records, and that firms can link into its database to keep their own information correct. At some point in the future, it will probably seem just as obvious that certain types of reference data should be processed once by a single provider, rather than aggregated, normalized and enriched by every financial organization.
In this special report, Inside Reference Data has gathered the latest research on the topic and advice from industry experts on moving to a utility model.
If you would like us to create a custom special report for your firm on a key topic area to suit your business needs, please contact Jo Webb on +44 (0)20 7316 9474 or jo.webb@incisivemedia.com
Copyright Infopro Digital Limited. All rights reserved.