Data Utilities special report

Click here to download the PDF
Learning to Share
A few years back, when I visited a bank to spend some time with its data analysts, I remember being excited by a clever web link from the address field in the counterparty data management system to the website of Royal Mail, the UK postal service. When analysts clicked the link, the address was automatically converted to the standard UK address format and verified. This let the analysts tap into data already aggregated and normalized by Royal Mail. The reason I was so excited was that this was my first experience of a utility approach: postal organizations can be seen as acting as utilities for address information.
When it comes to finding addresses, it is standard practice to rely on postal services to hold the correct information. Few would find it necessary to source this information elsewhere when the data is maintained by one reliable organization serving a large number of customers. This is exactly what is needed in the reference data market more broadly. The data quality problem is increasingly viewed as too big for any one organization to fix on its own, and a recent WatersTechnology survey found that more firms would now prefer to leverage a shared service than to do everything internally.
Firms are increasingly realizing there is no need for each of them to duplicate the same work. It is time to take the next step, and that step means replicating concepts already established in other industries, such as postal services. Nor is this only about address information: various types of securities reference data and counterparty data can be taken from a utility, sharing costs across many organizations and improving quality.
Right now, it seems obvious that Royal Mail is the only organization that needs to maintain address data, and that firms can simply link into that database to keep their own information correct. At some point in the future, it will probably seem just as obvious that certain types of reference data should be processed by a single provider rather than aggregated, normalized and enriched by every financial organization separately.
In this special report, Inside Reference Data has gathered the latest research on the topic and advice from industry experts on moving to a utility model.
Copyright Infopro Digital Limited. All rights reserved.