No Absolutes About Value
Taking an overarching view of data governance plans means deciding which approach offers the most value
I was reminded recently what a rarity it has become for financial services firms to design and build their own data management systems from scratch. So it is striking to see an experienced industry leader, Brian Buzzelli, in his new role with Acadian Asset Management, speak of keeping that approach in place at his firm. Service providers keep growing, though, as is evident in our news story on page 5 about client data management software provider Fenergo, which is reaping new backing because of its growth over the past three years.
Another item firms are working on, however, is building governance plans, as recounted in “Calling All Sponsors.” These, by necessity, cannot be farmed out so easily. Firms must understand all the components being put into place and devise a strategy for managing them throughout their operations, says Cal Rosen, vice president of enterprise data governance and data quality at TD Bank. His colleague, Paul Childerhose, a data governance director at Scotiabank, says data governance planning shouldn’t be done only by choosing technology providers, but should begin with “planning the journey.”
One area in which firms’ aptitude appears to be maturing is handling evaluated prices, as recounted in “Assessing Evaluations.” Pricing providers, such as Thomson Reuters, are becoming aware of their users’ demands for consistency regardless of asset class, country or region. As Daniel Johnson, head of valuation at Wells Fargo Global Fund Services in London, observes in this story, user demand will drive providers to cover more asset classes, and to derive prices by comparing multiple suppliers’ information—not regulatory requirements about sourcing this data.
Of all the regulations and standards Inside Reference Data encounters, the one that is surprisingly new on our radar is AnaCredit, an initiative by the European Central Bank (ECB) to create an analytical credit risk dataset that can support the bank’s research, in its capacity as the body setting monetary policy for the eurozone as a whole. With the stages for implementation of AnaCredit currently set for 2017, 2019 and 2020, it’s not yet reasonable to expect a lot of data to have flowed in from firms for this credit risk barometer. With the Corep and Finrep reporting guidelines already established, however, firms should not have to start from scratch for their AnaCredit submissions.
Turning to something we do hear about quite often, this month’s “Industry Warehouse” columnist, Robert Iati of Dun & Bradstreet, talks about “big data” in the context of how banks start data projects, warning that insufficient attention is being paid to reference data. The increased volume of, and emphasis placed on, pricing information could be making it more important than reference data, Iati observes. Big data about customers, counterparties and products could be waning in value when compared with the generation of profits through trading.
The underlying concern in all the data management efforts and decisions reported in all of these stories is the value of data—whether it’s the value to be derived by handling data better (when planning governance), just how valuable the data is (because of the checks and balances of multiple sources, or what it means to a central bank), or the value that a service provider can offer to a user.
Copyright Infopro Digital Limited. All rights reserved.