Golden Copy: Were They Right About 2015?
Data management for financial services did see incremental development in 2015, particularly in data governance planning, setting the stage for 2016 expectations.

Last week's column tried to predict what the major data challenges and management activity would be in 2016, based on industry leaders' and experts' views collected at the end of last year, an exercise Inside Reference Data undertook for the first time at the end of 2014. It occurs to me, then, that we should also look back at 2015 and consider whether the experts' thoughts were borne out over the course of the year.
Many of the experts we canvassed the year before, from companies including ANZ Institutional Bank, BNP Paribas, Thomson Reuters, Rimes Technologies, Eagle Investment Systems, GoldenSource and HSBC, said BCBS 239, the Basel Committee's risk data aggregation principles, would receive, and require, a lot of attention. The guidelines certainly did: last year started out with evidence of BCBS 239 driving data infrastructure changes, but continued with overall readiness to comply still lagging. So BCBS 239 remains a challenge.
The bigger question at this time last year, as identified by these experts, was whether reference data management advances would be incremental, if they happened at all. The development deemed most likely was that data management technology would mature and the focus would center on integrating data sources and getting firms to establish data strategies or governance plans.
Last year, we found evidence that many firms were taking on data governance challenges. TIAA-CREF deployed an "acquisition and attrition" model. In that same story, Citi's Julia Bardmesser said data governance development helps support analytics and emphasized the importance of data standardization. Canadian firms, including TD Bank and Canadian Western Bank, found benefits from making data governance plans cross-functional.
As the year progressed, the industry also began linking data governance work to risk data management, treating governance as a way to get a handle on risk-relevant data in preparation for compliance with the risk data aggregation guidelines. By the end of 2015, we were also hearing the industry's thoughts about how multiple data sources, and the need to reconcile them, can affect data governance efforts.
This evolution in thinking about data management and data governance did not happen overnight, though it did not take multiple years either. Still, the centralization challenge that experts identified when looking ahead to 2016 suggests there is plenty more work to do in organizing data properly so it can be managed to good effect by these evolving and improving data governance plans.