Maximizing Metadata

Unlike the "telephony metadata" at the center of the recent US National Security Agency (NSA) surveillance controversy, the use of financial operations metadata should not draw global criticism.
Metadata commonly means a set of attributes describing each piece of data. In phone records, as was emphasized in the coverage of the NSA story, this means items such as the length of calls, time of day and frequency of calls between the same parties. But for financial operations data, as discussed by attendees of last week's Sifma Tech Expo, this can mean data about the parties to a transaction, where the transaction's price is the starting, original data element, or other descriptive data about those transactions.
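To illustrate that distinction, a financial transaction can be sketched as a core price record surrounded by descriptive metadata attributes. This is a minimal, hypothetical model; the field names below are assumptions for illustration and do not come from any vendor's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Transaction:
    # The starting, original data element: the transaction price
    price: float
    # Descriptive metadata about the transaction (hypothetical fields)
    metadata: dict = field(default_factory=dict)

trade = Transaction(
    price=101.25,
    metadata={
        "buyer": "Firm A",       # party to the transaction
        "seller": "Firm B",      # counterparty
        "instrument": "XYZ Corp 5Y bond",
        "timestamp": "2013-06-25T14:30:00Z",
    },
)
```

The point of the separation is that the metadata can grow, be enriched, or be centralized independently of the underlying price data.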
For instance, metadata can mean attributes created by an outside service provider to enrich and perform calculations on financial transaction data, as Eagle Investment Systems would define it, according to Jeremy Skaling, head of product management at the data technology and services provider. The company also sees metadata as a commodity that can be collected at a central point or utility, such as the Metadata Center service within its data management product.
Metadata may also be thought of as a categorization of firms' customer data, made available for linking to transaction data and other data types, as Bob Molloy, associate partner, strategy and transformation, IBM Global Business Services, said during the Sifma conference.
"For almost all our clients, when they put in compliance systems, they do it for that one system—with point-to-point linkage," he says. "All of a sudden, that won't work anymore. You must have flexible infrastructure. Being able to tie in metadata is becoming more important because you have to be able to link these records together effectively to be able to find all of them."
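The linkage Molloy describes can be sketched in code. The example below is a hypothetical illustration, not an actual compliance system: instead of a hard-wired point-to-point join between two specific systems, records from separate systems are linked through a shared metadata attribute (here, an assumed customer identifier):

```python
# Hypothetical records from two separate systems, each carrying the
# same metadata attribute ("customer_id") rather than a direct link.
compliance_records = [
    {"record_id": 1, "customer_id": "C-100", "alert": "threshold breach"},
]
transactions = [
    {"txn_id": "T-9", "customer_id": "C-100", "amount": 5000.0},
    {"txn_id": "T-10", "customer_id": "C-200", "amount": 1200.0},
]

def link_by_metadata(records, txns, key="customer_id"):
    """Find all transactions whose metadata key matches any flagged record."""
    flagged_ids = {r[key] for r in records}
    return [t for t in txns if t[key] in flagged_ids]

linked = link_by_metadata(compliance_records, transactions)
```

Because the join is driven by a metadata key rather than by system-specific plumbing, new data sources can be tied in without rebuilding each point-to-point connection.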
Capturing metadata has also become an important part of using the Data Management Maturity (DMM) model now taking hold at firms in the industry. Bank of America chief data officer John Bottega included the capture of metadata as a key element when constructing a new data governance program based on the DMM model last year.
The DMM model, released last year after three years in development, defines the parts, processes and capabilities necessary for effective data management, and provides criteria for evaluating data management goals. Organizations are deriving value from the model itself, but must think about metadata on top of the DMM model to fully achieve what its developers are aiming for: better data management that avoids the risks that damaged the industry in 2008.