Cutting Big Data Down to Size
Editor's View

Last week, I wrote about the choice firms may face between cloud resources and the Hadoop tool for managing and working with big data. A better question, or a better way to frame that debate, may be to ask how best to make use of cloud computing for data management, especially for "big data."
Tim Vogel, a veteran of data management projects for several major investment firms and service providers on Wall Street, has focused views on this subject. He advises that the cloud is best used for the most immediate real-time data and analytics. As an example, Vogel says the cloud would be an appropriate resource if one were concerned with the most recent five minutes of data used for volume-weighted average price (VWAP) calculations.
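For readers less familiar with the metric, VWAP is simply traded value divided by traded volume over a chosen window, so a five-minute VWAP only ever needs the most recent five minutes of ticks. A minimal sketch of that idea, using hypothetical tick fields and class names rather than any particular vendor's API, might look like this:

```python
from collections import deque
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Tick:
    timestamp: datetime
    price: float
    size: int


class RollingVWAP:
    """VWAP over a sliding time window, e.g. the most recent five minutes."""

    def __init__(self, window: timedelta = timedelta(minutes=5)):
        self.window = window
        self.ticks: deque = deque()
        self.notional = 0.0   # running sum of price * size
        self.volume = 0       # running sum of size

    def add(self, tick: Tick) -> None:
        self.ticks.append(tick)
        self.notional += tick.price * tick.size
        self.volume += tick.size
        # Drop ticks that have aged out of the window.
        cutoff = tick.timestamp - self.window
        while self.ticks and self.ticks[0].timestamp < cutoff:
            old = self.ticks.popleft()
            self.notional -= old.price * old.size
            self.volume -= old.size

    def value(self) -> Optional[float]:
        return self.notional / self.volume if self.volume else None
```

The point, in Vogel's framing, is that the computation the cloud needs to serve touches a small, constantly refreshed slice of data rather than the full trading history of every security.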
"The cloud isn't cheap," he says. "Its best use is not for data on a security that hasn't traded in two weeks. Unless the objective is to cover the complete global universe, like [agency broker and trading technology provider] ITG does." Vogel points to intra-day pricing and intra-day analytics as tasks that could be enhanced, accelerated or otherwise improved upon through use of cloud computing resources. Data managers should think of securities data in two layers—a descriptive or identification layer and a pricing layer—both of which have to be processed and filtered to generate usable data that goes into cloud resources.
The active universe of securities as a whole, including fundamental data and analytics, is really a super-set of what firms are trying to handle on a daily basis, observes Vogel. With that in mind, the task in applying cloud computing to big data may actually be making big data smaller, or breaking it down into parts: cutting it down to size. That would reduce the bandwidth needed to send data to and retrieve it from the cloud, and make it easier to keep local and cloud-stored data consistently reconciled.
If nothing else, this is certainly a different way of looking at handling big data. It is worth considering whether going against the conventional or prevailing wisdom could lead data managers to a better way. Inside Reference Data would like to know what you think. We've reactivated our LinkedIn discussion group, where you can keep up with new stories posted online and live tweets covering conference discussions, and provide feedback on questions and opinion pieces from IRD.