The Next Big Data Debate Emerges
The ongoing discussion about big data, which continued last week in Waters' Big Data Webcast, appears to be shifting from a debate over whether to use cloud computing or Hadoop toward a concern that the rapidly increasing volume and velocity of data is creating a need for greater use of big data systems.
An unspoken context underlying the webcast discussion, which included participants from Credit Suisse, BNY Mellon, Intel, IBM's Platform Computing and Sybase, is that the industry already seems to be moving away from Hadoop and toward the cloud as the more effective way to handle big data.
"The cost per gigabyte of storing that transaction over time is pushing us into cheaper, non-SQL, big data-type solutions," said Ed Dabagian-Paul, a vice president at Credit Suisse who works on setting strategy and direction for technology infrastructure at the firm. "The traditional big data solutions haven't mapped to our problems. We can answer most of our existing problems with existing data analytics or very large databases."
Daryan Dehghanpisheh, global director of the financial services segment at Intel, identified "volume, variety, value and velocity" as the four pillars of big data. He had already cited volume, along with processing speed and time, as key areas for big data when speaking with us in November.
Intel works with partners to produce solutions for operational issues such as big data. According to Dehghanpisheh, the company aims to enable complex machine learning, statistical modeling and graph algorithms within big data, rather than the traditional business intelligence of query reporting and examining historical data trends. Orchestrating the use of metadata and setting data usage policies are also important parts of administering big data operations, he added.
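As a rough illustration of that contrast, the Python sketch below compares a query-style report with a simple statistical model fitted to the same records. The trade data, column names and choice of model are hypothetical, chosen only to show the difference in approach, and are not drawn from anything Intel described.

```python
# Illustrative sketch only: contrasts traditional BI-style aggregation with a
# simple machine-learning step. Data, columns and model choice are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical transaction history: one row per trade.
trades = pd.DataFrame({
    "symbol":   ["ABC", "ABC", "XYZ", "XYZ", "ABC", "XYZ"],
    "quantity": [100, 250, 75, 300, 120, 90],
    "price":    [10.1, 10.3, 55.0, 54.7, 10.2, 55.3],
})

# Traditional business intelligence: query-style reporting on historical data.
report = trades.groupby("symbol").agg(
    total_quantity=("quantity", "sum"),
    avg_price=("price", "mean"),
)
print(report)

# Statistical modeling of the kind described above: fit a model rather than
# just report. Here, trades are clustered by size and price.
model = KMeans(n_clusters=2, n_init=10, random_state=0)
trades["cluster"] = model.fit_predict(trades[["quantity", "price"]])
print(trades)
```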
An extensible framework is needed to manage the volume and velocity at which big data now pours forth, as Dennis Smith, managing director of the advanced engineering group at BNY Mellon, sees it. "There are tremendous cost benefits to this from a scale standpoint and particularly looking at volume use cases," he said. Cloud computing inherently offers greater scale, of course, and analytics can be layered onto or attached to it. As Smith also explained, Hadoop-related technologies, standalone analytics infrastructures, or traditional data warehouses used as staging areas may all be ways to manage big data in tandem with cloud resources.
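For readers unfamiliar with the Hadoop-style processing Smith refers to, the short Python sketch below shows the underlying map-and-reduce pattern on a toy in-memory dataset. The record format and aggregation are illustrative assumptions; a real deployment would distribute both phases across cluster or cloud-hosted nodes.

```python
# Minimal sketch of the MapReduce pattern behind Hadoop-style processing,
# run here on a toy in-memory dataset for illustration only.
from collections import defaultdict
from itertools import chain

# Hypothetical raw records staged from a warehouse or message feed.
records = [
    "EQUITY ABC 100",
    "EQUITY XYZ 300",
    "FX EURUSD 1000000",
    "EQUITY ABC 250",
]

def map_phase(record):
    """Emit (asset_class, volume) pairs from a raw record."""
    asset_class, _instrument, volume = record.split()
    yield asset_class, int(volume)

def reduce_phase(pairs):
    """Aggregate volumes per asset class."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

mapped = chain.from_iterable(map_phase(r) for r in records)
print(reduce_phase(mapped))  # {'EQUITY': 650, 'FX': 1000000}
```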
The question to ask now is how to marry big data, sourced from or processed through the cloud, with analytical systems that can derive actionable meaning from it, despite its increased volume and velocity.