Waters gathered leading industry experts to discuss the biggest challenges and latest developments in the big data space in a webcast on 19 April.
With a growing volume and variety of data in financial institutions, firms are increasingly looking for new ways to manage and exploit that information to reduce risk and increase revenues. The challenge is to identify the best ways to analyze and extract useful knowledge from terabytes, or even petabytes, of data. From a technical perspective, big data raises a number of challenges around maximizing the efficiency of the architecture and ensuring sufficient processing power for large-scale model calculations. To achieve this, firms have to consider which tools are best suited to the job, given their current state. Topics discussed include:
- Consolidating data stores without giving up the performance and scale that firms need.
- Incorporating additional information sources to provide better context for real-time, informed decisions.
- Creating a single source of truth that brings the processing to the data, rather than the data to the processing.
* Edd Patterson, Chief Technology Officer, MARKLOGIC
* Howard Halberstein, Vice President, Lead Solutions Architect - UNIX, DEUTSCHE BANK
* Dennis Smith, Managing Director, Advanced Engineering Group, BNY MELLON
* Moderator: Victor Anderson, Editor-in-Chief, WATERSTECHNOLOGY