The bank is one-third of the way through a three-year project to re-engineer its data management processes to become a more data-driven business.
At the recent Waters USA event, experts discussed how firms can leverage technology innovation to guide the data digitization process, and where human intervention remains important.
In conversation with Duco CEO Christian Nentwich and Waters editor in chief Victor Anderson, Citi's global head of operations and technology Don Callahan describes his efforts to shape the bank's approach to managing data quality.
As the SEC prepares to host a two-day event to tackle market data access and fee issues, industry user groups and Nasdaq have laid out their positions on SIP reform. Max Bowie reports.
Alternative and ESG datasets hold the promise of delivering better and more predictable returns for investors, but are some firms underestimating the amount of work required to integrate these into their strategies?
The bank says its research app is already one of the most popular on Symphony's messaging platform.
Castro has more than 25 years of experience in systems development, markets infrastructure technology and exchange market data.
By offloading its data processing to Crux, Two Sigma will be able to accelerate data acquisition efforts.
State Street Global Exchange’s head of research says investors still struggle to measure ESG impact on their portfolios, despite growing demand for ESG insights.
Data is the third-largest expense for the financial industry, so firms are getting creative when it comes to cost control.
CompliancePoint's Greg Sparrow advises financial firms on how to avoid hefty fines for GDPR non-compliance.
Recent studies reveal the prevalence of poor-quality data, a problem exacerbated by the growing use of machine learning, which lets users dredge far larger datasets and surface spurious correlations.
The bank’s asset management arm believes that trawling its home waters for data will land a valuable catch. Risk.net’s Faye Kilburn speaks to the data scientist at its helm.
Some fear the prospect of artificial intelligence taking traders' jobs. But, explains National Bank of Canada's Alexis Gouslisty, AI's greatest opportunities are in transforming the way banks manage data internally and how they interact with clients.
Max Bowie reports from Toronto on Canadian firms' opinions of the challenges associated with using alternative data.
Wei-Shen Wong documents the rise of Asia-based chief data officers, and their place in the structure of both local and global firms.
Beaton brings to the new role 15 years of expertise in financial services program management, regulatory and risk oversight, and audit trail recovery and management.
The RSU-GoldenSource service will allow participating German regional banks to use a multi-tenant, shared model to meet most of their reference and pricing data needs.
The bank is creating a new group tasked with finding data within its securities division that could be sold to clients.
The bank will use Big XYT's Liquidity Cockpit platform to provide deeper insight into the European equities liquidity landscape.
Officials say Guardian will be able to eliminate a number of "cumbersome" data management and compliance processes as a result of implementing NeoXam's solutions.
Barr has 30 years of industry experience, much of that time in market data and enterprise data management roles.
Antenna will shorten the time needed to evaluate new datasets and allow BAM to begin using the data faster than its rivals.
Jones spent a combined 22 years at Barclays and UBS in data and technology management roles.