Market participants say they want a high-quality, centralized source of market data for EU equities. But who and what is it actually for?
The global data giant is working with clients to deploy its data services across the three major cloud providers.
What does the bourse’s planned purchase of the data giant mean for clients and the industry going forward?
As alternative data companies battle for capital and a coveted spot in investment managers’ portfolio strategies, they are turning to bespoke marketing and partnerships to stand out in an industry where firms still struggle with data science resources.
The bank is one-third of the way through a three-year project to re-engineer its data management processes to become a more data-driven business.
At the recent Waters USA event, experts discussed how firms can leverage technology innovation to guide the data digitization process, and where human intervention remains important.
In conversation with Duco CEO Christian Nentwich and Waters editor in chief Victor Anderson, Citi's global head of operations and technology Don Callahan describes his efforts to influence the bank's approaches to managing data quality.
As the SEC prepares to host a two-day event to tackle market data access and fee issues, industry user groups and Nasdaq have laid out their positions on SIP reform. Max Bowie reports.
Alternative and ESG datasets hold the promise of delivering better and more predictable returns for investors, but are some firms underestimating the amount of work required to integrate these into their strategies?
The bank says its research app is already one of the most popular on Symphony's messaging platform.
Castro has more than 25 years of experience in systems development, market infrastructure technology, and exchange market data.
By offloading its data processing to Crux, Two Sigma will be able to accelerate data acquisition efforts.
State Street Global Exchange’s head of research says investors still struggle to measure ESG impact on their portfolios, despite growing demand for ESG insights.
Data is the third-largest expense for the financial industry, so firms are getting creative when it comes to cost control.
CompliancePoint's Greg Sparrow advises financial firms on how to avoid hefty fines resulting from GDPR non-compliance.
Recent studies reveal the prevalence of poor-quality data, a problem exacerbated by the increased use of machine learning, which allows users to dredge far bigger datasets and surface spurious correlations.
The bank’s asset management arm believes that trawling its home waters for data will land a valuable catch. Risk.net’s Faye Kilburn speaks to the data scientist at its helm.
Some fear the prospect of artificial intelligence taking traders' jobs. But, explains National Bank of Canada's Alexis Gouslisty, AI's greatest opportunities are in transforming the way banks manage data internally and how they interact with clients.
Max Bowie reports from Toronto on Canadian firms' opinions of the challenges associated with using alternative data.
Wei-Shen Wong documents the rise of Asia-based chief data officers, and their place in the structure of both local and global firms.
Beaton brings to the new role 15 years of expertise in financial services program management, regulatory and risk oversight, and audit trail recovery and management.
The RSU-GoldenSource service will allow participating German regional banks to use a multi-tenant, shared model to meet most of their reference and pricing data needs.
The bank is creating a new group tasked with finding data within its securities division that could be sold to clients.