The EU regulator had to coordinate with local national competent authorities to clamp down on counterparties' failures to meet Emir requirements for porting data.
This whitepaper outlines the history of the Reference Data Utility (RDU), charts its development, scrutinizes its underlying technology and articulates the business benefits that subscribers can reasonably expect.
Market participants will not have to use both the UPI and the Isin in their submissions to trade repositories, a policy officer says.
A new era of accountability might see the Federal Reserve demand model explainability to keep the financial system safe.
A summary of some of the past week's financial technology news.
Managing Data Assets in 2021—Building an Effective Framework for Successful Data Governance and Sharing Best Practices
This webinar explores how data analytics and artificial intelligence tools can help buy-side firms identify hidden value within their data and enhance existing business processes by improving data quality.
Three fixed-income experts look back on the bond market’s liquidity crisis spurred by the pandemic’s early days, and ponder where regulation and data-quality efforts might next lead the space.
FactSet’s mapping service, Concordance, can be used on its own or with Snowflake’s cloud, enabling users to apply the offering to other datasets in Snowflake’s ecosystem.
The firm is working with different machine-learning methods for portfolio construction, and expects its AI system to go live early next year.
The company aims to show that pairing good quality data with knowledge graphs can lead to links that previously would have been missed.
Sell-side firms and data providers are increasingly experimenting with natural-language generation to create new forms of automatically curated reports, emails and alerts, but the technique comes with significant challenges.
The asset manager has adopted materiality tools, industry handbooks, and NLP techniques to help navigate ESG data limitations.
The two banks outline their ambitious data governance programs, which make business professionals accountable for their organization's data decisions.
Experts from IBM and Bank of China say they're on the lookout for this emerging threat, as machine learning gains in popularity.
Instead of waiting for data quality to be sufficient to power AI models, those at the cutting edge are building models to bridge the gaps in the data and applying them to more sophisticated use cases.
This survey report illustrates how data quality and technology are the keys to delivering improved corporate actions accuracy and transparency, while also significantly reducing latency.
The vendor is also incorporating micro frontends and exploring future uses of machine learning.
The regulatory business is developing enhanced analytics to improve reporting accuracy and identify signs of market manipulation.
Slashing budgets will lead to inaccuracies as banks turn to alt data for fraud detection and to monitor customer behavior during the coronavirus crisis.
Experts advise using machine learning to solve data quality challenges before applying it to alpha-generating strategies.
The financial industry is losing faith in the LEI initiative as regulatory mandates remain patchy, but some see hope in SFTR’s unique-issuer LEI. By Mariella Reason
WatersTechnology looks at more than 20 cloud-based projects and initiatives to see how banks, asset managers and vendors are embracing public providers, and the inherent problems involved.
Sibos 2019 was a significant event for SmartStream Technologies, marking the official launch of SmartStream Air, the firm’s cloud-native, AI-enabled reconciliations platform that is set to shake up the reconciliations industry. Victor Anderson caught up…
Many in financial services are trialing artificial intelligence (AI) applications, with projects increasingly sophisticated in methodology and ambition. WatersTechnology, in partnership with SmartStream, recently convened a Chatham House-style discussion…