Kris Kang, Lead Data Scientist, Data License and KYC, Bloomberg
Shaun Rankin, Head of Data Management, Citizens Bank
Sanjay Saxena, Head of Enterprise Data Governance, Northern Trust Corporation
Moderator: Max Bowie, Editor, Waterstechnology.com
In the face of ever-increasing volumes of data and data demands from trading firms and regulators alike, maintaining high data quality is essential to enable decision-makers to make informed choices, ensure accurate valuations, reduce trade failures and meet regulatory demands. With the extended Markets in Financial Instruments Directive II deadline looming, improving data quality to support regulatory reporting has become a vital area of focus. As a result, it is now commonplace for financial institutions to incorporate a data quality component into their data governance activities. However, in a survey of delegates to the 2017 North American Financial Information Summit, 51 percent of data executives cited data quality as their biggest hurdle.
In this webinar, sponsored by Bloomberg, industry professionals examine the importance of defining what data quality means to your business and the role of data governance in ensuring good data quality across the firm.
- Qualifying quality—how are you measuring your data quality to ensure accuracy, consistency, reliability, appropriateness, relevance and completeness?
- Tools and products to help govern your data and assess quality levels—what measures are you taking to enhance the quality of your data?
- How are your data-quality improvement activities embracing people, process and technology?
- How is the international regulatory landscape challenging the industry to improve data quality?