Panel: Firms Face Increased Complexity When Measuring Data Quality

Following the financial crisis, measuring data quality has become more complex, with firms implementing advanced metrics to avoid a dramatic rise in exceptions, according to a panel of speakers at the Paris Financial Information Summit in June.

Panelists said regulators remain focused on data quality and completeness, but ensuring data is of good quality has become more challenging amid high volatility and differing definitions of data quality. Paris-based Philippe Rozental

