Firms Increase Focus on Measuring Data Quality in Downstream Systems
Firms are increasingly measuring data quality downstream as their data management programs mature, officials tell Inside Reference Data.

Data quality has traditionally been measured in the data repository, but not necessarily downstream. "There has been a lot of focus on building robust, quality foundation data stores, like data warehouses, but inadequate focus on making sure this 'data store quality' carries through to consistent 'data usage quality'," says Dublin-based Michael McMorr