Data quality cannot be addressed with a broad governance sweep and should be seen from the consumers’ point of view, says Roberto Maranca, managing director for enterprise data at GE Capital.
Are data governance plans proving effective for improving data quality?
In essence, data governance paves the way toward a change of behavior and a care for data that should yield better data quality. The challenges center on prioritization and pervasiveness. Many firms plan too aggressively, trying to swallow the entire breadth of their data in one go with policies, edicts and compliance programs. Data governance is a huge change in collective company behavior and should be tailored to the company so that it is progressive but, most importantly, sustainable.
What is the most important element for addressing data quality, and why?
Data quality must be seen through the eyes of the data consumers. Too often, "good data quality" is discussed as an absolute value, when in practice it is relative to each consumer's needs. Quality-level agreements between data owners and data consumers should therefore be established to detail what good looks like, both to manage expectations and to direct company resources where they matter most. That should also simplify data quality communication: an unqualified ocean of red, amber and green indicators can erode the confidence of those consuming the data.
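One way to picture such a quality-level agreement is as a set of explicit, machine-checkable thresholds rather than a single unqualified pass/fail flag. The sketch below is purely illustrative (the `QualityAgreement` class, its thresholds, and the check function are all hypothetical names, not anything GE Capital uses); it shows how "good" can be defined per consumer and reported per dimension:

```python
# Hypothetical sketch: a quality-level agreement expressed as explicit,
# checkable thresholds agreed between a data owner and a data consumer.
from dataclasses import dataclass

@dataclass
class QualityAgreement:
    """Thresholds defining what 'good' means for one consumer of a dataset."""
    min_completeness: float   # required fraction of non-null values
    max_staleness_days: int   # how old the data may be and still be useful

def check_agreement(records: int, null_count: int, age_days: int,
                    qla: QualityAgreement) -> dict:
    """Return per-dimension pass/fail instead of one unqualified indicator."""
    completeness = 1 - null_count / max(records, 1)
    return {
        "completeness": completeness >= qla.min_completeness,
        "freshness": age_days <= qla.max_staleness_days,
    }

# A consumer who is strict on completeness but tolerant of month-old data:
qla = QualityAgreement(min_completeness=0.98, max_staleness_days=30)
result = check_agreement(records=10_000, null_count=150, age_days=7, qla=qla)
# result -> {"completeness": True, "freshness": True}
```

Because each consumer supplies their own thresholds, the same dataset can legitimately be "green" for one use and "red" for another, which is the point of negotiating the agreement up front.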
Is automation having a negative effect on data quality?
Potentially. If automation increases the output of a process in units produced per unit of time, and is meant to reduce human touch-time across the process (e.g. risk auto-approval), there is a risk that a quality issue is replicated on a larger scale, faster, and with deeper impact. The solution is to design automation with more robust and tested quality-control processes from the start.
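The principle above can be sketched as a quality gate placed in front of the automated step, so that a data issue halts the pipeline for human review instead of being replicated at machine speed. Everything here is an illustrative assumption (the `quality_gate` and `auto_approve` functions and the example checks are hypothetical), not a description of any real approval system:

```python
# Hypothetical sketch: a quality gate in front of an automated approval step.
# If any check fails, the batch is held for a human rather than auto-approved,
# limiting how far a quality issue can propagate.
def quality_gate(batch, checks):
    """Run every named check over the batch; report which ones failed."""
    failures = [name for name, check in checks.items() if not check(batch)]
    return len(failures) == 0, failures

def auto_approve(batch, checks):
    ok, failures = quality_gate(batch, checks)
    if not ok:
        # Route to a human instead of replicating the issue faster.
        return "held for review: " + ", ".join(failures)
    return "auto-approved"

# Example checks for a batch of payment records:
checks = {
    "no_missing_amounts": lambda b: all(r.get("amount") is not None for r in b),
    "non_negative": lambda b: all((r.get("amount") or 0) >= 0 for r in b),
}

good = auto_approve([{"amount": 120}, {"amount": 45}], checks)   # "auto-approved"
bad = auto_approve([{"amount": 120}, {"amount": None}], checks)
# bad -> "held for review: no_missing_amounts"
```

The design choice is that the gate fails closed: an unknown or failing batch defaults to human review, which trades some touch-time back for containment of scale.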