Data Governance: Towards A Single Version of the Truth

How techniques such as EDM and MDM can help make sense of complex information


How is data governance catching on as a data management method, or not—and why?

Brian Buzzelli, head of data governance, Acadian Asset Management: The financial industry has been managing increasingly complex classes of data since the earliest days of markets and trading. We initially employed technology to establish the architecture and infrastructure, and our data management practices evolved from database management for applications within a single function to enterprise data management (EDM) and comprehensive data management across global information architectures.

The data management discipline further evolved with master data management (MDM) techniques, where we focus on ‘mastering' classes of data, at times with more centrally controlled organizational structures, to promote the benefits of operational and cost efficiency as a corporate ‘data utility.'

As we continue to evolve the data management discipline, we recognize the need for a data governance framework that works in partnership with data management. Data governance has recently evolved to more formally provide the policies and standards, data quality measurements and metrics, clarity on roles and responsibilities, and changes to how we think about data, so we understand and maximize the value of our information assets.

Ours has evolved into a ‘pure information' industry; few of us touch physical assets these days. We look to operational and data governance excellence in other industries such as pharmaceuticals, technology and aerospace; our industry's data governance discipline is far less mature. Data governance is not just catching on; rather, it is part of our industry's natural evolution toward a more structured, engineered, standardized and quantified approach to governing data management.

Olivier Kenji Mathurin, product marketing manager at AIM Software: Much progress has happened in the past couple of years with the emergence of the chief data officer (CDO) role. Where data governance was often perceived as optional, regulators' and data vendors' requirements are putting the spotlight on it. It's no longer just about reporting the data; it's also about the compliance of the process used to produce it.

But to work, data governance requires two elements: management commitment and recognizing data management as a business priority. This is where most firms still struggle. Firms that have gone through an EDM program often have an advantage because these programs are cross-functional and often lay the foundation for data governance: definition and implementation of data policies, data models and glossaries, ownership and stewardship functions, standards, processes and tools.

Paul McInnis, data management product manager, Eagle Investment Systems: Data governance is a crucial part of a successful data management framework. It is almost universally recognized by investment management firms as a strategic asset, although formal adoption of data governance strategies is only now starting to gain momentum. There are a number of drivers behind this growing interest, with organizations looking to gain material competitive advantages and relying on these efforts to achieve operational efficiencies.

However, there is a gap between intention and execution when it comes to data governance, with many firms struggling to get these initiatives off the ground. A recent survey we conducted with WatersTechnology highlighted three main obstacles: lack of senior management involvement; lack of resources; and not knowing where to start on drafting a policy. Drivers for broader adoption of data governance policies by investment management firms include the need to handle the increasing volume and pace of information efficiently and consistently. Firms that have ignored the need to establish such policies have experienced operational inefficiencies, client losses and data security breaches.

What are the strengths/weaknesses of MDM and EDM for supporting data governance?

Buzzelli: EDM focused on managing data across the enterprise with greater functional standardization in operational data processing. MDM complements EDM as a technology architecture strategy, an organizational efficiency enabler and a ‘center of excellence' data management technique. MDM informs our strategy and drives toward ‘single version of the truth' data structures that link the enterprise to critical data stores (e.g., clients, accounts and holdings); it shapes organizational structures, supporting greater centralization of common data processing activities for efficiency gains and reduced cost; and it promotes deep data class expertise by organizing data practitioners around the full shape, life cycle and workflow of a class of data (e.g., entity, security master, pricing, books, risk and transactions).

Both MDM and EDM can benefit from, and work in conjunction with, data governance. Practitioners need to evolve and mature the data management discipline and develop data governance in our industry to a level comparable with that of other industries.

Mathurin: Both EDM and MDM have been strong in supporting data stewardship functions by automating the recognition of irregularities and exceptions and by providing a central issue resolution workflow. They offer workflow support, control, and traceability to processes previously reliant on spreadsheets and manual-intensive activities.

Where EDM had weaknesses was in the transparency of the data policies it executed. It was often hard to understand which version of a data policy applied on a given date or hour, because policies were translated from documents into technical rules programmed into the database and other technical layers of the EDM system. Determining whether a tolerance in a quality control had been changed often involved gathering multiple technical log files.

Asset-servicing firms, which work with external data owners, are pioneers in EDM. AIM has worked with a fund administrator that calculates net asset values (NAVs) for 2,400 funds by setting a pricing policy for each asset class of a fund. This policy defines how prices are sourced, selected and controlled for the NAV calculation. The administrator embeds these data policies in service-level agreements with clients, and changes requested by clients must be applied and tracked. To respond to external auditors asking about prices selected on certain days, the firm needed a more flexible, controlled and transparent way to manage changes in data policy.

Modern EDM must treat data policies as first-class citizens. Business data owners must be able to maintain a policy in their system without reprogramming it. Data policies should be modifiable in a single place, with approval workflows, version control and audit capabilities, and the audit trail must point to the exact version of the policy in force at any given time.
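The version-controlled policy store described here can be sketched in a few lines. This is a minimal, hypothetical illustration (all names and the tolerance rule are assumptions, not any vendor's API): each published policy version records who approved it and when it took effect, and an as-of lookup answers the auditor's question of which version was in force at a given time.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class PolicyVersion:
    version: int
    effective_from: datetime
    rules: dict          # e.g. {"price_tolerance_pct": 0.5} -- illustrative rule
    approved_by: str     # approval-workflow sign-off

class PolicyStore:
    """Single place to modify a data policy, with a full version history."""

    def __init__(self) -> None:
        self._versions: list[PolicyVersion] = []

    def publish(self, rules: dict, approved_by: str, when: datetime) -> PolicyVersion:
        # Each change creates a new immutable version rather than
        # overwriting the old one, preserving the audit trail.
        v = PolicyVersion(len(self._versions) + 1, when, rules, approved_by)
        self._versions.append(v)
        return v

    def as_of(self, when: datetime) -> PolicyVersion:
        """Return the exact policy version in force at `when`."""
        candidates = [v for v in self._versions if v.effective_from <= when]
        if not candidates:
            raise LookupError("no policy in force at that time")
        return max(candidates, key=lambda v: v.effective_from)
```

For example, if a pricing tolerance was tightened in June, `store.as_of(datetime(2015, 3, 15))` would still return the earlier version, so an auditor's question about a March price check is answered against the policy that actually applied.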

McInnis: MDM and EDM help support data governance initiatives by ensuring consistency of data across the organization. MDM aggregates multiple sources of data in a central location to ensure there is a consistent view across the enterprise, while EDM enables the firm to have one version of the truth across the enterprise.

Having access across an enterprise to a single view of a dataset as well as the resulting "single version of the truth" is a logical component to the broader objective of aligning the business through a data governance strategy.
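The MDM consolidation McInnis describes can be illustrated with a simple golden-record merge. This is only a sketch under an assumed source-precedence rule (the vendor names and precedence order are hypothetical): per-source records for the same security are merged field by field, with higher-ranked sources winning, to yield one consistent view.

```python
# Assumed ranking, highest precedence first; real MDM systems use
# richer survivorship rules (per-field precedence, recency, quality scores).
SOURCE_PRECEDENCE = ["vendor_a", "vendor_b", "internal"]

def golden_record(records: dict[str, dict]) -> dict:
    """Merge per-source records into a single 'golden' view per field."""
    merged: dict = {}
    # Apply lowest-precedence sources first, so higher-precedence
    # sources overwrite them; missing (None) fields never overwrite.
    for source in reversed(SOURCE_PRECEDENCE):
        fields = records.get(source, {})
        merged.update({k: v for k, v in fields.items() if v is not None})
    return merged
```

So if `vendor_a` supplies the ISIN but no sector, and the internal source supplies both, the golden record takes the ISIN from `vendor_a` and falls back to the internal sector, giving every downstream consumer the same answer.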

That said, both MDM and EDM have their limitations as far as their support for data governance is concerned. For example, potential challenges with MDM include assigning ownership of the mastered data, deciding who provides guidance on how data-usage decisions are made, and determining how conflicts are remediated. EDM ensures consistency and reduces the risk of conflict, but it does not necessarily resolve the question of whether processes should be governed centrally or whether a more federated structure is optimal for an organization.

How should IBOR (investment book of record) serve as a data governance tool or support mechanism?

Buzzelli: Using an IBOR to give immediate, ‘real-time' accurate positions has historically been challenging for reasons including data, technology and timing. The velocity of global capital movement demands that we understand position, exposure, risk and obligation. This velocity, along with increasing regulatory demands, will drive further evolution of data management, data governance, global data infrastructure and application-processing technologies.

Mathurin: An IBOR helps consolidate a firm's positions in one central place, acting as a ‘positions master.' As such, an IBOR also ensures the execution of data policies: data quality rules per asset type, issue-resolution workflows, roles and ownership. A number of IBOR initiatives started as data management initiatives, as these systems specialized in acquiring, controlling and publishing investment data of all kinds, including cash and asset positions and/or corporate actions.
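The ‘positions master' pattern described here can be sketched as a consolidation step that applies a data quality rule per asset type and routes failures to an exception queue. This is a toy illustration with assumed rules (whole-share equities, non-negative cash), not a real IBOR's rule set.

```python
from collections import defaultdict

# Illustrative per-asset-type quality rules; real policies are far richer.
QUALITY_RULES = {
    "equity": lambda p: p["quantity"] == int(p["quantity"]),  # whole shares only
    "cash":   lambda p: p["quantity"] >= 0,                   # no negative cash
}

def consolidate(positions: list[dict]) -> tuple[dict, list[dict]]:
    """Aggregate positions per (portfolio, security); flag rule failures."""
    book: dict = defaultdict(float)
    exceptions: list[dict] = []
    for p in positions:
        rule = QUALITY_RULES.get(p["asset_type"], lambda _: True)
        if not rule(p):
            exceptions.append(p)  # route to the issue-resolution workflow
            continue
        book[(p["portfolio"], p["security"])] += p["quantity"]
    return dict(book), exceptions
```

Positions that pass the rules roll up into the central book; anything that fails is held back for a data steward rather than silently distorting the firm's view of its exposure.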

McInnis: IBOR is not a new concept-it's something we've been helping clients with for well over a decade. That said, it has grown in popularity as the regulatory environment and management practices increasingly demand risk and exposure reporting at an enterprise level.

An IBOR gives organizations one view of their investment data and enables them to benefit from a single source of truth regarding their investment activity. This can be used by compliance, risk, and performance and attribution systems across the organization to improve controls and decision-making. As a result, an IBOR can be regarded as a by-product of data governance as it controls how data—in this case, investment data—is consumed by the organization.

Can data governance be effectively centralized or does it require a more dispersed, flexible approach?

Buzzelli: The management and culture of every firm is unique; thus, the centralized or distributed structure should align accordingly. However, data governance is best positioned for success with a balanced approach, where governance authority, firm-level policy, governance programs, standards and measurement, and leadership and guidance are centralized. The balance of data governance sits within each business function and includes the definition and adoption of function-level data usage policies, specific business function service-level expectations, and authority over data stewardship and data ownership. Regardless of organizational structure, data governance should be a development, adoption and promotion responsibility shared across the firm.

Mathurin: The issue relates more often to roles and processes rather than tools. Centralized systems provide a single place to ensure the compliant execution of the data policies. They also provide a single place for transparency, access, and connectivity to multiple sources.

Regarding the organization itself, structure must follow strategy: the process and the organization can be decentralized to allow ownership and stewardship functions to run at the level where they make the most sense. We have seen various configurations, from a central approach to a federated approach in which local subsidiaries own and manage local fields such as tax, while common fields are managed by headquarters. These approaches are fully compatible with a central system.

McInnis: Data governance policies need to be flexible, period. However, whether policies are centralized or federated depends on the structure and size of the organization, and they can vary, depending on the data in question.

A large organization may choose to centrally manage certain datasets that are commoditized and utilized across the enterprise, such as security reference data or pricing data. Yet, that same organization may choose to let other more esoteric datasets be governed locally.

For example, if a quant group calculates data solely for its own usage and there are no dependencies on that data elsewhere, it makes sense for that group to set its own controls and guidelines for that particular dataset. If at some stage, that dataset does need to be consumed outside of the group, instituting a global governance policy may then make sense.

As is the case in this example, data governance policies have to be flexible and agile to react to, and reflect, both organizational and industry changes. We firmly believe that data governance is organic and ever-evolving, rather than a set of rules carved in stone. Accordingly, success in data governance requires ongoing oversight and systematic reviews to measure the completeness, accuracy, consistency and timeliness of the data.

How would you compare MDM, EDM, IBOR and centralized data governance as operational approaches?

Mathurin: All of these approaches are fully complementary, with EDM, IBOR and MDM supporting the data stewardship functions from different data angles, and centralized data governance supplementing and orchestrating these initiatives.

McInnis: These approaches all follow the concept of manufacturing centrally and distributing globally. Firms need a unified, centralized view of their commoditized datasets, as well as of how they are consumed and utilized across the enterprise. Specifically for investment management organizations, there should be no disagreement about the likes of security reference, trade, position or pricing data across the organization, to ensure consistency and clarity of data and of the decision-making that results from its use. Each approach supports that core aim.
