As the industry is inundated with data management issues, putting in place effective internal governance policies and complying with global reporting obligations have jumped to the top of banks’ agendas. Christopher Butler, chief data officer for Asia-Pacific International Markets at HSBC, explained why financial institutions need consistent data governance frameworks across all parts of their business. This includes taking into account how data is captured, who owns it, and who has access to it, as well as measuring data quality consistently.
Although banks worldwide are mandated to comply with Basel Committee on Banking Supervision (BCBS) regulations—such as BCBS 239, which stipulates principles for effective risk data aggregation and risk reporting—Butler said the industry should go a step further in improving global data governance.
“We must continue to be broader and have [consistent] governance definitions across all data aspects,” he explained. “Especially for an organization like HSBC, it is impossible to consolidate and use the data from Bangladesh to Argentina to Ukraine if we don’t have that. So in terms of governance, ownership, definitions, and consolidation, it is critical across all operations.”
Speaking on a data governance panel at this year’s Asia Pacific Financial Information Conference (Apfic) in Hong Kong, Butler outlined how the bank is building out its data lineage program, which gives it a granular view of data across the entire organization. According to Butler, the bank can drill down into the data, extract important elements, and identify attributes such as who owns the data.
“We can break the data elements into customers, corporations and individuals, or non-organizations,” he added. “We are able to use that framework to put an owner against it.”
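Butler did not describe the underlying implementation, but the framework he sketches, classifying each data element and putting a named owner against it, can be pictured as a minimal catalog record. The Python sketch below is purely illustrative: every name in it (DataElement, ElementClass, and the example values) is a hypothetical stand-in, not HSBC’s actual schema.

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical classification buckets, mirroring Butler's description of
# breaking data elements into customers, corporations and individuals,
# or non-organizations.
class ElementClass(Enum):
    CUSTOMER = "customer"
    CORPORATION = "corporation"
    INDIVIDUAL = "individual"
    NON_ORGANIZATION = "non_organization"

@dataclass
class DataElement:
    """A cataloged data element with a named, accountable owner."""
    name: str                      # e.g. "client_legal_name"
    classification: ElementClass   # which bucket the element falls into
    owner: str                     # the person or team accountable for it
    source_system: str             # where the data is captured
    consumers: list = field(default_factory=list)  # who has access to it

# Putting an owner against a data element, per Butler's framework.
element = DataElement(
    name="client_legal_name",
    classification=ElementClass.CUSTOMER,
    owner="apac-markets-data-office",
    source_system="client-onboarding",
    consumers=["risk-reporting", "client-intelligence"],
)
```

Recording the owner and the consumer list alongside each element is what allows the questions Butler raises, who owns the data and who has access to it, to be answered consistently across operations.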
Earlier this year, WatersTechnology also spoke to Chuck Teixeira, chief administrative officer and head of transformation at HSBC, about the organization’s global data transformation project, in which it is using machine-learning techniques to measure the quality of its data across five dimensions—accuracy, completeness, uniqueness, validity, and consistency—and granular details to link correlated data. Teixeira outlined how the bank is using artificial intelligence to index and tag data from trillions of transactions and external sources to build a reusable gold source of data.
“Part of the challenge that banks have is that we have lots of data pools, but … if you don’t tag that data and index it, how do you find it again? So that is part of what we have built, a reusable data asset. And this has been a significant undertaking over the last year,” says Teixeira.
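Teixeira did not detail the models involved, but the five quality dimensions he cites map naturally onto measurable checks. The following Python sketch is a hypothetical, rule-based illustration over a small pandas DataFrame; the column names, reference set, and rules are invented for the example, and the bank’s production system applies machine-learning techniques at transaction scale rather than hand-written rules.

```python
import pandas as pd

# Hypothetical transaction extract; all column names and values are
# invented for illustration.
df = pd.DataFrame({
    "txn_id":   ["T1", "T2", "T2", "T4", None],       # one duplicate, one missing
    "country":  ["GB", "BD", "AR", "UA", "GB"],
    "currency": ["GBP", "BDT", "ARS", "usd", "GBP"],  # one badly formatted code
    "amount":   [100.0, 250.0, 250.0, -5.0, 75.0],    # one rule-breaking value
})

known_countries = {"GB", "BD", "AR", "UA"}  # stand-in for a trusted reference

scores = {
    # completeness: share of cells that are populated
    "completeness": float(df.notna().mean().mean()),
    # uniqueness: share of rows whose transaction ID is distinct
    "uniqueness": df["txn_id"].nunique(dropna=True) / len(df),
    # validity: currency codes must be three upper-case letters
    "validity": float(df["currency"].str.fullmatch(r"[A-Z]{3}").mean()),
    # accuracy: values agree with a trusted reference source
    "accuracy": float(df["country"].isin(known_countries).mean()),
    # consistency: a cross-field business rule, e.g. non-negative amounts
    "consistency": float((df["amount"] >= 0).mean()),
}
print(scores)
```

In practice, the reference data and rules would themselves come from the curated, reusable “gold source” Teixeira describes, rather than being hard-coded.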
The bank is now shifting its data to a cloud-based data lake, where it can leverage the environment’s scalability, accelerate operational processes, and develop new capabilities, such as a client intelligence utility, which is part of a wider client services project called Phoenix.