Aggregate to Accumulate

Risk data aggregation raises the questions of how to divide up data and who should be assigned responsibility for each piece of it.

Kate Toumazi, global head of risk data services, Thomson Reuters

Risk data aggregation, as a practice and a means to get a better grasp of data for risk management purposes, may have to begin with dividing data into the different domains of a firm and assigning responsibility for each. Industry professionals are finding it a challenge to determine just how to divide up data and to decide which departments should be responsible for which pieces.

Kate Toumazi, global head of risk data services at Thomson Reuters, explains how firms should deal with enterprise-wide data that complicates both division and aggregation.

1. How should data dictionaries or definitions be established as a foundation for data aggregation efforts?

In accordance with Basel, financial institutions must take an enterprise approach to how they manage their risk and have a robust system that uses consistent data across the entity. A strong data architecture is without question critical for risk data aggregation, and a key facet of any firm-wide data architecture is having consistent data dictionaries. The reality, however, is that for most firms the technical challenges are compounded when differing data dictionaries are used across the firm. In an ideal scenario, firms would pick a best-of-breed dictionary and look to roll it out across their entire enterprise. This may mean tweaking existing capabilities so that a broader array of data can be harmonised into a single, more scalable model.

A recent survey of global systemically important banks (G-SIBs) further highlighted the challenge firms are facing, when it showed an increase in the number of banks that are unlikely to be compliant with BCBS 239 by the 2016 deadline. In fact, more than half of those surveyed said they will not be ready. This underscores the complexity of the challenge, which is growing rather than shrinking, and the need for a solution remains critical. We can all hear the regulatory clock ticking, and firms need to work towards the best viable solution for their business given their existing infrastructures.
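The harmonisation step described above can be illustrated with a minimal sketch. All field names, desk names and the record below are hypothetical, invented for illustration; the point is simply that each division's local field names are mapped onto one canonical dictionary, with gaps flagged rather than silently dropped:

```python
# Minimal sketch: harmonising records from two divisions into one
# canonical data dictionary. All field names here are hypothetical.

CANONICAL_FIELDS = {"counterparty_id", "notional", "currency", "maturity_date"}

# Per-division mappings from local field names to the canonical dictionary
DICTIONARIES = {
    "rates_desk": {
        "cpty": "counterparty_id",
        "amt": "notional",
        "ccy": "currency",
        "mat": "maturity_date",
    },
    "credit_desk": {
        "counterparty": "counterparty_id",
        "notional_usd": "notional",
        "curr": "currency",
        "maturity": "maturity_date",
    },
}

def harmonise(record: dict, source: str) -> dict:
    """Translate a source record into the canonical model, flagging gaps."""
    mapping = DICTIONARIES[source]
    canonical = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = CANONICAL_FIELDS - canonical.keys()
    if missing:
        raise ValueError(f"{source}: missing canonical fields {sorted(missing)}")
    return canonical

row = harmonise(
    {"cpty": "C001", "amt": 5_000_000, "ccy": "EUR", "mat": "2026-03-31"},
    "rates_desk",
)
print(row["counterparty_id"])  # C001
```

In practice the per-division mappings would live in governed metadata rather than code, but the shape of the problem is the same: one canonical model, many local vocabularies.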


2. Who should the stakeholders be and what should their roles be, when assigning responsibilities for data domains?

To comply with Basel, firms must be proactive in how they provide governance and oversight of their risk systems, policies and procedures. They must truly own how they measure and mitigate risk. In light of this, one of the biggest organizational changes we have seen across numerous firms is the appointment of a chief data officer who reports to, or operates on behalf of, the board. We believe this trend will continue for two reasons. First, by elevating the importance of the data function within the organization, firms are highlighting the strategic importance of getting it right. Second, and perhaps more importantly, it specifically assigns accountability to a senior individual.

It is clear that the stakeholders for risk data aggregation sit across numerous parts of the organization, including risk, finance, IT and data operations, and that these functions must all work together to create the overall structure and composition of the governance and delivery organization. Front office and back office are often not joined up, and the front office in particular is often not incentivized to input accurate data, which results in manual interventions later to correct it. By making a single senior figure responsible for data across the organization, many firms are looking to address these problems, and they are far more likely to succeed in spite of the fragmentation.


3. Can enterprise-wide data be broken down, scrutinized and reorganized to address risk management? How should that process work?

A bank should be able to generate accurate and reliable risk data to meet the necessary reporting requirements. To accomplish this, data should be aggregated on a largely automated basis to minimize errors.

Only with an enterprise-wide view can the data be aggregated to truly address risk management. Fragmentation is public enemy number one when it comes to aggregated risk management. That does not necessarily mean the only way to scrutinize the data is a single granular data repository across the entire firm with a single risk management system feeding off it. We see many banks, for example, looking to technology solutions to create a federated model for a single data repository: a layer over their existing databases that tries to create a single data model midway through the data lifecycle.
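A federated model of this kind can be sketched in a few lines. The silos, schemas and figures below are invented for illustration and are not any vendor's API; the existing stores stay in place, and a thin adapter per silo maps each into one shared model before exposures are aggregated:

```python
# Sketch of a federated layer: siloed stores stay in place, and a thin
# adapter maps each into one shared model before aggregation.
# Source names, schemas and figures are illustrative only.

from collections import defaultdict

# Two silos with differing local schemas (stand-ins for real databases)
SILO_A = [{"book": "ldn-rates", "cpty": "C001", "exposure": 120.0}]
SILO_B = [
    {"desk": "ny-credit", "counterparty": "C001", "exp_usd": 80.0},
    {"desk": "ny-credit", "counterparty": "C002", "exp_usd": 40.0},
]

def adapt_a(row):
    # Adapter: silo A's local schema -> shared model
    return {"counterparty": row["cpty"], "exposure": row["exposure"]}

def adapt_b(row):
    # Adapter: silo B's local schema -> shared model
    return {"counterparty": row["counterparty"], "exposure": row["exp_usd"]}

def aggregate_exposure(sources):
    """Sum exposure per counterparty across all federated sources."""
    totals = defaultdict(float)
    for rows, adapt in sources:
        for row in rows:
            shared = adapt(row)
            totals[shared["counterparty"]] += shared["exposure"]
    return dict(totals)

print(aggregate_exposure([(SILO_A, adapt_a), (SILO_B, adapt_b)]))
# {'C001': 200.0, 'C002': 40.0}
```

The design choice is the one the answer describes: rather than physically merging the databases, consistency is imposed at the adapter layer, midway through the data lifecycle.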

How far banks need to go towards this depends on how consistent their data models are and at what level they are looking to aggregate their risk: by country, by region, by group or at some other level. Even if silos of data are not physically broken down, one thing is certain: the collection, storage and maintenance of the data can no longer be managed in a silo if firms are to fully address their risk management challenges.


4. What impact is the stress testing regimen of CCAR and BCBS 239 having on risk data aggregation efforts?

The Basel Committee on Banking Supervision states that "risk data aggregation" is "defining, gathering and processing risk data according to the bank's reporting requirements to enable the bank to measure its performance against its risk tolerances/appetite". BCBS 239 is core to this statement, and its specific data standards highlight the vital role data plays when implementing true risk data aggregation.

The biggest impact we are seeing is increased investment in data aggregation. It goes without saying that, post-2008, most major institutions were looking at how they could improve their aggregation to avoid the same lack of transparency and inability to respond in a timely fashion to market and credit risks, but today's regulations add extra pressure. The fact that BCBS 239 has milestones requiring firms to report on their progress has also kept this at the top of the agenda.

The other facet of the regulations that differs from what may have been in place before is the explicit requirement to provide a forward-looking assessment of risk to senior management. This includes forecasts or scenarios for key market variables and their effects on the bank, giving senior management a much-needed view of the likely trajectory of the firm's capital and risk profile. This change adds yet another layer of complexity to what is already a substantial undertaking. It also drives further investment in data aggregation efforts, to ensure that not only are historic and current risk calculations and measures consistent, but any forward-looking views are also modeled in a consistent way.
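The forward-looking scenario assessment described above can be sketched very simply. The sensitivities and shock sizes below are invented for illustration (a linear approximation, not any regulator's prescribed methodology): each scenario shocks key market variables, and the portfolio impact is reported consistently across scenarios:

```python
# Illustrative forward-looking stress sketch: apply hypothetical shocks
# to key market variables and report the portfolio impact. All
# sensitivities and scenarios below are invented for illustration.

# Portfolio sensitivities: P&L impact per unit move in each variable
SENSITIVITIES = {"rates_bp": -0.5, "equity_pct": 2.0, "fx_pct": -1.0}

SCENARIOS = {
    "baseline":      {"rates_bp": 0,   "equity_pct": 0,   "fx_pct": 0},
    "rates_up_100":  {"rates_bp": 100, "equity_pct": 0,   "fx_pct": 0},
    "severe_stress": {"rates_bp": 200, "equity_pct": -30, "fx_pct": 10},
}

def scenario_pnl(shocks: dict) -> float:
    """Linear approximation of P&L under one scenario's shocks."""
    return sum(SENSITIVITIES[var] * move for var, move in shocks.items())

for name, shocks in SCENARIOS.items():
    print(f"{name}: {scenario_pnl(shocks):+.1f}")
```

The consistency point in the answer is the key one: the same sensitivities and the same aggregation logic are applied to every scenario, so historic, current and forward-looking views are comparable.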
