Flood of Factors

Q&A with Deloitte's Dilip Krishna about risk data aggregation


Inside Reference Data speaks to Dilip Krishna, director at Deloitte, about how best to prepare risk data for aggregation, and how stress-testing requirements affect aggregation.

Does it make sense to divide up risk data and evaluate or inspect it before aggregating it?

Risk data usually originates elsewhere in the organization, as booked trades, originated and serviced loans, and so on. It is then enriched in several ways, most pertinently by adding risk metrics. To achieve high risk data quality, the raw input itself must have high fidelity, and the aggregation process must be free from corruption; both are necessary conditions for the ultimate accuracy of risk data.
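The point about inspecting raw inputs before aggregation can be sketched in code. This is an illustrative example only, not a description of any bank's actual controls: the record fields (`trade_id`, `notional`, `currency`, `book`) and the specific checks are assumptions chosen for the sketch.

```python
# Hypothetical sketch: screening raw position records for data-quality
# issues before they enter the aggregation process. All field names and
# checks below are illustrative assumptions, not an industry standard.

def validate_position(pos: dict) -> list:
    """Return a list of data-quality issues found in one raw record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for required in ("trade_id", "notional", "currency", "book"):
        if pos.get(required) in (None, ""):
            issues.append("missing " + required)
    # Plausibility: a simple sanity check on the numeric input.
    notional = pos.get("notional")
    if isinstance(notional, (int, float)) and notional < 0:
        issues.append("negative notional")
    return issues

def validate_batch(positions):
    """Split a batch into clean records and (record, issues) rejects,
    so only verified inputs flow on to aggregation."""
    clean, rejects = [], []
    for pos in positions:
        issues = validate_position(pos)
        if issues:
            rejects.append((pos, issues))
        else:
            clean.append(pos)
    return clean, rejects
```

In practice such gating would sit at the point where trades and loans are captured, so that corrupt records are quarantined with a reason code rather than silently aggregated.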

How should risk data be divided and organized to those ends?

Risk data has several components. The base input is the current actual financial state of the organization as represented by trading positions and loan balances. Risk metrics also depend on other important information such as client, facility and collateral information. In addition, to develop models for risk management, it is critical to have a sufficiently long historical record of such data (e.g., five years of loan history). Finally, external data may also be required to supplement internal historical data (e.g. operational loss history data).

Are the stress-testing requirements of CCAR and BCBS 239 driving more attention to risk data aggregation and getting more done in that regard?

Stress-testing requirements are driving significant changes in risk data aggregation infrastructures. They go well beyond generating risk reports: they demand that banks perform meaningful analysis of both the inputs and the outputs of stress tests, and they impose timeliness requirements that are hard to meet. Existing infrastructures usually cannot satisfy these demands, which is prompting banks to focus on risk data aggregation systems. BCBS 239 is consistent with these requirements but states them more explicitly, so together the two are driving more coherence in risk data aggregation infrastructures.
