Open Platform: 20/20 by 2020 Part II

Level Four: Common Simulation Engines
Most top-tier investment banks are moving toward common pricing models for both trading and risk. While shared pricing libraries have enabled this in the past, the need for improved scalability is now driving the use of grid technology. This provides an opportunity for further standardization, including common simulation engines—the vehicles within which valuation models are most often invoked. These engines generate scenarios, execute stress tests, manage results cubes, and perform aggregations and post-simulation statistical analysis.
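To make the idea concrete, here is a minimal sketch, in Python, of the core loop such an engine performs: scenario generation, revaluation into a results cube and post-simulation aggregation. The risk-factor model, trade representation and pricer are toy placeholders rather than any particular bank's implementation.

```python
# A minimal, illustrative sketch of a common simulation engine's core loop.
# All names and the toy pricer are hypothetical, not any vendor's API.
import numpy as np

def generate_scenarios(risk_factors, n_scenarios, horizon_days, rng):
    """Draw lognormal shocks for each risk factor (toy scenario model)."""
    shocks = rng.standard_normal((n_scenarios, len(risk_factors)))
    shocks *= np.sqrt(horizon_days / 252.0)
    base = np.array([rf["spot"] for rf in risk_factors])
    vols = np.array([rf["vol"] for rf in risk_factors])
    return base * np.exp(vols * shocks)            # shape: (scenario, factor)

def build_results_cube(trades, scenarios, pricer):
    """Revalue every trade under every scenario: the 'results cube'."""
    cube = np.empty((scenarios.shape[0], len(trades)))
    for j, trade in enumerate(trades):
        cube[:, j] = pricer(trade, scenarios)      # vectorized over scenarios
    return cube

def aggregate(cube, base_values, quantile=0.99):
    """Post-simulation statistics: portfolio P&L distribution and VaR."""
    pnl = (cube - base_values).sum(axis=1)
    return {"mean_pnl": pnl.mean(), "var": -np.quantile(pnl, 1.0 - quantile)}

# Usage with two toy risk factors, two linear trades and a trivial pricer
rng = np.random.default_rng(42)
risk_factors = [{"spot": 100.0, "vol": 0.2}, {"spot": 1.30, "vol": 0.1}]
trades = [{"factor": 0, "notional": 1_000}, {"factor": 1, "notional": 500_000}]
pricer = lambda t, s: t["notional"] * s[:, t["factor"]]
scenarios = generate_scenarios(risk_factors, 10_000, 10, rng)
cube = build_results_cube(trades, scenarios, pricer)
base = np.array([t["notional"] * risk_factors[t["factor"]]["spot"] for t in trades])
print(aggregate(cube, base))
```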

If the valuation and risk implementation is standardized, any discrepancy in results is reduced to a debate about the specific version of calibrated data used for a calculation. This is a simpler discussion than a debate about model validity, and reduces reconciliation issues to data quality and interpretation.

Many banks that use grid technology still deploy multiple risk engines and multiple valuation models in the same grid. This may seem surprising, since one of the aims of using grid technology is to standardize. There is obviously a long journey for such banks before valuation and risk models come down to one implementation per product. But this is doable. It is a re-engineering exercise that improves the organization’s scalability, adaptability and flexibility, making better use of infrastructure and reducing the cost base. It also sets up the organization for future investment.

Level Five: The Valuation Service
Rates and equity businesses both need to price interest-rate swaps in real time at any point during the day—and many banks do, but they use different models, engines and data. The ultimate answer to the problem of different models pricing the same product is to have not only one version of the model and the data, but also a single instance of it. Implementing a centralized valuation service requires not simply designing and creating it, nor just providing for the operational demands, but also a progressive re-engineering of all systems that call upon it. This is a potentially huge effort, so the end must justify the means.
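A centralized service of this kind can be pictured as a single registry of pinned model versions and calibrated-data snapshots behind one pricing entry point. The sketch below is purely illustrative (the class and method names are hypothetical), but it shows how every caller that references the same model version and data snapshot necessarily receives the same valuation.

```python
# Illustrative sketch of a single-instance valuation service with pinned model
# versions and calibrated-data snapshots; names are hypothetical, not a real API.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass(frozen=True)
class ValuationRequest:
    product_id: str       # e.g. a specific interest-rate swap
    model_version: str    # pinned model implementation
    data_snapshot: str    # pinned calibrated-data version

class ValuationService:
    """One implementation per product; every caller sees the same answer."""
    def __init__(self):
        self._models: Dict[Tuple[str, str], Callable] = {}  # (product type, version)
        self._data: Dict[str, dict] = {}                     # snapshot id -> data

    def register_model(self, product_type: str, version: str, pricer: Callable):
        self._models[(product_type, version)] = pricer

    def load_snapshot(self, snapshot_id: str, calibrated_data: dict):
        self._data[snapshot_id] = calibrated_data

    def price(self, req: ValuationRequest, trade: dict) -> float:
        pricer = self._models[(trade["product_type"], req.model_version)]
        return pricer(trade, self._data[req.data_snapshot])

# Rates and equities desks call the same instance with the same pinned inputs,
# so any residual difference comes down to which snapshot each one references.
svc = ValuationService()
svc.register_model("irs", "v2.1",
                   lambda t, d: t["notional"] * (d["par_rate"] - t["fixed_rate"]) * t["annuity"])
svc.load_snapshot("2020-01-31-EOD", {"par_rate": 0.015})
req = ValuationRequest("IRS-001", "v2.1", "2020-01-31-EOD")
print(svc.price(req, {"product_type": "irs", "notional": 10_000_000,
                      "fixed_rate": 0.012, "annuity": 4.5}))
```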

But if an organization wants single-instance centralized valuations, then—with senior management support—it can be implemented step-by-step. It does not require a big-bang change; implementation can be gradual, with many parallel work streams. For example, it can be worked on simultaneously at the financial-product, system and business-unit levels. At this level of risk management competence, an organization has consistent risk engines, pricing models and data, and a framework that is robust in the face of internal conflicts of interest. All the bank’s functions are on the same page.

While this may be nirvana for some, it is anathema to others. The primary objection relates to ownership of valuation and the necessity of innovation. Is it possible to have a centralized and independent valuation service, yet still be adaptable, flexible and creative in product design?

The solution is for the front office to view the centralized valuation service as an outsourcing arrangement that works to an agreed service level. You then have the ability to specify a service that implements new products quickly. Imagine a just-in-time facility that produces fully tested production-quality implementations of front-office prototypes within five business days. This is not fanciful thinking—it has already been done.

With this kind of technology, you can attempt larger financial engineering projects because you are sharing a firm-wide resource. It may be possible to build a competitive advantage that less competent organizations cannot even attempt. But an outsourced paradigm requires a different, more disciplined style of management thinking. It does not lend itself well to chaotic and uncontrolled prototyping.

Level Six: Global Valuation
All the above describes work in progress at investment banks today and the path along which these endeavors are taking them. However, looking even further ahead, there may be reasons to reconsider the wisdom of these undertakings. Even if level five is perfectly realized—a single set of data and models used globally and consistently—the pricing models themselves remain inconsistent across asset classes.

Every valuation formula makes assumptions about the future and draws on a set of observable market data. But risk-factor modeling varies from product to product and the range of instruments used for calibration is narrow. So, while each instrument is correctly priced in isolation, from the bank-wide perspective there will be inconsistencies between valuations.

The necessary restructuring of pricing techniques to ensure consistency across risk-factor modeling and broader data samples for calibration requires high-throughput computing (HTC)—processing power that has only recently become available. HTC, the generalized application of graphics processing units (GPUs), delivers a step change in computational power, provided that the models can be redefined to fit the GPUs’ way of working.

HTC is able to process matrices of data concurrently and very rapidly, but with less precision and functionality than traditional high-performance computing solutions—i.e., CPUs in a grid. In practice, this means that closed-form mathematical models cannot be used, and so we need to revert to numerical approaches—i.e., simulations.
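The shift is easiest to see side by side: the same payoff priced once with a closed-form formula and once as a single large array operation. The sketch below uses plain NumPy and illustrative parameters, but these array expressions are exactly the kind of matrix workload that maps naturally onto GPU libraries.

```python
# Closed form versus simulation for the same payoff (illustrative parameters).
# Plain NumPy here, but the array expressions are the matrix-style workload
# that ports naturally to GPU libraries.
import math
import numpy as np

S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Closed-form Black-Scholes reference price
d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
closed_form = S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Numerical alternative: simulate terminal prices as one large array operation
rng = np.random.default_rng(0)
z = rng.standard_normal(2_000_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
simulated = math.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

print(f"closed form {closed_form:.4f}  simulation {simulated:.4f}")
```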

It is possible to create a “risk landscape” where every possible future state of every risk factor is calculated and weighted and then calibrated to a broader cross-section of tradable instruments (Towards a Global Valuation Model, Claudio Albanese, Guillaume Gimonet and Steve White—Risk magazine, May 2010). Both the scenario generation for the risk landscape and its calibration use matrix multiplication techniques based on HTC technology.
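A heavily simplified illustration of the matrix-multiplication idea is sketched below: discretize a risk factor onto a lattice of states, roll the state probabilities forward with a transition matrix, then value payoffs over the weighted landscape of future states. The toy lognormal kernel stands in for the calibrated dynamics described in the paper.

```python
# Toy illustration of a 'risk landscape' built with matrix multiplication:
# discretize a risk factor onto a lattice of states, roll the state
# probabilities forward with a transition matrix, then value a payoff over
# the weighted future states. The lognormal kernel is a placeholder for
# properly calibrated dynamics.
import numpy as np

n_states = 201
grid = np.linspace(50.0, 150.0, n_states)        # discretized risk-factor levels
dt, sigma, r = 1.0 / 52.0, 0.2, 0.02

# One-step transition matrix under a simple lognormal diffusion
logret = np.log(grid[None, :] / grid[:, None])   # log-return from state i to j
drift = (r - 0.5 * sigma**2) * dt
P = np.exp(-(logret - drift) ** 2 / (2.0 * sigma**2 * dt))
P /= P.sum(axis=1, keepdims=True)                # each row sums to one

# Start concentrated at spot = 100 and roll forward a year by repeated
# matrix multiplication, the operation HTC hardware handles very efficiently
p = np.zeros(n_states)
p[np.argmin(np.abs(grid - 100.0))] = 1.0
for _ in range(52):
    p = p @ P

# Value a payoff on the weighted landscape of future states
payoff = np.maximum(grid - 105.0, 0.0)
value = np.exp(-r * 1.0) * (p @ payoff)
print(f"lattice value: {value:.4f}")
```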

This approach provides a common solution for the full range of front- and middle-office analytical needs, solves a range of existing problems, and opens up exciting new possibilities such as macro hedging and the ability to manage counterparty credit risk in real time.

Beyond Level Six
Over the time it will take for investment banks to build a common valuation infrastructure, parallel developments in complexity economics—The Origin of Wealth by Eric D. Beinhocker provides an overview of the subject—and in the use of genetic algorithms will converge with risk-factor modeling. This will transform risk modeling into a mission-critical core competency. Calibration will become a tool to identify and quantify economic value not only on the basis of historical performance, but also on the basis of future market expectations. In some sense, this is a forward-looking alternative to back-testing. Complexity economics will become a key skill for banks to develop and a new focal point for careers to be built around.
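As a flavor of what genetic-algorithm calibration might look like, the sketch below evolves candidate parameter sets toward a set of calibration targets using selection, crossover, mutation and elitism. The parametric curve and the target values are placeholders, not a proposal for any specific model.

```python
# Minimal sketch of genetic-algorithm calibration: evolve candidate parameter
# sets toward those that best reproduce a set of calibration targets.
# The parametric curve and the target values below are placeholders.
import numpy as np

rng = np.random.default_rng(1)
maturities = np.array([1.0, 2.0, 5.0])
targets = np.array([0.21, 0.19, 0.18])             # e.g. quoted vols to match

def model_outputs(params):
    a, b = params
    return a + b / np.sqrt(maturities)              # toy parametric vol curve

def fitness(params):
    return -np.sum((model_outputs(params) - targets) ** 2)

pop = rng.uniform(0.0, 0.5, size=(50, 2))           # initial population
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]         # selection: keep the fittest
    idx_a = rng.integers(0, 10, size=50)
    idx_b = rng.integers(0, 10, size=50)
    mix = rng.random((50, 1))
    children = mix * parents[idx_a] + (1.0 - mix) * parents[idx_b]  # crossover
    children += rng.normal(0.0, 0.01, size=(50, 2))                 # mutation
    children[:5] = parents[-5:]                                     # elitism
    pop = children

best = pop[np.argmax([fitness(p) for p in pop])]
print("calibrated parameters:", best, "fit error:", -fitness(best))
```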

The Future of Risk
The risk practices of all financial institutions are somewhere on the competency ladder described above. A responsible approach to risk management includes plans to move up this ladder.

Ultimately, how far you wish to go depends upon the time frame over which you see your investment. A business case can be made considering the cost of capital as well as improvements in realized profit-and-loss (P&L). If you are looking several years ahead, then aiming high now will give you a real advantage in the future. A more ambitious step will cost less in the long term because you are taking a shorter, pre-planned route to your destination. Similarly, risk-adjusted profit will increase sooner and you will have locked in a sustainable competitive advantage for years to come.

Steve White is CEO of RiskCare, a London-based financial technology consultancy specializing in the design and building of systems for trading, risk management and e-commerce.
