Editorial

The Future of Pricing and Risk Tech Is Bright: Why Traditional Models Do Not Work

A high-priority market-level regulatory change is the Libor replacement, due to take effect from January 2022 in most markets, alongside the first phase of the long-delayed Fundamental Review of the Trading Book (FRTB). These two regulatory changes alone can drive a large volume of remediation work on existing pricing and risk models and, depending on the business, may require entirely new models as well.

These are just two of the major regulatory overhauls financial organisations must adapt to in the next decade. But acting on these changes has become a challenge.

The problem is that most traditional pricing and risk models are built on outdated and fragmented architecture, which makes them very expensive and time-consuming to amend. Entirely new technology is not just recommended for complying with FRTB and similar upcoming regulations but required for an efficient and cost-effective pricing and risk infrastructure. For financial services organisations to survive the next decade, they will need to change the way they think about modelling.

Traditional models just don’t work anymore

The fact is, the mathematical methods behind today's heritage pricing and risk models are decades old and not well suited to exploiting the latest computing technologies. Achieving consistency across models is hard, and given computational limitations, the results are not always accurate.

Consider this:

Traditional pricing and risk models are designed to run in silos, by desk (or asset class), and for specific use cases. For example, consider Future Contract A. To model this instrument, you would need all the necessary data for your model, a product payoff, logic for calibration and sensitivities, and a specific implementation of how to numerically solve (execute) the model.

Each instrument model has all of these software components, which means a large bank may have several separate libraries, each with up to thousands of these siloed models. Since each model has its own numerical solver, there is no way of ensuring full consistency in the mathematics between them, as the sketch below illustrates.
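To make this concrete, here is a minimal, purely illustrative sketch in Python (the instrument, figures and method are hypothetical, not any bank's actual code) of how a traditional siloed model bundles its own data, payoff, calibration and numerical solver in one place:

    import math
    import random

    class FutureContractAModel:
        """A self-contained, siloed model: nothing here is shared."""

        def __init__(self, spot, rate, expiry_years):
            self.spot = spot          # market data fed only into this model
            self.rate = rate
            self.expiry = expiry_years

        def payoff(self, terminal_price, strike):
            # product-specific payoff logic, duplicated per instrument
            return terminal_price - strike

        def calibrate(self, quotes):
            # instrument-specific calibration; trivially averaged here
            self.rate = sum(quotes) / len(quotes)

        def solve(self, strike, paths=10_000, vol=0.2):
            # this model's own hand-rolled Monte Carlo solver
            total = 0.0
            for _ in range(paths):
                z = random.gauss(0.0, 1.0)
                terminal = self.spot * math.exp(
                    (self.rate - 0.5 * vol ** 2) * self.expiry
                    + vol * math.sqrt(self.expiry) * z
                )
                total += self.payoff(terminal, strike)
            return math.exp(-self.rate * self.expiry) * total / paths

Multiply this pattern across thousands of instruments and every one of them carries its own private solve(), which is precisely why consistency between models is so hard to guarantee.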

New regulatory requirements also mean increased computational challenges. A demand for increased accuracy means that models need to become more granular with more time-steps, scenarios and consistent risk factors. Nested simulations may become necessary for some models. Few heritage solutions are capable of scaling up by, say, an order of magnitude in calculation volumes without significant changes and more infrastructure.
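A back-of-envelope calculation, using entirely hypothetical portfolio figures, shows how quickly this bites: total work scales with instruments times scenarios times time-steps, and nested simulation multiplies it again.

    instruments = 50_000
    scenarios = 10_000
    time_steps = 250        # e.g. daily steps over one year
    inner_paths = 100       # inner simulation count if nesting is required

    outer = instruments * scenarios * time_steps
    nested = outer * inner_paths

    print(f"outer valuations: {outer:.2e}")    # 1.25e+11
    print(f"with nesting:     {nested:.2e}")   # 1.25e+13, 100x more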

But it is even more complicated.

These pricing models are usually coded in C, a complex programming language suited to high-performance applications, and each model must be designed and implemented by quants. A single model can run to thousands of lines of code and requires a large grid of computing nodes to execute, which means a very significant investment in infrastructure and in the IT staff to operate it. And because every model is original and hand-coded, each is susceptible to bugs and human error.

Already we can see that the traditional method is complex and expensive: an immense library of models, each built from labour-intensive code, and a big team of quants and IT specialists dedicated to maintaining them. Not only is this process suboptimal, it is also unnecessarily slow to change.

And we haven’t even talked about the time-cost.

A pricing model for a complex contract can take anywhere from a few months to a year to complete, depending on the model's complexity. That's not including approval time or the creation of challenger models.

New regulations encourage institutions to standardise these processes and promote consistency. But with heritage systems, that is close to impossible: you would have to standardise (that is, refactor) thousands of financial-instrument models built with the traditional programming model.

So when banks are looking to remain compliant, they need a solution that focuses on simplicity, consistency and a new level of computational capability. For that, we must look to a more modern mindset.

The revolution of pricing and risk technology

Modern pricing and risk technology is more than an evolution. It's a revolution. The architecture required to meet future challenges is very different from that of most existing systems.

Imagine that rather than a massive model library, you have two primary components: a modelling environment and a single numerical solver. Your modelling environment includes templates for models, product payoffs and risk factors, along with a standard way to calibrate. Each model can then be executed by the single numerical solver, through a process that compiles the model scripts written by quant analysts and combines them with the relevant data for the model. The compilation can be targeted and optimised for a variety of hardware-accelerated infrastructures, ensuring efficient, fast execution and the ability to scale to very large portfolios.
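As a hedged sketch of the idea, again in Python with hypothetical names: each "model script" reduces to a payoff definition, and one shared engine supplies all of the mathematics. A real platform would compile such scripts for accelerated hardware; plain interpretation is used here only to keep the sketch short.

    import math
    import random

    def shared_engine(payoff, spot, rate, vol, expiry, paths=10_000):
        """The single numerical solver, shared by every model."""
        total = 0.0
        for _ in range(paths):
            z = random.gauss(0.0, 1.0)
            terminal = spot * math.exp(
                (rate - 0.5 * vol ** 2) * expiry + vol * math.sqrt(expiry) * z
            )
            total += payoff(terminal)
        return math.exp(-rate * expiry) * total / paths

    # "Model scripts": each new product is just a payoff definition.
    forward = lambda s: s - 100.0
    call = lambda s: max(s - 100.0, 0.0)

    for name, payoff in [("forward", forward), ("call", call)]:
        price = shared_engine(payoff, spot=100.0, rate=0.02, vol=0.2, expiry=1.0)
        print(name, round(price, 4))

Because both products share one solver, any improvement to the numerics propagates to every model at once.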

A new platform can be built alongside your existing solutions and run in parallel to benchmark its results, letting you start a process of migration away from the vast model libraries with their millions of lines of code. The actual executable code is now generated by a compiler, which reduces error and increases consistency. Quants can focus their time on value-adding modelling and results analysis rather than rebuilding the mathematical execution for every model by hand.
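A parallel benchmarking harness can be as simple as the following sketch; the pricer functions and tolerance here are assumptions for illustration, not a real product API.

    def benchmark(trades, legacy_price, new_price, tolerance=1e-4):
        """Run both stacks on the same trades; flag any divergence."""
        mismatches = []
        for trade in trades:
            old, new = legacy_price(trade), new_price(trade)
            if abs(old - new) > tolerance * max(1.0, abs(old)):
                mismatches.append((trade, old, new))
        return mismatches

    # Toy usage with two pricers that should agree:
    report = benchmark(
        trades=[90.0, 100.0, 110.0],
        legacy_price=lambda s: s * 0.99,
        new_price=lambda s: s * 0.99,
    )
    print(report)   # [] means the stacks agree within tolerance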

And instead of taking many months to create a new model, it may take mere hours or days. You can rapidly develop models, create numerous challenger models to fast-track model validation, and receive accurate analytics. Rather than viewing simulations that look only ten days ahead, you can go hundreds of timesteps into the future, depending on your hardware.
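To give a feel for how cheap challenger models become in this setup, here is a toy sketch (hypothetical closed-form pricer and figures) in which each challenger is a one-line variation of the champion configuration:

    import math

    def price_forward(spot, rate, expiry, strike=100.0):
        # present value of a forward struck at `strike` under a flat rate
        return spot - strike * math.exp(-rate * expiry)

    champion = dict(spot=100.0, rate=0.02, expiry=1.0)
    challengers = [dict(champion, rate=r) for r in (0.01, 0.03)]

    for params in [champion, *challengers]:
        print(params["rate"], round(price_forward(**params), 4))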

From this, we can see that the latest pricing and risk technology architecture not only reduces cost and time but also improves consistency and accuracy, which is exactly what is needed to prepare for FRTB and other regulatory drivers.

Moving forward with the future in mind

At the end of the day, the IBOR transition and similarly colossal changes to financial services, such as FRTB, are encouraging much-needed change in pricing and risk technology. But it will also be essential to stay alert and ready for whatever regulatory changes come next. That is one of the reasons a single pricing and risk platform capable of handling all asset classes and use cases is essential.

New scripts can easily be integrated to adjust for any future changes to pricing and risk processes and algorithms, as the sketch below suggests. This kind of elegant architecture allows for future growth and gives you more control over the process, all without sacrificing more time, money or manpower.
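One way to picture that flexibility, as a hypothetical sketch rather than any vendor's actual design: a new or amended product registers a script in one place, and neither the engine nor the other models need to change.

    PAYOFF_REGISTRY = {}

    def register(name):
        """Decorator that plugs a payoff script into the platform."""
        def wrapper(fn):
            PAYOFF_REGISTRY[name] = fn
            return fn
        return wrapper

    @register("forward")
    def forward(s, strike=100.0):
        return s - strike

    @register("capped_forward")   # e.g. a later amendment, added in one place
    def capped_forward(s, strike=100.0, cap=130.0):
        return min(s, cap) - strike

    print(sorted(PAYOFF_REGISTRY))   # ['capped_forward', 'forward']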

To find out more, email us at marketing@deltacaptia.com.