Discover the intricacies of FRTB capital calculations, the most demanding and complex regulatory measures to date.
Are you prepared to explain your capital calculations under what is undeniably the most complex and demanding set of regulatory measures on capital requirements enforced to date?
In this article, we cover the core analytics and calculation requirements for FRTB compliance, some of which may be key to financial institutions' ability to explain and understand results in an intense and unprecedented trading environment.
This is a key topic for leaders and experts from the trading desks and the risk management office.
The countdown towards implementation of the Fundamental Review of the Trading Book (FRTB) has begun in earnest.
According to the EY 2023 Basel 3 Global Reforms Survey, 25% of banks are spending over $100M to deliver FRTB, making it a substantial investment with high expectations of value delivered beyond regulatory compliance. The same survey states that, on average, 48% of that spending goes to data and technology.
Starting later this year in the Asia-Pacific region, FRTB will require banks to adopt a more extensive and complex approach for capital calculation.
As already mentioned in our blog “Data Is the North Star for Navigating the Default Risk Charge Challenges Presented by FRTB”, banks can decide to use either the SA (Standardised Approach) or the IMA (Internal Models Approach). Conventionally, while the SA is simpler and easier to implement, the IMA provides more accurate risk measurement but demands a higher level of sophistication and oversight.
Nonetheless, whichever option they choose, banks must brace themselves to capture far more granular data (a considerably higher volume of data points), to derive valuable and explainable insights for better capital-related decisions, and to overhaul, or at least evolve, their risk models and market risk practices.
Banks will need to deliver on two fundamental elements.
The first is accurate capital calculation. Achieving accuracy is a gruelling analytics challenge: so many variables, volatile products and complex portfolios across different asset classes, starting with the sheer effort of collecting data from numerous sources and formats with uneven levels of integrity and traceability.
The second element relates to the infrastructure investments banks face in order to overcome the technical challenges of handling immense data volumes, especially under the Internal Models Approach (in some cases up to 100 times larger than what is typically handled today). You can watch our on-demand webinar “Action time on FRTB: Is your bank ready for the data challenge?” to explore the key data and analytics challenges, as well as the opportunities, arising from FRTB implementation.
A lot has already been achieved in the FRTB journey, and most banks are now sharing initial results internally with front office users and risk managers.
From risk managers and trading desk heads privately surveyed across 10 European G-SIBs, the first feedback from both the first and second lines of defence is not about the numbers themselves but about the absolute need to understand them.
Not a surprise?
To quote one of these trading desk heads, “calculating is one thing, but if you can’t explain, why calculate?”
To embrace any new risk and capital framework, there is an absolute need to understand the behaviour of the various models. This is true for both the SA and the IMA.
That the Standardised Approach is simpler does not mean it is easy to understand.
As an example, the flexibility to move quickly from a firm-wide aggregated capital number to a desk allocation (where multiple methodologies can be applied, each with its own bias) and down to granular trade information requires a strong, fast infrastructure.
Part of the complexity stems from the three correlation scenarios used in the SA calculation, which can create jumps when a change in the portfolio tips the calculation from one scenario to another, with significant impact on all desks, even those without any new activity.
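To make the scenario mechanics concrete, here is a minimal Python sketch of the max-over-scenarios aggregation. The scaling of a base correlation ρ follows the Basel rules (high: min(1.25ρ, 100%); medium: ρ; low: max(2ρ - 100%, 0.75ρ)), but the correlations, weighted sensitivities and two-bucket structure below are purely illustrative, not the regulatory risk weights and buckets:

```python
import math

def scale_corr(rho: float, scenario: str) -> float:
    """Scale a base correlation per the three SA correlation scenarios."""
    if scenario == "high":
        return min(1.25 * rho, 1.0)
    if scenario == "low":
        return max(2 * rho - 1.0, 0.75 * rho)
    return rho  # "medium": unchanged

def bucket_charge(ws: list[float], rho: float) -> float:
    """Within-bucket aggregation: K_b = sqrt(max(0, sum_kl rho_kl WS_k WS_l))."""
    total = sum(ws[k] * ws[l] * (1.0 if k == l else rho)
                for k in range(len(ws)) for l in range(len(ws)))
    return math.sqrt(max(0.0, total))

def delta_charge(buckets, rho_intra, gamma, scenario) -> float:
    """Cross-bucket aggregation for one correlation scenario."""
    r = scale_corr(rho_intra, scenario)
    g = scale_corr(gamma, scenario)
    k = [bucket_charge(ws, r) for ws in buckets]
    s = [sum(ws) for ws in buckets]  # simplified S_b: sum of weighted sensitivities
    cross = sum(g * s[b] * s[c]
                for b in range(len(k)) for c in range(len(k)) if b != c)
    return math.sqrt(max(0.0, sum(x * x for x in k) + cross))

def sa_capital(buckets, rho_intra=0.5, gamma=0.3):
    """The SA charge is the worst case across the three scenarios."""
    charges = {s: delta_charge(buckets, rho_intra, gamma, s)
               for s in ("low", "medium", "high")}
    binding = max(charges, key=charges.get)
    return charges[binding], binding
```

With same-sign sensitivities the high-correlation scenario tends to bind; once positions offset each other, the low-correlation scenario takes over, which is exactly why a portfolio change can tip the whole calculation from one scenario to another.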
For the same reason, it is critical to be able to observe long-term trends in the capital calculation, not only at the aggregated level but also at desk or even book level!
Last but not least, a sophisticated ‘what if’ solution is a must: a handful of simulations can yield a great deal of intelligence.
Let’s explore three common situations you will most certainly encounter:
For a desk head or a book owner, understanding the SA allocation, whether marginal, standalone or based on another methodology, and its key contributors is highly valuable. Simulating on the fly the impact of moving a book from one desk to another, and the effect on allocated capital, is invaluable to the front office.
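To illustrate why the choice of allocation methodology matters, here is a small sketch contrasting a standalone allocation with a leave-one-out marginal allocation. The capital function, the uniform correlation RHO and the desk exposures are all invented for illustration; they stand in for whatever internal capital model the bank applies:

```python
import math

RHO = 0.5  # assumed uniform correlation between desk exposures (illustrative)

def firm_capital(exposures: list[float]) -> float:
    """Toy diversified capital: sqrt of a correlated quadratic form."""
    return math.sqrt(max(0.0, sum(
        x * y * (1.0 if i == j else RHO)
        for i, x in enumerate(exposures) for j, y in enumerate(exposures))))

def standalone_allocation(desks: dict[str, float]) -> dict[str, float]:
    """Each desk charged as if it were the whole firm (ignores diversification)."""
    return {d: firm_capital([x]) for d, x in desks.items()}

def marginal_allocation(desks: dict[str, float]) -> dict[str, float]:
    """Leave-one-out marginal: firm total minus capital of the firm without the desk."""
    total = firm_capital(list(desks.values()))
    return {d: total - firm_capital([x for e, x in desks.items() if e != d])
            for d in desks}
```

Neither set of allocations sums exactly to the firm total (standalone overshoots it, leave-one-out marginal undershoots it), so a pro-rata rescaling is typically applied; each methodology carries its own bias, which is precisely why desk heads need to see which one is in use and who the key contributors are.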
For a trader or risk manager in a bank using the SA, a pre-trade analysis of the marginal impact may prevent bad trading decisions. Take the example of a book where interest rate risk on a forward explains the majority of the allocated capital. The natural trading decision would be to enter into a Forward Rate Agreement (FRA) covering the targeted maturity. Simulating the impact of this simple trade in real time should not reveal any ‘strange’ impact. But it did: this small trade reduced the desk’s offsetting effect on the overall portfolio and triggered a change of correlation scenario, which significantly impacted the capital calculation for the entire firm! It cascaded to the various desks, with the opposite effect to the one initially targeted.
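This anecdote can be reproduced in a stylised way. In the sketch below, every number (the two desks' net sensitivities, the base cross-desk correlation gamma, the FRA's sensitivity) is invented for illustration; the pre-trade marginal impact is simply the firm charge with the trade minus the charge without it, taking the worst of the three scenario-scaled correlations. With these toy numbers, an FRA that slightly over-hedges the desk flips the binding scenario from low to high and increases firm capital, even though the desk's own exposure shrinks:

```python
import math

# The three regulatory scalings of a base correlation g
SCENARIOS = {"low": lambda g: max(2 * g - 1.0, 0.75 * g),
             "medium": lambda g: g,
             "high": lambda g: min(1.25 * g, 1.0)}

def firm_capital(desk_sens: list[float], gamma: float = 0.6):
    """Toy firm charge: worst case over the three correlation scenarios of
    sqrt(sum_b s_b^2 + sum_{b != c} gamma_scaled * s_b * s_c)."""
    best = None
    for name, scale in SCENARIOS.items():
        g = scale(gamma)
        q = sum(x * y * (1.0 if i == j else g)
                for i, x in enumerate(desk_sens) for j, y in enumerate(desk_sens))
        charge = math.sqrt(max(0.0, q))
        if best is None or charge > best[0]:
            best = (charge, name)
    return best  # (capital, binding scenario)

def pretrade_impact(desk_sens: list[float], desk_idx: int, trade_sens: float):
    """Marginal firm impact of adding `trade_sens` to one desk's net sensitivity."""
    before, scen_before = firm_capital(desk_sens)
    after_sens = list(desk_sens)
    after_sens[desk_idx] += trade_sens
    after, scen_after = firm_capital(after_sens)
    return after - before, scen_before, scen_after
```

For example, with desks at [1.8, -2.0], a trade of -2.0 on the first desk turns its offsetting long into a small short alongside the other desk: the binding scenario moves from low to high correlation and the marginal impact on firm capital is positive, the ‘strange’ result the trader saw.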
For banks adopting the IMA, which in practice means combining the IMA and the SA, the decision (as distinct from moves forced by the Profit & Loss attribution, or PLA, test) to move from one framework to the other should be understood exhaustively. The capacity to simulate the overall impact of one or a few books moving from the IMA to the SA (and vice versa) is the only way to avoid unexpected results, given that the change cannot easily be reversed. And the simulation should not be run only on the current COB (close of business) date but back-tested over a long history, since the average impact may differ from the spot impact.
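The spot-versus-average comparison at the end of that back-test reduces to a very small computation. Assuming the bank can recompute, for each historical COB date, firm capital as-is and firm capital with the candidate book moved (both series are hypothetical inputs here), a sketch looks like:

```python
from statistics import mean

def migration_impact(history: list[tuple[str, float, float]]):
    """history: (cob_date, capital_as_is, capital_if_book_moved) per COB date,
    oldest first. The impact of the move is the capital delta on each date."""
    deltas = [moved - as_is for _, as_is, moved in history]
    spot, avg = deltas[-1], mean(deltas)
    # A saving on the latest COB can mask an average cost over the cycle
    # (or vice versa), which is exactly the trap of a spot-only simulation.
    misleading_spot = (spot < 0) != (avg < 0)
    return spot, avg, misleading_spot
```

If the latest COB shows a saving while the historical average shows a cost, the `misleading_spot` flag fires, signalling that the migration decision should not rest on the current date alone.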
These are only three situations; many other simulations are relevant, combining back testing, stress testing and new activity.
However, such scenarios are only relevant if the data they rely on is trustworthy, centralised and easily accessible in real time, so that risk can be evaluated and mitigated proactively. Yet few platforms meet the technological prerequisites to handle these data sizing and consumption constraints while being tailored to specific requirements.
Through Opensee’s technology, banks gain a complete view of their data, not only to produce daily FRTB reports, but to unlock actionable business intelligence on demand through accurate, flexible and fast analysis of their data.
It includes a rich simulation module, ensuring that front office users and risk managers are (1) provided with the correct data and (2) able to derive insights from that data on an intraday, real-time basis, by examining, changing or simulating data at a granular level and in an interactive way.
If you would like to learn more about our in-depth industry perspectives and technology expertise, check out our blogs.
About the author: Christophe Rivoire joined Opensee in 2019 as Senior Advisor before becoming Head of Strategy and UK Country Manager. Before joining Opensee, he spent more than 25 years in the banking industry, notably in Fixed Income Trading in Europe and the US. He started his career at Louis Dreyfus Finance before joining HSBC in 1998, where he held various responsibilities. Combining risk, trading and technology has always been at the centre of his interests.