The Revision to the CVA Capital Charge by the Basel Committee. Short Review and some critical Issues – Part II
by Michele Bonollo and Antonio Castagna

Jan 19, 2016

Executive Summary: This is the second part of our discussion of the recent BCBS consultative paper no. 325, concerning a deep review of the CVA capital charge that was introduced with the CRR (Basel 3) regulation. The revision aims to align the CVA calculation with the forthcoming Fundamental Review of the Trading Book (FRTB) and to make the standardized approach more risk sensitive. Nevertheless, we believe that in this first formulation some points are not well clarified or solved. In this part we analyze some drawbacks of the suggested new set-up.

1 The Current CVA Review. Purposes and Contents

In July 2015 the Basel Committee issued consultative paper 325 (see [1]) concerning the revision of the credit valuation adjustment (CVA) risk framework.

The main purposes can be summarized as follows:

  1. To capture all the risk factors in the CVA capital charge. The most relevant point here is to take into account also the market risk factors that can affect the future value of the exposure. This also reflects market practice, where CVA hedging deals are sensitive to these market factors.
  2. To be compliant with the accounting principles. More specifically, the future exposure should be calculated with a market-implied calibration of the parameters (e.g., the underlying volatility) under a risk-neutral approach. This is a strong new perspective, as the CCR framework currently does not require market calibration for the EAD estimation in internal models.
  3. To align the CVA with the new market risk framework. We refer mainly to the use of sensitivities, the risk factor taxonomy, and so on.
  4. To provide a more risk-sensitive approach for the non-internal (standardized) calculations.

A three-level hierarchy of possible approaches is set out in this draft version of the paper, namely:

  • FRTB-CVA framework, which splits into:
  1. IMA-CVA = internal model approach
  2. SA-CVA = standardized approach
  • Basic CVA framework

The basic approach is meant for banks that are unable to meet the FRTB requirements or do not have enough resources to implement such a project.

2 Critical Points of the New Framework. Part II

We contributed to the first consultative session; see [2]. In this section we highlight a more specific set of issues that, in our opinion, are not solved in a satisfactory way.

2.1 Specific calculation methodology

The paper seems to prescribe specific calculation methodologies for sensitivities (see par. 41 and 42 in [1]). Actually, more sophisticated methods to compute sensitivities than the brute-force bumping that is proposed are available in theory, and they have also been implemented in practice by some banks (e.g., adjoint methods and algorithmic differentiation). Simply bumping the risk factors may not be the most effective way to compute sensitivities; see [4]. A minimal sketch of the difference is given below.
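To make the point concrete, the sketch below compares a bumped Delta with one obtained by forward-mode algorithmic differentiation on a toy Black-Scholes call; a production AAD engine would instead use the reverse (adjoint) mode to obtain all the sensitivities of a netting set in one sweep. The Dual class and the pricing function are illustrative stand-ins, not any bank's or vendor's actual library.

```python
import math

class Dual:
    """Forward-mode AD number: carries a value and its derivative w.r.t. the spot."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v - o.v, self.d - o.d)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v * o.v, self.d * o.v + self.v * o.d)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v / o.v, (self.d * o.v - self.v * o.d) / (o.v * o.v))

def dlog(x):
    # natural log with derivative propagation
    return Dual(math.log(x.v), x.d / x.v)

def dncdf(x):
    # standard normal CDF; its derivative is the normal pdf
    pdf = math.exp(-0.5 * x.v * x.v) / math.sqrt(2.0 * math.pi)
    return Dual(0.5 * (1.0 + math.erf(x.v / math.sqrt(2.0))), pdf * x.d)

def bs_call(S, K, T, r, sigma):
    # Black-Scholes call price; S is a Dual, the other inputs are floats
    d1 = (dlog(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * dncdf(d1) - K * math.exp(-r * T) * dncdf(d2)

S0, K, T, r, sigma = 100.0, 100.0, 1.0, 0.02, 0.25   # toy market data
h = 1e-4
# brute-force bump-and-revalue: one extra full repricing per risk factor
delta_bump = (bs_call(Dual(S0 + h), K, T, r, sigma).v
              - bs_call(Dual(S0), K, T, r, sigma).v) / h
# forward-mode AD: exact first-order sensitivity from a single pass
delta_ad = bs_call(Dual(S0, 1.0), K, T, r, sigma).d
print(f"bumped delta = {delta_bump:.6f}, AD delta = {delta_ad:.6f}")
```

For a single Delta the two approaches cost roughly the same; the gap opens when a netting set has hundreds of risk factors, where bumping requires one full revaluation per factor while adjoint methods deliver all first-order sensitivities at a small fixed multiple of one pricing.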

We suggest modifying the CVA framework to allow banks to choose their preferred numerical calculation method. In other words, we believe that the Committee should strictly prescribe only the functional mathematical definition of the indicator, allowing the bank to select the optimal strategy to calculate it (numerical, simulation, etc.); see Bonollo et al. [3].

A similar issue arises in the EE_t calculation for the EPE in CCR, where the theoretical definition (an expected value at a future date) is combined with a specific algorithm (a Monte Carlo approach). We fear that this way of stating the regulation could be misunderstood, since in practice banks choose their own algorithmic strategy, combining ICT devices (GPUs, grids, etc.) with mathematical tools (quasi-Monte Carlo, approximations, etc.).
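For reference, the following minimal sketch shows the quantities involved: EE_t as the Monte Carlo average of the positive exposure at each future date, and EPE as its time average over the first year. The single forward contract and all market data are toy assumptions; a real implementation would run over full netting sets with whatever numerical strategy the bank selects.

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps, T = 20_000, 12, 1.0        # monthly grid over one year
S0, K, sigma, r = 100.0, 100.0, 0.20, 0.01   # toy market data (assumptions)
dt = T / n_steps
t = np.arange(1, n_steps + 1) * dt

# GBM paths of the single underlying risk factor
z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

V = S - K * np.exp(-r * (T - t))       # mark-to-market of a long forward per path/date
EE = np.maximum(V, 0.0).mean(axis=0)   # expected positive exposure profile EE_t
EPE = EE.mean()                        # time average over the first year
eff_EPE = np.maximum.accumulate(EE).mean()  # Effective EPE uses the non-decreasing EE
print(f"EPE = {EPE:.4f}, Effective EPE = {eff_EPE:.4f}")
```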

2.2 Non-Captured Risks

The CD suggests a multiplier mCVA for wrong-way risk when it is not properly accounted for in the bank's methodology (see par. 32 and 33). Now, while we agree with greater prudence in the assessment of the CVA for unaccounted risks, we think that the CD focuses on only one of them, namely wrong-way risk, without considering more relevant risks that are likely to affect the measurement in a more material way.

More explicitly, the CD does not mention the errors due to the correlation matrix employed in the calculations when it does not reflect the actual future matrix. Correlations are rarely quoted in the market, and only a few contracts can be traded to hedge correlation risk. Since the CVA, as far as its hedging is concerned, can be seen as a very complex hybrid derivative contract (especially if netting sets are cross-asset), a wrong correlation matrix implies that the second-order sensitivities (cross-Gamma and cross-Vega) cannot be soundly hedged. This in turn entails a mis-hedge of the linear Greeks (i.e., Delta and Vega), which may eventually result in a global increase of the P&L volatility, contrary to the minimization that the hedging is supposed to achieve.
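A toy example of the materiality of this point: the same two-trade netting set, repriced under different correlation assumptions, produces quite different expected exposures, so a mis-specified matrix feeds directly into the CVA and its hedge ratios. All figures below are illustrative and not calibrated to any market.

```python
import numpy as np

def epe_two_assets(rho, n_paths=50_000, seed=0):
    """One-period EPE of a toy two-trade netting set under correlation rho."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    v1 = 100 * (np.exp(0.2 * z1 - 0.02) - 1)   # long position on asset 1
    v2 = -100 * (np.exp(0.2 * z2 - 0.02) - 1)  # short position on asset 2
    return np.maximum(v1 + v2, 0.0).mean()     # netted positive exposure

for rho in (0.9, 0.5, 0.0):
    print(f"rho = {rho:.1f}: EPE = {epe_two_assets(rho):7.3f}")
```

The exposure of the offsetting pair shrinks as the assumed correlation rises, so carrying the wrong matrix mis-states both the charge and the hedge.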

We suggest also including in the CVA framework an assessment of the risks related to the correlation matrix, when correlations cannot be easily traded in the market or cannot be traded at all. This is very likely the most common situation in the current markets.

As a general consideration, we doubt the effectiveness of CVA hedging (i.e., replication) in practice. We would prefer to treat the CVA earned on the deals closed by the bank as an actuarial premium, rather than as a derivative exposure to be synthetically replicated. The volatility of the CVA can surely absorb capital, but the effectiveness of the hedging set up to reduce it should be carefully evaluated and stress-tested.

2.3 FRTB CVA and Double Counting

The CVA framework relies on the changes introduced by the Fundamental Review of the Trading Book, yet it is quite independent from the calculation of the regulatory capital for market risk. We are aware that the Expected Shortfall (ES) for market risk and the ES for CVA volatility are computed with two quite different approaches, and that the latter needs a simulation up to the expiry of the longest contract, which is not strictly needed to determine the market ES. Nonetheless, it is obvious that the positive exposures increasing the counterparty risk could at the same time be compensating a decreasing risk on the market risk side, due to their positive impact on the NPVs.

The CVA framework should allow banks with sophisticated skills and strong IT computational capabilities to measure the ES on market and counterparty risks jointly, so that possible compensations between the risks are properly identified and measured.
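A stylized sketch of the compensation argument: when the market-risk P&L and the CVA P&L are negatively related (the exposure grows exactly when the NPV grows), the ES of the joint P&L is smaller than the sum of the two stand-alone charges. The distributions and the -0.6 correlation below are purely illustrative.

```python
import numpy as np

def es(pnl, alpha=0.975):
    """Expected shortfall: average loss beyond the alpha-quantile of losses."""
    losses = -pnl
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

rng = np.random.default_rng(1)
n = 100_000
mkt = rng.standard_normal(n)                                     # market-risk P&L
cva = -0.6 * mkt + np.sqrt(1 - 0.6**2) * rng.standard_normal(n)  # CVA P&L, corr -0.6

print(f"ES(market) + ES(CVA) = {es(mkt) + es(cva):.3f}")
print(f"ES(market + CVA)     = {es(mkt + cva):.3f}")  # lower: the risks partly offset
```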

2.4 The Risk Measure and Aggregation in the FRTB SA-CVA

The computational workflow

The SA approach follows the general strategy of making the standardized model more risk sensitive, by dealing rigorously with key concepts such as risk factors, sensitivities and the dependency/correlation structure. Generally, the parameters implied by this set-up are assigned by the Committee.

All the non-internal-model banks will be obliged to work intensively to switch from very simple standard models, which do not require sophisticated mapping and calculation procedures, to the new SA. We refer mainly to the Market, CCR and CVA capital charges.

In this framework, we think that the SA models should be more homogeneous in their “architecture”, to avoid small and medium banks getting confused in managing and calculating these new measures, or being obliged to maintain two or more systems at the same time for mapping and categorizing their risk factors.

As a simple example, let us compare the new SA-EAD for the CCR (paper 279) with the proposed SA-CVA. For the sake of simplicity, we refer briefly to the “Equity” asset class:

  • For CCR (EAD) purposes, a single-risk-factor model is prescribed with just one hedging set. Hence full offset is allowed within the same reference entity, while a correlation with the systematic factor is assigned: 50% for single names and 80% for indices
  • For the CVA, i.e., for capturing the (equity) exposure volatility, the workflow consists of 10 buckets given by a sector/geography taxonomy. The Delta and Vega CVA exposure sensitivities are then aggregated with a cross-bucket correlation r_ij of 15% between all pairs of buckets.

We could then easily build a counterexample, e.g., a portfolio of two equity derivatives belonging to different buckets, where the joint price movements of the two entities are taken into account differently for the PFE and for the CVA exposure effect; a stylized numeric comparison follows.
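The sketch below aggregates the same pair of single-name figures with the two rules: the SA-CCR single-factor formula of paper 279 (50% correlation to the systematic factor, hence an implied 25% pairwise correlation) and an SA-CVA-style two-bucket aggregation with the 15% cross-bucket coefficient. Both formulas are simplified readings of the two texts, and the inputs are arbitrary.

```python
import math

# a1, a2: stand-alone per-name figures (add-on or weighted sensitivity);
# arbitrary illustrative inputs
a1, a2 = 100.0, 100.0

# SA-CCR-style equity aggregation: one hedging set, rho = 50% to the
# systematic factor (as in paper 279)
rho = 0.50
sa_ccr = math.sqrt((rho * (a1 + a2))**2 + (1 - rho**2) * (a1**2 + a2**2))

# SA-CVA-style aggregation: two distinct buckets, 15% cross-bucket correlation
gamma = 0.15
sa_cva = math.sqrt(a1**2 + a2**2 + 2 * gamma * a1 * a2)

print(f"SA-CCR aggregate: {sa_ccr:7.2f}")  # implied pairwise correlation 25%
print(f"SA-CVA aggregate: {sa_cva:7.2f}")  # pairwise correlation 15%
```

The same joint movement of the two names is thus recognized with two different correlations, 25% in one framework and 15% in the other, for two measures that both describe the future value of the same portfolio.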

Correlation coefficients and Risk Weights

In the spirit of the SA calculation, we generally agree with the overall workflow. On the other hand, we suggest improving some of the current parameter values.

There are several possible examples, but for brevity we point out just a few simple cases:

  • The risk weight (RW) for Delta risk in the FX class is 15%, while the lowest RW in the Equity risk class is 30%. As is well known, the role of the RW is to move from a what-if measure (the sensitivity of the instrument to the risk factor) to the instrument volatility. From this set-up one could argue that the most volatile FX rate is half as volatile as the least volatile equity. We claim that this is not a realistic picture of market price volatilities
  • The cross-bucket correlation for FX is 0.6, while for Equity it is 0.15. Again, we find this not very accurate. It is often observed in financial markets that sectors move together within a macro area, irrespective of the size of the firms. In some cases, such as buckets (1,2,3,4) vs. bucket (9), i.e., large caps vs. small caps in emerging markets (see Annex 1, B.2), a 0.15 coefficient is too low and not conservative. On the other hand, a flat 0.6 between currencies is in some cases too high, even from a conservative perspective.

2.5 Computational Burden for the FRTB IMA-CVA

The CD proposes to compute the CVA capital charge daily, under different assumptions on the MPOR (par. 13) and on the liquidity horizons (par. 85) and, above all, separately and jointly for all the relevant risk factors (par. 86, 87, 88, 89). On the one hand, this is a huge increase of the computational burden with respect to the current requirements to calculate the IRC; on the other hand, even the more sophisticated banks that calculate the CVA for the trading desks managing it are unlikely to run daily as many computations as those implied by the proposal.
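A back-of-the-envelope count gives a sense of the orders of magnitude involved. Every figure below is an assumption chosen for illustration, not a number taken from the CD.

```python
# Rough count of daily pricing calls implied by the proposal; all inputs
# are illustrative assumptions.
scenarios        = 250      # ES scenarios (current and stressed calibrations)
risk_factor_runs = 5        # joint run plus per-category runs (par. 86-89)
horizons         = 2        # MPOR / liquidity-horizon variants (par. 13, 85)
paths            = 2_000    # exposure simulation paths per scenario
time_steps       = 100      # exposure grid up to the longest maturity
trades           = 50_000   # trades across all netting sets

calls = scenarios * risk_factor_runs * horizons * paths * time_steps * trades
print(f"pricing calls per day ~ {calls:.2e}")  # on the order of 10^13
```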

The IT technology to perform the required computations at a daily frequency certainly exists, but we suspect that the investments needed to upgrade existing systems would be massive even for the more advanced institutions.

We are not trying to minimize the complexity and the subtleties of the risks involved in the CVA (as the point above on correlation shows). We simply want to point out that a daily calculation is too high a frequency for most practical purposes. In our view, it would be better to relax the frequency in favour of a deeper analysis of the model risks, even beyond those explicitly considered in the CD.

References

[1] Basel Committee on Banking Supervision (2015), “Review of the Credit Valuation Adjustment Risk Framework”, available at http://www.bis.org/bcbs/publ/d325.htm

[2] Bonollo M. and Castagna A. (2015), “Comments to the Consultative Paper Review of the Credit Valuation Adjustment Framework”, available at http://www.bis.org/bcbs/publ/comments/d325/iason.pdf

[3] Bonollo M., Di Persio L., Oliva I. and Semmoloni A. (2015), “A Quantization Approach to the Counterparty Credit Exposure Estimation”, available at www.ssrn.com

[4] Capriotti L. (2011), “Fast Greeks by Algorithmic Differentiation”, Journal of Computational Finance, 14(3), 3-35.

