Dec 07 2018
 

“Can a machine think like a human being? Many say no. The problem is that it is a stupid question. Of course machines cannot think like people. A machine is different from a person and thinks differently. The interesting question is: just because something thinks differently from us, does that mean it is not thinking?”

In 1950 Alan Turing sought to explain how a computer could behave like a human being. His “imitation game” paved the way for what, about half a century later, would characterize the transformation of the economy into Industry 4.0, that is, an economy based on fully automated and interconnected industrial production.

It is precisely in Alan Turing’s idea that the working principle of chatbots resides: a phenomenon that is now growing strongly and that, according to an analysis by Gartner, will keep growing through 2020[1].

 

What chatbots are

Chatbots are software programs designed to hold a conversation with a user through text or voice messages (so-called NLP, Natural Language Processing).

Chatbots are powered by Artificial Intelligence algorithms. Artificial Intelligence is a discipline encompassing theories and techniques aimed at developing machines able to perform tasks and actions typical of human intelligence.

What distinguishes the various Artificial Intelligence products are the learning models, which can mainly be divided into Machine Learning and Deep Learning.

Machine Learning comprises the methods by which machines learn how to perform tasks, for example by analysing the results of, and correcting the errors in, their previous behaviour.

Deep Learning, on the other hand, aims to emulate the human mind through the programming of neural networks, taking inspiration from the way biological neurons work during learning and recognition.

 

How, where and when chatbots are applied

How – Chatbots are applied whenever a piece of information is being looked up, be it a code, a data item, a procedure, and so on. The clearer and more detailed the request, the faster and more precise the chatbot’s answer. Understanding the question and the underlying intent therefore plays a crucial role. Originally, chatbots were built with answers organized in logical paths and associated, via Machine Learning, with one or more keywords: if a keyword is present in the question, the chatbot identifies the path to follow to reach the requested information. Often a path has several branches, and that is when the chatbot asks a question to collect further elements and take the “right road”. It is important to keep in mind that every input a user feeds to a chatbot helps increase its ability to recognize the intent behind each question, thanks to Deep Learning on historical data. The growing volume of data has accelerated the evolution of chatbots, which today can understand the meaning of a question without relying on keywords.
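As a rough illustration of the keyword-and-branch mechanism just described (a sketch only, with hypothetical keywords and answers, not the implementation of any specific product):

```python
# Minimal sketch of keyword-based routing in a chatbot (hypothetical data).
# Each keyword maps either to a direct answer or to a follow-up question that
# selects one of several branches, as described in the text.
FLOWS = {
    "iban": {"answer": "You can find your IBAN on the account details page."},
    "card": {
        "question": "Do you want to block a card or request a new one?",
        "branches": {"block": "Your card has been blocked.",
                     "new": "A new card will be shipped within a few days."},
    },
}

def reply(user_message: str, followup: str = "") -> str:
    text = user_message.lower()
    for keyword, flow in FLOWS.items():
        if keyword in text:
            if "answer" in flow:                 # single-branch path
                return flow["answer"]
            if followup:                         # branch chosen by the user
                return flow["branches"].get(followup, flow["question"])
            return flow["question"]              # ask a question to disambiguate
    return "Sorry, I did not understand. Could you rephrase?"

print(reply("Where can I find my IBAN?"))
print(reply("I lost my card"))
print(reply("I lost my card", followup="block"))
```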

Where – More and more often, websites display an icon for asking for information via chat, in addition to phone and email contacts. Chatbots are usually deployed on these communication channels because of their advantages in cost, efficiency and quality over a traditional approach:

  • the spread of the internet means that users all over the world can access a site at any time, so 24/7 coverage must be guaranteed for any support request;
  • in most cases information requests concern recurring topics whose resolution can be handled automatically, so that flesh-and-blood operators can focus on the more unusual or complicated cases;
  • the acquisition and updating of the knowledge needed to guarantee reliable and timely answers happens through an iterative process of continuous improvement, unlike traditional customer care, where operators must be trained, constantly updated and replaced in case of absence or resignation.

When – In the Google era we are all used to searching for, and obtaining, an answer in a few seconds, and nobody is willing to wait any longer, or worse, to be put on hold, to get a piece of information. A de facto benchmark has been created against which the response times of any support request are measured, on both telephone and digital channels. Chatbots make it possible to manage communications in a quick question-and-answer mode, keeping user engagement high; the user often does not even realize they are interacting with a bot. But to what extent is it legitimate to use chatbots without the user’s knowledge? We believe that a user looking for information cares about getting a precise and timely answer rather than about the method used to provide it. After all, if the result of a multiplication is correct, who cares how it was computed?

 

Chatbots in the financial sector

The use of chatbots is a growing trend, driven not only by the deep digital transformation sweeping every industry but also by the need to find new ways of communicating with Millennials and Digital Natives.

Worldwide, many industries are intervening above all in the customer care area, where about €1,300 billion is spent every year to handle over 265 billion requests[2]. A study on the US market[3] estimates that chatbots will make it possible to cut customer care costs by 30%, generating benefits for the end customer and for the organization thanks to faster response times and a smaller backlog. In Financial Services, the potential of chatbots is not limited to evolving existing communication channels: it allows institutions to create new ways of interacting with customers, easing the transformation from the provision of financial services to a platform accessible 24/7 that supports multiple needs, no longer only financial ones.

Customers must be “educated” to interact with a bank that is one click away, through a language different from classic banking terminology, so as to stimulate the propensity to use digital functionalities. Chatbots play a key role in accelerating the development of customers’ financial culture: they can also act proactively, for example by flagging events on the current account (e.g. the crediting of an issued invoice) to “capture” attention and then propose further actions (e.g. a transfer to a time deposit) aimed at optimizing the customer’s position.

Ultimately, chatbots take on the hard task of supporting customers in simplifying banking operations that are on average complex, encouraging active management of their financial situation.

 

Conclusion

Chatbots represent a point of no return in the human-machine relationship. Until today, access to technology required skills that were increasingly elementary (e.g. 4-year-olds who are highly skilled iPad users) but still necessary (e.g. to watch cartoons you have to know how to open the app). From now on, human-machine interaction will evolve with dynamics much more similar to social ones: just as human beings learn from mistakes (made directly or handed down by history), so machines will learn from human mistakes. It is worth remembering that chatbots accumulate huge amounts of data from interactions with users and that a significant share of it is physiologically wrong; thanks to Deep Learning, however, anomalies are identified and isolated over time, allowing the machine to learn how to strengthen its knowledge model. A future in which chatbots know what we are about to ask before we do, and provide the answer even before the question, is not too far away.

 

Giacomo Mazzanti – Director Deloitte Consulting

Nicole Vismara – Manager Deloitte Consulting

Sonia Salotto – Consultant Deloitte Consulting

 

Notes

[1] “Cos’è l’Intelligenza Artificiale, perché tutti ne parlano e quali sono gli ambiti applicativi”, AI for Business, August 2018

[2] “How chatbots can help reduce customer service costs by 30%”, IBM, October 2017

[3] “The chatbots explainer”, BI Intelligence, 2016

Nov 24 2018
 

The Solvency II directive requires a review of the long-term guarantees (LTG) measures and the measures on equity risk by the end of 2020.*

The LTG measures are:

  • the extrapolation of risk-free interest rates,
  • the matching adjustment (MA),
  • the volatility adjustment (VA),
  • the extension of the recovery period in case of non-compliance with the Solvency Capital Requirement,
  • the transitional measure on the risk-free interest rates (TRFR)
  • the transitional measure on technical provisions (TTP).

The equity risk measures are:

  • the application of a symmetric adjustment mechanism to the equity risk charge (SA),
  • the duration-based equity risk sub-module.

783 insurance companies in the European Economic Area use at least one of these measures, accounting for 74% of technical provisions: 730 use the VA (66%), 163 the TTP (25%), 38 the MA (15%). 46% of all life insurance undertakings in Europe use the VA.

There are eight countries where no measure is used (mostly small central and eastern European countries, including Poland).

If the MA, VA, TRFR and TTP were removed, technical provisions at the European level would go up by €215 bn, own funds would fall by €164 bn and the SCR would go up by €73 bn. The average impact on the SCR ratio for undertakings using at least one measure at the European level would be -69%: Germany (-113%), UK (-108%), Denmark (-80%), Spain (-76%), Portugal (-66%). For Italy it would be almost insignificant: -9%. 11% of undertakings using these measures would fall below the SCR and 5% below the MCR.
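The mechanics are straightforward: the SCR ratio is own funds divided by the SCR, so removing the measures hits it from both sides. A minimal numerical sketch with hypothetical, made-up figures for a single undertaking (not taken from the EIOPA report):

```python
# Hypothetical single-undertaking example: removing the LTG measures
# lowers own funds and raises the SCR at the same time.
own_funds_with, scr_with = 20.0, 10.0          # EUR bn, with LTG measures
own_funds_without, scr_without = 16.0, 11.5    # EUR bn, without LTG measures

ratio_with = own_funds_with / scr_with             # 200%
ratio_without = own_funds_without / scr_without    # ~139%
print(f"SCR ratio with measures:    {ratio_with:.0%}")
print(f"SCR ratio without measures: {ratio_without:.0%}")
print(f"Impact: {(ratio_without - ratio_with) * 100:+.0f} percentage points")
```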

On average, undertakings that apply the MA, VA, TRFR or TTP hold bonds of lower credit quality (in particular corporate bonds and government bonds) than undertakings that do not apply any of these measures. The duration is also longer (in the UK it is more than double).

The MA is only used in Spain and the UK (53% of technical provisions); the average advantage in the UK is 110 bps, yielding an advantage of 89% on the SCR ratio. 42% of the companies (in Spain and the UK) using the MA would fall below 100% without it.

In ten countries (including the UK but not Italy), approval by the supervisory authority is required to apply the VA. In Italy 95% of technical provisions use the VA, compared with only 35% in the UK. The advantage in terms of SCR ratio for undertakings using the VA is largest in Denmark (80%), Germany (53%) and the Netherlands (49%), and quite limited in Italy and the UK (9% and 6%, respectively). The European average is 24%. 3% of undertakings (20) at the European level using the VA would be below 100%.

The size of the VA as at end-2016 for the euro (it applies to all euro-area countries) was 13 bps (22 bps in 2015). In the UK it was 30 bps. In four countries (Greece, Italy, Portugal, Spain) the country spread is more than double the currency spread but below 100 bps and therefore does not trigger the country-specific contribution (country-VA).
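As we read the activation rule described above, the country-specific increase of the VA only kicks in when the country spread both exceeds twice the currency spread and is above 100 bps; a minimal sketch with hypothetical spreads (illustrative only, not the full risk-correction machinery of the directive):

```python
# Country-specific VA activation condition, as described in the text
# (hypothetical spreads; the actual calculation uses risk-corrected spreads).
def country_va_applies(country_spread_bps: float, currency_spread_bps: float) -> bool:
    # Both conditions must hold: country spread > 2x currency spread AND > 100 bps.
    return country_spread_bps > 2 * currency_spread_bps and country_spread_bps > 100

print(country_va_applies(country_spread_bps=80, currency_spread_bps=30))   # False: above 2x but below 100 bps
print(country_va_applies(country_spread_bps=140, currency_spread_bps=30))  # True: both conditions met
```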

In the UK, 54% of technical provisions adopt the TTP (mostly life business), with a 47% improvement in the SCR ratio; the average effect for all companies at the European level adopting the TTP is 88%, with a huge advantage for French (139%), German (233%), Belgian (135%) and Spanish (107%) undertakings. 26% of undertakings (43) at the European level using the TTP would be below 100% (13 in Germany and the UK, 10 in Portugal).

The market share in technical provisions of undertakings using the TRFR is negligible. The effect of the equity risk measures is also negligible.

169 undertakings adopted the TTP or the TRFR; of these, 60 were required to provide a phase-in plan: retention of profits or earnings (27), raising new capital (16), reduction of the risk profile (20), change of product design (10), reduction of expenses (17).

It emerges that Italian undertakings do not take significant advantage of the long-term guarantees measures. By contrast, there are countries (Germany, the UK, Denmark, Spain, Portugal, Greece, Norway, the Netherlands) where these measures help significantly to cope with regulatory constraints.

 

*EIOPA (2017) Report on long-term guarantees measures and measures on equity risk.

Nov 15 2018
 

Upcoming challenges and new trends for financial institutions

Credit risk management in the banking industry has changed in recent years, mainly as a consequence of the stricter regulations following the financial crisis; further changes, whose magnitude and effects are mostly not predictable, will have an impact in the next decade on its role, scope and organization.

The definition, development and implementation of the interventions needed to keep pace with these changes will require significant investments by banks in terms of adoption of new technologies and redefinition of processes and organization, and will imply the need to overcome several challenges. These investments are expected to be compensated by economic returns in the medium to long run, through the possibility to re-allocate staff to more valuable activities, to provide managers with automatic and more comprehensive flows of information fundamental for their strategic decisions, and to obtain capital savings thanks to more predictive internal models.

The main trends currently affecting the risk management function, deemed to have an even more important impact in the near future, are related to regulatory topics, digitalization and practices optimization, business development and new risks prevention.

Evolution in the regulatory framework

Financial institutions have recently been facing more stringent regulation and have therefore significantly expanded their risk management functions. Among the latest regulatory developments affecting credit risk, in December 2017 the Basel Committee published the provisions revising and integrating the Basel III framework, frequently referred to as ‘Basel IV’. The new provisions mainly aim to further reduce the variability in the measurement of RWAs among banks of different sizes, operating under diverse regulatory frameworks and business models.

Besides ‘Basel IV’ reforms, in November 2017 the EBA published specific guidelines focused on modelling techniques for the estimation of IRB parameters for defaulted and non-defaulted exposures.

Additionally, it is worth mentioning the process currently being structured by the ECB to address the implementation by banks of the new definition of default. The new regulatory framework aims to harmonize the criteria for identifying default status at the European level, thereby minimizing the variability of RWAs.

Finally, the ongoing digitalization of banking processes will lead to further regulation issued for the purpose of governing and controlling these new fields.

Digitalization and practices optimization

Banks’ business and operational models have evolved in the last decade due to the process of digitalization, which implies the transformation of existing processes by leveraging digital technologies and data in order to create new value and opportunities. Digitalization represents, for the banking industry and for risk management, the most effective way to reduce costs in a context of persistent margin decline, which is a direct consequence of:

  • competition from aggressive FinTechs and from banks that are early adopters of new technologies, whose low-cost business models and automated processes enable them to provide customers with different kinds of offerings;
  • the low interest rate environment affecting the whole industry;
  • increasing regulation, which has caused risk management functions to grow in terms of staff and costs in recent years.

Among the tools that can help the risk management function compete successfully in this evolving framework, advanced analytics and big data systems have already started to prove their effectiveness:

  • advanced analytics: new technological and statistical tools (e.g. machine learning) capable of identifying complex patterns in richer datasets, enabling the estimation of more accurate and predictive internal models and the reduction of credit losses (a minimal illustration follows this list);
  • big data systems: data processing software enabling the analysis of greater amounts of structured and unstructured data in a faster way, made possible by the increased computational power of modern technologies.
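As a toy illustration of the first point (a sketch only, on a synthetic dataset with a binary default flag; real internal models require far more care in terms of validation, explainability and regulatory approval):

```python
# Toy credit-scoring sketch: gradient boosting on a synthetic, imbalanced dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic "loan book": 12 features, roughly 10% of observations flagged as defaults.
X, y = make_classification(n_samples=5000, n_features=12, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
pd_estimates = model.predict_proba(X_te)[:, 1]   # estimated probability of default
print("AUC:", round(roc_auc_score(y_te, pd_estimates), 3))
```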

Business development

Technological innovation has led customers to increasingly demand digital banking services, accessible at any time from any device, to support their everyday decisions. Social media and e-commerce have accustomed clients to personalization and to the prompt fulfilment of their requests. Consequently, it is fundamental for financial institutions to prioritize their digitalization efforts in order to keep abreast of the rapid developments in this area. The challenge for banks over the coming years is to be present in key moments of customers’ lives, anticipating and satisfying their financial needs while foreseeing variations in their purchasing preferences.

With this aim, advanced statistical algorithms and artificial intelligence systems applied to risk management models and processes will play a fundamental role in meeting customers’ expectations and facilitating business development, anticipating clients’ needs and providing customized solutions. Consequently, the risk function will be called upon to collaborate jointly with each business in order to respond to customers’ demands while limiting risks mainly related to the complexity of supporting processes.

New risks prevention

Besides regulated risk types such as credit, liquidity, market and operational risk, specific non-financial risks are emerging as a result of structural changes affecting financial institutions, including the greater complexity of models, the introduction of digitalization and automation in many processes, and the growth of interconnectedness among market players. In this light, new regulations are progressively addressing additional risk types, which are not properly new but which, given their growing importance and their impact on the financial system, deserve accurate management.

Among others, the increasing complexity of models stemming from the adoption of advanced analytics techniques entails the so-called model risk, which occurs when a model performs inadequately. It usually derives from underlying data quality or data management issues, which result in misleading outcomes and operational losses, or from incorrect model estimation and execution, intended as the technologies and processes provided to end users to effectively conduct daily operations. To this extent, the optimization of model risk management is becoming a core part of risk activities.

Regulatory expectations on the role of FinTechs

In light of the cross-sectoral transformation of the financial industry, the main Supervisory Authorities are carrying out exercises in order to gather information about the range of financial services provided and innovations applied by FinTechs. The main purpose of the Supervisors is to define the related regulatory treatment and the main areas of intervention, focusing on the following aspects: (1) accessibility of financial services to customers, (2) bringing down operational costs and increasing the efficiency of the financial services sector, (3) enhancing competition in the Single Market by lowering barriers to entry, and (4) balancing greater data sharing and transparency with data security and protection needs.

The main results gathered so far made it possible to identify follow-up initiatives and a roadmap at the European level for the coming years. In particular, regulators’ expectations are focused on promoting technological developments in order to enable opportunities within the FinTech perimeter, while it remains of primary importance to ensure consumer protection, as well as the integrity of financial markets, through sector-specific regulation to be issued.

Impact on credit risk management

In light of the latest evolutions in the regulatory framework, banks are facing, at the same time, an increase in compliance costs – financial institutions will be required to plan an impactful revision of their credit risk-measurement models, as well as of the related internal processes and IT systems – and a reduction in returns due to the rise in capital and liquidity requirements.

On the other side, digitalization represents an opportunity for the banking industry to reduce costs through the use of advanced data analytics and big data systems, enabling the development of more accurate and better-performing internal models and leveraging automation in order to speed up development times while reducing the need for manual input. To this end, the adoption of new technologies will certainly require the involvement of professionals with broader analytical skills.

Additionally, advanced statistical algorithms applied to credit risk management models and processes will play a fundamental role in meeting customers’ expectations and providing customized solutions, likely facilitating business development and making banks more competitive on the market, where FinTechs have become players of interest for clients. Digitalization will facilitate banks in the processes that rely on internal models, such as credit granting, management reporting and pricing policies. To this purpose, banks will be required to reshape their credit risk management functions, which will be called upon to work alongside several structures, such as the business, operations and finance departments; this collaboration will in turn spread and enhance credit risk culture in other strategic areas.

The effectiveness and timely response of banks to the above mentioned trends, and their ability to adapt business models and processes to the evolving environment, will determine their future competitive success. In this context, credit risk management will have the chance to play an important role as one of the leading functions in banks’ strategic change.

 

Antonio Arfè – Partner Deloitte Consulting

Francesco Zeigner – Partner Deloitte Consulting

Vincenzo Maria Cosenza – Senior Manager Deloitte Consulting

Nov 11 2018
 

On 2 November the EBA released the results of the new round of stress tests conducted during 2018, starting from balance-sheet data as of end-2017. The test covered 48 banks operating in 15 European countries and provided indications on the (expected) capital ratios in the baseline and in the adverse scenario over a horizon extending to 2020. For Italy, 4 banks were considered: Unicredit, Intesa-Sanpaolo, Ubi and Banco BPM.

As with the stress tests conducted in 2016, the EBA’s intent is not to explicitly single out the institutions that fail the test but rather to use the information within the Supervisory Review and Evaluation Process (SREP) to set the additional capital requirements asked of individual banks to account for their degree of riskiness.

The absence of a list of “winners and losers” allowed many banks to declare themselves among those that came through the test best, somewhat along the lines of Donald Trump, who trumpeted a victory in the Senate at the midterm elections while omitting to admit defeat in the House of Representatives.

The data presented in the EBA report indeed lend themselves to different interpretations depending on the capitalization indicator considered and the quantity observed. Looking at the fully loaded Common Equity Tier 1 (CET1) ratio (i.e. the ratio incorporating the effects of the full implementation of the Capital Requirements Regulation, the Capital Requirements Directive IV and the IFRS 9 accounting standard) and at the fully loaded leverage ratio gives two different pictures. The first indicator (the CET1 ratio) is influenced by the bank’s business model, by the riskiness of its assets and by the use of internal models for risk assessment. The second indicator (the leverage ratio), having a non-risk-weighted quantity in the denominator, has the merit of not being affected by the potential manipulation of risk-weighted assets (Barucci and Milani, 2018).
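In simplified terms (standard Basel III definitions, not quoted from the EBA report), the two indicators can be written as:

$$\text{CET1 ratio} = \frac{\text{CET1 capital}}{\text{RWA}}, \qquad \text{Leverage ratio} = \frac{\text{Tier 1 capital}}{\text{Total exposure measure}},$$

where RWA are risk-weighted assets and the exposure measure in the leverage ratio denominator is not risk-weighted, which is precisely why it is harder to manipulate.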

 

Chart 1. Effect of the stress scenario at 2020

Source: BEM Research elaborations on EBA data.

 

Chart 1 shows the effect at 2020 of the stress scenario on the two capital ratios, in basis points. The 48 banks considered are aggregated by country of origin. The chart shows that, on the basis of the CET1 ratio, the United Kingdom and Germany are the two systems that would suffer the biggest blows in the adverse scenario. On average, British banks would see their CET1 ratio fall by more than 600 basis points (bp), German banks by about 560. The effect would also be heavy for Irish, Finnish and Danish banks (about 500 bp). Italian banks, instead, would face an impact of about 350 bp, slightly below the overall average (370 bp).

On the basis of this evidence, some Italian newspapers ran headlines stressing that Italian banks are among the “winners” of the latest round of stress tests. This interpretation tells only part of the story. One only needs to look at the impact of the adverse scenario on the leverage ratio to find a partly different picture. On the basis of this indicator, Irish banks are the hardest hit (200 bp), followed by Finnish (150 bp), Austrian (140 bp) and British banks (130 bp). In this case Italian banks, with an average negative impact of 115 bp, are above the European average (90 bp), at a level not far from that observed for German institutions. We can take some comfort from the fact that the first pillar of regulation is still based on the CET1 ratio, but the figure cannot be ignored.

Even less reassuring is the picture that emerges when looking at the average levels of the CET1 ratio and of the leverage ratio obtained in the stress scenarios (chart 2). In this case Italian banks are on average among those showing, for both indicators, some of the lowest ratios in Europe.

 

Chart 2. Capital ratios at 2020 in the stressed scenario

Source: BEM Research elaborations on EBA data.

 

Looking at the detail for each bank, it is Banco BPM and UBI in particular that show low levels of both the CET1 ratio and the leverage ratio (charts 3 and 4).

Reading chart 1 and chart 2 together, we can deduce that Italian banks are indeed less exposed than other European banks to macroeconomic risks, but they are still undercapitalized: even a moderate change in the capital ratio therefore leaves them with a low capitalization level. On closer inspection, it is Great Britain that fares badly on both fronts.

It is worth noting in particular that the leverage ratio, in the stressed scenario, would fall below the 3% threshold set by Basel 3 in several European banks, including Deutsche Bank and Banco BPM. Only 21 of the 48 banks considered would have, in 2020, a post-stress leverage ratio above 5%, the threshold imposed by the Federal Reserve and the Federal Deposit Insurance Corporation (FDIC) on banks operating in the United States.

 

Chart 3. Fully loaded CET1 ratio at 2020 in the stressed scenario and deviations from the baseline

Source: BEM Research elaborations on EBA data.

Chart 4. Fully loaded leverage ratio at 2020 in the stressed scenario and deviations from the baseline

Source: BEM Research elaborations on EBA data.

All in all, the results of the 2018 stress tests do not look as reassuring as some commentators have claimed. Moreover, it should be noted that, as has often happened in the past, the assumptions adopted by the EBA to design the stress scenarios have been overtaken by reality, in two directions. For Italy, the adverse scenario assumed a sizeable drop in GDP (7 percentage points cumulated to 2020) but a BTP-Bund spread of 250 bp (Milani, 2018), whereas in the recent period the spread has touched 340 bp, hovering around 300 bp.

The “positive” news comes above all from the fact that the “negative” performance, in terms of capital ratio changes, of British, German and Nordic banks is due to the stress tests finally being able to “stress” level 2 and level 3 assets, which are held above all by banks in northern European countries. These are complex securities, hard to value, which were at the origin of the financial crisis and had been ignored by the EBA in previous exercises; see Barucci, Baviera and Milani (2018). On the other hand, the cleaning of NPLs from Italian banks’ balance sheets has certainly had a positive effect in making them less risky.

References

  • Barucci E., R. Baviera, C. Milani, The Comprehensive Assessment: What lessons can be learned?, The European Journal of Finance, 2018.
  • Barucci E., C. Milani, Do European banks manipulate risk weights?, International Review of Financial Analysis, Volume 59, pp. 47-57, North-Holland, 2018.
  • Milani C., Le principali caratteristiche degli stress test 2018, finriskalert.it, 12 February 2018.
Oct 19 2018
 

The economic literature has placed greater focus on the problems connected with credit risk, paying less attention to market risk. It should also be recalled that the first Basel Accord focused exclusively on the former type of risk. Following this line of thought, European banking supervision has concentrated mainly on the analysis of credit risk and its determinants, with particular attention to non-performing loans (NPLs).

The statements made by Danièle Nouy, chair of the ECB’s supervisory board, before the European Parliament are emblematic: “Level 2 and Level 3 positions consist, to a large degree, of hedging and client-related transactions providing financial services to the real economy and satisfying a demand from various economic agents”. In other words, the risks underlying these forms of financial assets are not deemed excessive. But what is the markets’ judgement on this?

A possible answer was provided in the CER Rapporto Banche 1/2018. To measure a bank’s degree of risk, an indicator widely used in the economic literature was considered, namely the Z-score. This index measures how far a bank is from default. What emerges, over the period from 2001 to 2016, is that banks belonging to the low-risk cluster have, both pre- and post-crisis, a greater concentration of their assets in the traditional lending business and, conversely, a lower incidence of financial assets (table 1). Medium-risk banks are instead the least active in lending and the most inclined to invest in financial assets, also for trading purposes. High-risk banks sit halfway between the other two clusters as regards specialization in the credit market and the size of their financial portfolio. Compared with the other two groups, however, their incidence of non-performing loans is much higher, both pre-crisis and even more so post-crisis.
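For reference, the standard definition of the Z-score used in the literature (the CER report may adopt a slightly different variant) is

$$Z = \frac{ROA + E/A}{\sigma_{ROA}},$$

where $ROA$ is the return on assets, $E/A$ the equity-to-assets ratio and $\sigma_{ROA}$ the standard deviation of ROA: the higher the Z-score, the further the bank is from default.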

 

Table 1. Listed European banks (risk distribution by business model)

Notes: cluster analysis based on median values.

Source: CER elaborations on Thomson-Reuters data.

This first analysis therefore shows that threats to banks’ financial stability can also arise, indeed above all, from market risk rather than predominantly from credit risk.

Among the financial assets that can generate high risks for European banks are the so-called level 2 and level 3 assets, types of financial assets characterized by strong opacity and complexity. Level 2 and 3 assets cannot be valued directly by observing market prices: their value must be inferred by analysing the behaviour of similar assets and/or by using elaborate internal valuation models. Looking at the data collected by Mediobanca on the largest European banks, Deutsche Bank turns out to be the bank with the largest share of level 2 and 3 assets in its portfolio (42% of total assets in 2016), followed by RBS and Barclays. Italian banks are not particularly inclined to invest in complex financial instruments: both Unicredit and Intesa-SanPaolo hold a share of total assets that is among the lowest in the sample.

European banks have continued to invest in speculative financial securities. Complex assets have indeed made it possible to generate a positive trading result, but not large enough to offset the decline in net interest income. The end result is therefore a loss of profitability for the banks that invested in speculative financial assets.

The approach taken by the large European banks, especially the French and German ones, has not differed from the one pursued in the pre-crisis period, with one important difference. While in the pre-crisis period financial speculation, carried out mainly by American banks, aimed to boost income sources, and consequently bank managers’ bonuses, beyond measure, in the more recent period investments in high-risk financial assets aim to limit the fall in the income statement. In other words, whereas in the past speculation was “aggressive”, today we observe “defensive” speculation, aimed at coping with low interest rates and at countering competition from fintech.

At the systemic level, this approach is contributing to global financial instability. A measure of systemic risk is offered by the New York University Stern Volatility Lab through SRISK, an indicator that measures the amount of capital that listed companies would need to withstand a stress scenario involving a 40% fall in global financial markets over 6 months. Analysing the relationship between risky assets, related to finance and credit, and SRISK, it emerges that the largest contribution to global financial instability is correlated with market risk. The incidence of non-performing loans, by contrast, does not appear to have significant effects on systemic risk (chart 1).
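For reference, the V-Lab measure for bank $i$ at time $t$ is usually written (in its standard form; we do not report the specific parameter values used) as

$$SRISK_{i,t} = \max\Big[0;\; k\,D_{i,t} - (1-k)\,(1 - LRMES_{i,t})\,W_{i,t}\Big],$$

where $D_{i,t}$ is the book value of debt, $W_{i,t}$ the market capitalization, $k$ a prudential capital ratio and $LRMES_{i,t}$ the long-run marginal expected shortfall, i.e. the expected equity loss in the 40% market-decline scenario.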

 

Chart 1. Systemic risk, complex financial assets and NPLs (data for the period 2014-2016)

Notes: sample of the 20 largest significant banks in Europe. Data expressed as % of the sample total.

Source: CER elaborations on Mediobanca data.

Underestimating market risk could have significant effects on the financial system. Higher systemic risk does not in fact push bank managers to hold more capital; paradoxically, the exact opposite happens: banks with more systemic risk hold a lower proportion of good-quality capital relative to total assets (chart 2).

Chart 2. Systemic risk and capital endowment (data for the period 2014-2016)

Notes: sample of the 20 largest significant banks in Europe. SRISK expressed as % of the sample total.

Source: CER elaborations on Mediobanca data.

 

This context should cause even more concern considering that international financial markets have for some time been signalling a certain degree of overheating. Among the various contributing causes are Trump’s aggressive trade policy, the turnaround in the stance of monetary policies, the excessive recourse to debt by emerging countries (especially through the shadow banking system) and, lastly, fears related to Italexit, i.e. Italy’s exit from the euro area.

Looking at stock market volatility in the United States and in the euro area, it is still contained, standing at levels observed in the phase preceding the 2007-2008 crisis, but at the same time it has been showing a strong upward trend for some months (chart 3). The spread between BBB-rated corporate bonds and government bonds gives similar indications, with historically very low levels both in the US and in Europe but a rapid reversal of the trend in the recent period (chart 4).

Chart 3. Stock market volatility

Source: CER elaborations on Thomson-Reuters data.

Chart 4. Spread between the yields of BBB-rated corporate bonds and government bonds

Source: CER elaborations on Thomson-Reuters data.

All in all, having neglected market risk while focusing mainly on credit risk could once again prove to be a problem for financial stability. Given the insufficient capital endowment of the banks most exposed to financial assets, especially when measured through capital ratios that cannot be manipulated (see Barucci and Milani, 2018), in the event of a crisis it would be inevitable to call once again for the intervention of national governments.

 

References

– Barucci E., Milani C., Do European banks manipulate risk weights?, International Review of Financial Analysis, Volume 59, pp. 47-57, North-Holland, 2018.

– CER, Rapporto Banche 1/2018.

Oct 11 2018
 

The revised Payment Services Directive (PSD2) aims to contribute to the development of the EU market for electronic payments, where consumers, retail operators and payment service providers will be able to enjoy advantages offered by the internal market of the European Union.

 

In particular, the new Community rules aim to:

  • stimulate competition by promoting innovative payment methods,
    • imposing a supervisory obligation also for suppliers of non-traditional payment systems (e-commerce payments)
    • reducing entry barriers for some types of payment service providers
    • forcing banks to allow Third Party Providers (TPPs) to access their infrastructures through standardized APIs (Application Programming Interfaces)
  • protect the consumer and improve security in the use of payment services,
    • providing for more transparent transaction costs and a ban on the applicability of “surcharging” to the customer in the case of electronic payments
    • improving authentication procedures and data protection measures
    • Increasing customer protection in case of unauthorized payments.

Below is a non-exhaustive list of some of the new rules of law.

  • The PSD2 introduces two types of TPP: the Payment Initiation Service Providers (PISP) and the Account Information Service Providers (AISP). Banks will be required to allow TPPs access to their back-end systems: in the first case to serve requests to initiate payment transactions₂,₃, in the second case to serve requests for information on the accounts held by their clients (with the clients’ authorization).
  • The PSD2 requires that “strong customer authentication” (SCA) measures be applied whenever, in carrying out payment transactions through traditional financial institutions or third-party providers, the service user:
    • accesses their payment account online
    • initiates an electronic payment transaction
    • performs, through remote channels, any action that may imply a risk of fraud or abuse.

These measures involve the use of at least two independent factors: “knowledge” (security question, password), “possession” (token, personal device or “digi-pass”), “inherence” (fingerprint, retina data) ₄,₅,₆.
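As a purely illustrative sketch of the two-independent-factors requirement (hypothetical factor names; the RTS do not prescribe any specific implementation):

```python
# Toy check that an authentication attempt uses at least two factors belonging
# to two *different* SCA categories (knowledge, possession, inherence).
FACTOR_CATEGORY = {            # hypothetical mapping of factors to categories
    "password": "knowledge",
    "security_question": "knowledge",
    "otp_token": "possession",
    "registered_device": "possession",
    "fingerprint": "inherence",
}

def is_strong_authentication(factors_used: list[str]) -> bool:
    categories = {FACTOR_CATEGORY[f] for f in factors_used if f in FACTOR_CATEGORY}
    return len(categories) >= 2    # at least two independent categories

print(is_strong_authentication(["password", "otp_token"]))          # True
print(is_strong_authentication(["password", "security_question"]))  # False: same category
```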

The publication of the RTS and the questions still open

The implementation of the PSD2 Directive has followed a complex path. One of the main steps along this path is certainly the publication of the Regulatory Technical Standards (RTS) on Strong Customer Authentication (SCA) and Common Secure Communication (CSC), which took place on March 13th, 2018. On June 13th 2018 the EBA published its Opinion on the “implementation of the RTS on SCA and CSC”₇.

The definition of the RTS and the related opinions are fundamental elements of the PSD2 framework, but these documents leave some important issues open.

The text of the RTS will apply from September 14th 2019, but as of March 14th 2019 the “Account Servicing Payment Service Providers” (ASPSPs) will have to make the technical specifications of their access interfaces available to TPPs and provide them with a test environment in which to test the applications that TPPs will use to offer services.

The RTS only specifies that the ASPSPs must ensure that their interfaces follow the communication standards issued by international or European standardization organizations.

The Commission, recognizing that the lack of detailed requirements may lead to application problems, proposed the creation of the Application Programming Interface Evaluation Group (API EG) to evaluate API specifications and ensure that they comply with the PSD2 and other applicable regulations (e.g. the General Data Protection Regulation – GDPR).

The recommendations issued by the API EG will aim to create harmonized market practices among EU Member States in order to reduce implementation time and costs for the actors involved.

Further open points with respect to the General Data Protection Regulation
The European General Data Protection Regulation (GDPR), which became enforceable in May this year, poses some additional questions regarding the PSD2, such as:

  • Determine who is responsible for obtaining consent from customers to enable banks to share their payment information with TPPs.

This is because PSD2 foresees that TPPs can directly access the customer’s payment account information, provided that they have the customer’s explicit consent, using the banks’ infrastructure to facilitate the provision of payment initiation or account information services.

Under the GDPR, banks are responsible for the processing of their customers’ data and are responsible for the purposes and the manner in which personal data are processed and shared.

PSD2 adds data protection requirements by stating that TPPs are permitted to access the information only for specific purposes “explicitly requested by the customer” related to the provision of account information or payment initiation services.

Therefore, considering these interacting requirements, it seems that while TPPs will likely initiate the process of securing customers’ consent, including consent for their own activities and use of the data once obtained, banks will ultimately remain responsible for confirming, or otherwise separately obtaining, the consent directly with their customers.

Furthermore, in its recent Opinion mentioned above, the EBA expressed its conviction that, if an AISP or a PISP provides services to a Payment Service User (PSU) on the basis of a contract signed by both parties, then the ASPSPs do not have to check consent: it suffices that the AISP and the PISP can rely on the authentication procedures provided by the ASPSP to the PSU when it comes to the expression of explicit consent. From our point of view, it is not clear how banks can verify the will of their customers and how the contractual obligations involve the bank. In this sense, a joint pronouncement by the EBA and the EDPB would be desirable.

  • Determine what constitutes “sensitive payment data”.
    The aforementioned RTS on SCA and CSC establish that banks must provide AISPs with the same information made available to the customer₈,₉ when the customer accesses his or her own account information directly, provided this information does not include “sensitive payment data”. Unfortunately, neither the RTS nor the PSD2 define the meaning of “sensitive payment data”, leaving to the discretion of banks the task of determining which data they consider sensitive.

The GDPR defines “personal data”, which it therefore protects, as any information relating to an identified or identifiable natural person. However, it also allows EU Member States to specify their own rules “for the processing of special categories of personal data (‘sensitive data’)”, defined as personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, as well as the processing of genetic and biometric data.

The risk, in the absence of specifications on this point, is that the rule will be interpreted in a less restrictive way, facilitating access to additional and unnecessary information with respect to the purposes indicated in the standard and increasing the risk of non-compliance.

It is necessary to change the pace
Further guidance by national and EU regulators is urgently needed on how companies can reconcile the requirements under PSD2 and the GDPR, both in the interim period and thereafter. It is desirable that companies manage the GDPR and PSD2 implementation programs in a coordinated way, taking into account reciprocal conditioning.

On the other hand, with the finalized RTS and the timing of implementation deadlines clarified, the companies should proceed quickly, clarifying their strategic positioning and then proceeding on the design and implementation of their communication interfaces, on SCA solutions, on the definition of operating models for the management of interaction with TPPs. All of these points will allow companies to face a rapidly-changing competitive environment such as the one enabled by PSD2₁₀.

David Mogini – Partner Deloitte Consulting

Michele Paolin – Partner Deloitte Consulting

 

Notes

  1. This publication has been written in general terms and we recommend that you obtain professional advice before acting or refraining from action on any of the contents of this publication. Deloitte LLP accepts no liability for any loss occasioned to any person acting or refraining from action as a result of any material in this publication
  2. The EBA also clarified in his Opinion issued on June 13th 2018 that PISPs have the right to initiate the same transactions that the ASPSP offers to its own PSUs, such as instant payments, batch payments, international payments, recurring transactions, payments set by national schemes and future-dated payments.
  3. The EBA also clarified in his Opinion issued on June 13th 2018 that PISPs have the right to initiate the same transactions that the ASPSP offers to its own PSUs, such as instant payments, batch payments, international payments, recurring transactions, payments set by national schemes and future-dated payments.
  4. The EBA also clarified in his Opinion issued on June 13th 2018 that the two factors in SCA need to belong to two different categories (the categories being knowledge, possession, inherence).
  5. The EBA also clarified in his Opinion issued on June 13th 2018 that SCA has to be applied to access to payment account information and to every payment initiation, including within a session in which SCA was performed to access the account data, unless an exemption under the RTS applies.
  6. The PSP applying SCA is the PSP that issues the personalised security credentials. Therefore, it is the same provider that decides whether or not to apply an exemption in the context of AIS and PIS. The ASPSP may, however, choose to contract with other providers such as wallet providers or PISPs and AISPs for them to conduct SCA on the ASPSP’s behalf and determine the liability between them.
  7. On June 13th 2018, EBA also published the document “Draft Guidelines on the conditions to be met to benefit from an exemption from contingency measures under Article 33(6) of Regulation (EU) 2018/389 (RTS on SCA &CSC)”.
  8. The EBA also clarified in its Opinion issued on June 13th 2018 that AISPs can access the maximum amount of data available to PSUs with regard to their payment account(s) held with a specific ASPSP, regardless of the electronic channel used to access it. I.e. if there are more data available through an online computer connection than through a mobile app, the AISP is able to access, via the ASPSP’s interface, the data available online on the computer, regardless of the channel used by the PSU to access the AISP.
  9. The scope of data to be shared with AISPs and PISPs by the ASPSP does not include the PSU’s identity (e.g. address, date of birth, social security number).

 


Oct 06 2018
 

Life insurance products are often characterized by a minimum guarantee: the insurance company manages funds guaranteeing a minimum return to policyholders (with profit products).

In a recent paper with Elisa Mastrogiacomo we investigated how the presence of a minimum guarantee affects the asset manager’s strategy assuming that the liability of the insurance company is partially charged to the asset manager, see also Dong He and Kou (2018) and Lin et al. (2017).

Usually, the funds of policyholders are pooled together in a segregated fund, which the insurance company manages in order to meet policyholders’ claims and lapses. The company is remunerated through a constant fee, plus either a fee that depends on the assets under management (AUM) of the fund (asset management fee) or a share of the surplus over the guarantee when it is positive, and zero otherwise (performance fee). In some cases, a combination of the two schemes is at work.

Life insurance products with a minimum guarantee imply that the insurance company bears a liability in case the fund falls below the guarantee. If this is the case, then the insurance company has to refund the performance gap to policyholders and, therefore, the company is short a put option written on the AUM of the fund. This type of contract affects how the insurance company manages the segregated fund.

We have investigated the asset management problem in a dynamic setting assuming that the payoff of the asset manager is made up of a constant fee, an asset management/performance fee and the liability in case the performance target is not reached. The guarantee is defined as a threshold on the AUM. We assume that the manager’s remuneration decreases in case the AUM falls below the guarantee threshold, so that the manager contributes to the loss of the insurance company, but the remuneration cannot become negative. Therefore, it is the insurance company, with its revenues from other activities or its capital, that ensures the payment of the minimum return to policyholders, while the manager only contributes to the loss in the sense that her remuneration is negatively affected if the minimum guarantee is not reached. We deal with both a stochastic and a constant risk-free interest rate.
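In stylized terms (our schematic reading of the payoff just described, not the exact specification of the paper), the manager’s remuneration at the horizon $T$ can be written as

$$w(A_T) = \max\Big\{0,\; c + f(A_T) - \alpha\,\max(G - A_T,\,0)\Big\},$$

where $A_T$ is the AUM at maturity, $G$ the guarantee threshold, $c$ the constant fee, $f(A_T)$ either an asset management fee proportional to $A_T$ or a performance fee proportional to $\max(A_T - G, 0)$, and $\alpha$ the share of the shortfall charged to the manager; the outer maximum captures the fact that the remuneration cannot become negative.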

We show that an asset management fee and a performance fee lead to similar investment strategies, with the latter yielding a lower level of risk exposure (investment in the risky asset). We show that the manager may invest in the risky asset even if the put option is in the money, i.e., when the AUM are below the guarantee threshold. In that region, the investment is hump shaped: when the put option is deep in the money (the AUM are far away from the threshold) the manager does not invest in the risky asset; as the AUM increase, the investment in the risky asset first increases and then decreases just below the guarantee, reaching a null investment at the level of AUM that allows the minimum guarantee to be reached. At that point there is a kink; beyond it, the investment in the risky asset increases again, converging towards the solution obtained without the constraint in the region where the guarantee is satisfied (the Merton solution). The strategy is depicted in the figure below: we consider a market with a risk-free asset (a bond) and a risky asset (a stock), and we plot the percentage of the AUM (1 = 100%) invested in the risky asset by the manager, setting the minimum guarantee threshold equal to 1.

If the company is also remunerated through a constant fee, then the investment strategy may be hump shaped above the threshold as well, yielding excess risk taking with respect to the Merton solution. This result confirms that, contrary to common wisdom, a remuneration based on a fixed fee leads to excess risk taking; see Ross (2004) and Barucci et al. (2018).

 

Bibliography

Emilio Barucci, Gaetano La Bua and Daniele Marazzina (2018) On relative performance, remuneration and risk taking of asset managers, forthcoming in Annals of Finance.

Emilio Barucci, Daniele Marazzina and Elisa Mastrogiacomo (2018) Optimal investment strategies with a minimum performance constraint, working paper.

Xue Dong He and Steven Kou (2018) Profit sharing in hedge funds, Mathematical Finance, 28: 50-81.

Hongcan Lin, David Saunders, and Chengguo Weng (2017) Optimal investment strategies for participating contracts, Insurance: Mathematics and Economics, 73: 137-155.

Stephen Ross (2004) Compensation, incentives, and the duality of risk aversion and riskiness, The Journal of Finance, 59: 207-225.

Sep 23 2018
 

What will be the impact of the technological wave of Artificial Intelligence (AI) and Machine Learning (ML), the pillars of Data Science, on the Wealth Management (WM) industry? The Fintech wave is certainly bringing, and will keep bringing, novelties to the mature and relatively little digitized WM industry. Perhaps the most electrifying part of this ongoing technological revolution concerns AI and ML. They are incredibly powerful tools that already solve a fair number of everyday problems, besides being bandied about (often in a completely random and inappropriate way) at Fintech conferences, having by now become a topic of casual conversation.

To understand the possible impact of ML and AI on the vast world of WM, let us first bring the subject into focus by clarifying the meaning of these words, which are often confused with one another and used inappropriately. Let us see, then, what lies behind the jargon of Data Science.

Artificial Intelligence, Machine Learning and Data Science

There is a neat distinction between ML and AI: ML has to do with inference, predictions, the detection of patterns hidden in data, automated reasoning and knowledge representation. It is about generating and condensing information that helps make better decisions. And making predictions. For example, exploiting Big Data to identify new customer clusters and new needs to which products can be matched, creating the right mapping, or analysing and simulating the client conversion funnel... one could go on for pages, because the applications are extremely broad.
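As a toy example of the customer-clustering use case just mentioned (synthetic, made-up features; no real client data or any specific provider’s segmentation involved):

```python
# Toy client segmentation: k-means on a few synthetic customer features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features per client: age, invested assets (kEUR), logins per month.
clients = np.column_stack([
    rng.integers(25, 80, size=500),
    rng.lognormal(mean=4.0, sigma=1.0, size=500),
    rng.poisson(lam=6, size=500),
])

X = StandardScaler().fit_transform(clients)      # put features on a comparable scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print("Clients per cluster:", np.bincount(labels))
```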

L’AI, intesa in senso stretto, è qualcosa di più del ML. L’AI è un computer che agisce come un Homo sapiens. Pertanto occorre che la macchina sappia interagire con gli umani utilizzando gli strumenti del Natural Language Processing, sfruttando al contempo gli algoritmi di Machine Learning . Le applicazioni tipiche sono i chatbot e gli assistenti virtuali.

Figura 1 – Strettamente parlando l’Intelligenza Artificiale differisce dal Machine Learning per l’uso del Natural Language Processing (NPL) – vale a dire metodi di riconoscimento vocale, analisi testuale automatica, comprensione generazione di testi e discorsi –consentendo a un computer di agire come un umano.

Spesso l’idea di AI è però più ampia, e coincide con l’idea di Data Science, area multidisciplinare che si colloca all’interesezione di statistica, computer science, scienze sociali e Data Visualization (DataViz), senza dimenticare le conoscenze di dominio in termini di business (cioè la Business Intelligence). Al crescere dell’automazione nell’industria finanziaria si è iniziato a parlare di Financial Data Science: essenzialmente, e in modo non particolarmente sorprendente, applicazioni di Data Science alla finanza.

Nella pratica, tutte queste distinzioni pseudo-formali lasciano il tempo che trovano: la pratica consiste nell’utilizzare i dati a disposizione e tutti gli algoritmi e le tecniche di analisi note per estrarre valore dai dati. Occorono informazioni spendibili, che aiutano concretamente a generare più ricavi e/o ridurre costi. Il resto non conta.

Figure 2 – Financial Data Science: an interdisciplinary area requiring different skills.

Financial Data Science: the incumbents' advantage

With the rise of data-driven business models, the incumbents of the savings-management industry, the traditional players, greatly fear the technology giants: Amazon, Alphabet/Google, Facebook and Apple above all. Understandably so. They have enormous user bases. And they know them perfectly, thanks to their ability to analyse data – the frontier of Data Science is certainly not in universities (sorry), but there. When Big Tech decides to enter the savings and investment market in force, there will be a battle. Traditional financial intermediaries, however, have several weapons at their disposal.

First advantage: the data

Practically every sector of the economy has access to an amount of data that was unimaginable even a decade ago – and the WM industry is no exception. Banks, insurers and asset managers hold plenty of data from which enormously valuable information can be extracted through Financial Data Science.

Intermediaries that manage investments typically hold the following types of data:

  • financial data, on clients' current and past positions and transactions and on payment flows – data from which information can be derived on investment dynamics as well as on consumption/saving habits;
  • socio-demographic data, such as age, place of birth and residence, gender, family situation and so on, essential, for example, to frame the client's investment life-cycle;
  • the answers to the MiFID questionnaire (which, if well designed and correctly completed, is a gold mine of information), crucial for extracting the client's financial DNA;
  • client–intermediary interaction data, such as website usage, newsletter opens, app usage and phone conversations (which, I recall, must be kept for five years and from which attitude and sentiment indicators can be extracted).

Even without further data (from social media such as LinkedIn, Facebook or Twitter, or from specific "smart engagement" activities such as quizzes, gaming and so on), this is clearly a remarkable information asset.

First of all, this information is extremely rich because it is specific: it concerns the economic and financial sphere. And since we are talking about savings and investments, it is obviously far more relevant than the passion for cute kittens or the meme of the moment that can be found on Instagram and Facebook.

It is also a data set that can be "augmented", without any great leap of imagination, by crossing it with various external data sources, first of all financial market data, economic data and news. Then, if desired, there are various "alternative data sets", for instance those related to sentiment analysis or geospatial data.

An even more considerable advantage is that, from a GDPR standpoint, intermediaries are fully entitled to crunch the data they hold (a possession that is fully authorised, which may not always be true for Big Tech), since these are data pertaining to the financial sphere, used to solve the financial problems underlying the contractual relationship.

But how, concretely, can AI and ML help the wealth management industry? The applications are numerous and can impact the entire value chain. For example, with the support of data and algorithms, one can tackle and solve problems such as:

  • identifying and understanding clients' financial needs and their real objectives;
  • building predictive models of client behaviour (for instance, whether or not a client will buy a given financial or insurance product) – see the sketch after this list;
  • identifying clients who are currently small but have high growth potential;
  • improving client segmentation, offering personalised investment solutions and ancillary services and improving the user experience at very low cost;
  • identifying which financial needs a given product satisfies, and for which objectives it is advisable;
  • supporting networks of financial advisors, agents and other relationship managers with targeted client information and recommendation systems for the best solutions to offer each client, based on their specific needs and characteristics;
  • managing compliance in real time;
  • simulating the impact of market events on processes, assets under management, costs and margins – probably the most useful form of risk management for a wealth manager (much more so than computing a one-week 99% VaR on client portfolios);
  • capturing and analysing new data sources.
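
As a minimal, hedged sketch of the second bullet above (a propensity model for whether a client will buy a given product), assuming scikit-learn and using synthetic placeholder features rather than real client data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for the internal data described in the text
# (positions, MiFID answers, interaction logs); feature names are hypothetical.
rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.integers(25, 80, n),       # age
    rng.lognormal(10, 1, n),       # assets under management
    rng.integers(0, 2, n),         # opened the last newsletter (0/1)
    rng.integers(1, 8, n),         # MiFID risk profile (1-7)
])
y = (rng.random(n) < 0.10 + 0.05 * X[:, 2]).astype(int)   # toy purchase flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("out-of-sample AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```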

Figure 3 – Data should be at the heart of the wealth management process, supporting and steering all the main actions.

Financial firms can therefore extract tangible value from numerous internal and external data sources using the tools of Data Science. However, the quantity of data is not the most relevant aspect.

Second advantage: domain knowledge

Data are the raw material, of course. And their quantity matters, but their quality matters far more. The business relevance of the data is strategic: in ML problems related to savings management, indiscriminately throwing data into the bin of supervised or unsupervised algorithms is rarely a good idea. The financial sector is heavily regulated, with professional practices built on strong, well-established logic: it would be folly not to include this "structural" information in the number-crunching process. Feature selection, i.e. the choice of input variables, is crucial if one wants to avoid algorithms that perform well in training but are incomprehensible and perform poorly in day-to-day operations.

Understanding the business behind and around the data matters far more than the technical solution, i.e. developing complex models that are ends in themselves, and it must be taken into account when building the models. In our experience (this is much of what Virtual B SpA does), very large black-box models tend towards overfitting and data snooping: terms which, in essence, mean that the model has not really grasped the logic of the problem it is meant to solve, but has instead performed a highly sophisticated interpolation with little predictive value. The more data there are, the more complex the model, and the less domain knowledge one has, the more likely, insidious and hard to detect overfitting becomes.
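
To make the overfitting point concrete, here is a minimal sketch (synthetic data, scikit-learn assumed): an unconstrained model looks perfect in training but gives much of that back out of sample, while a constrained one degrades far less.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Many features, few of them informative: fertile ground for overfitting.
X, y = make_classification(n_samples=800, n_features=200, n_informative=10,
                           random_state=0)

models = {
    "unconstrained": DecisionTreeClassifier(max_depth=None, random_state=0),
    "constrained":   DecisionTreeClassifier(max_depth=3, random_state=0),
}
for name, model in models.items():
    train_acc = model.fit(X, y).score(X, y)             # in-sample accuracy
    cv_acc = cross_val_score(model, X, y, cv=5).mean()  # out-of-sample estimate
    print(f"{name:13s} train: {train_acc:.2f}  cross-validated: {cv_acc:.2f}")
```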

Moreover, for many algorithmic applications related to savings management, regulators want to be able to "look through", i.e. open up the model to understand its logic and causal links. Reasonably so. In such cases, having run a deep learning analysis, for example, by hurling every possible piece of information into the black box, is not exactly a great idea.

This is why domain knowledge is fundamental if one wants to use Data Science for concrete, measurable actions with a high return on technology investments. This is a huge advantage for financial intermediaries. This is the first article in a series: in the next ones we will look at some concrete applications.

Take home

Financial intermediaries already hold highly valuable data assets. The point is to exploit them with intelligence and practical sense, putting to work the tools offered by Financial Data Science, that is, Machine Learning algorithms and everything broadly labelled as Artificial Intelligence.

Financial Data Science, in short, dramatically boosts efficiency and scalability. These translate into higher productivity. That is, better margins. These days, that is not bad at all.

Set 132018
 

The financial system is undergoing an increasingly important, pervasive and disruptive technological evolution.

Most financial market players are evolving their business models in a FinTech direction, leveraging more efficient, independent and flexible technology that is able to learn through an iterative process of self-learning and error correction.

Since the 2008 financial crisis, supervisory controls by Regulators have become more pervasive, detailed and intransigent, with cumulative penalties of roughly $200 bn since 2008 for failures to comply with regulatory requirements (Douglas, Janos, & Ross, 2016).

The fear of an overly light-touch supervision of banking activities, along with the worsening health of financial markets, has led to an enlarged scope of regulatory requirements and a greater effort in reporting, data disclosure and data quality processes.

RegTech addresses these needs by introducing, in a cost-effective and simple way, technology that offers flexibility in producing data and reports, automation of data quality, and improved data management for analysis and inspection purposes (e.g. cognitive learning applied to the RAF).

However, RegTech implies significant changes in the compliance approach of banking institutions and consequently poses new challenges to Regulators' infrastructural capabilities. Supervisors are part of this disruptive process: they have to acquire technological and analytical tools able to process and analyze the increasing amount of data they request.

 

RegTech: an evolving framework

Regulatory developments in action

New regulations, introduced as a consequence of the recent financial crisis, have increased controls on financial institutions, both in terms of prudential bank capitalization (minimum capital requirements for operational, credit and market risks under Basel III and the FRTB) and of data disclosure towards Regulators and customers (MiFID II, PSD 2).

The rationale behind these changes is to establish a homogeneous capitalization scheme across all banks, giving the Supervisor the chance to compare and efficiently aggregate banking risks and to obtain an overall picture of the banking system.

In this context, standard models, as well as tests under stressed conditions, will become mandatory for all banks (including those that use internal models, necessarily adapted to their own needs) and will represent the regulatory floor for RWAs. For this purpose, a set of system-wide exercises (EBA Stress Tests) and inspections (TRIM) has been set up.

In this fast-evolving framework, banking institutions are experiencing an inevitable growth in the burden of analysis, reporting and public disclosure of their status towards Supervisors, resulting in greater expenditure and IT architecture developments.

Digitalization: FinTech transformation in the FSI industry

In order to grasp the new opportunities available on the market, the financial industry is progressively increasing its use of technology, at different levels, in the various areas in which it operates, for example payment services, digital banking and lending.

These new technologies applied to the financial world are now collectively called FinTech. Different sub-sectors of the financial industry can benefit from the development and application of areas such as:

  • Artificial Intelligence
  • Blockchain
  • Cyber Security
  • BigData

RegTech Information

RegTech was first defined by the FCA (Financial Conduct Authority) (Imogen, Gray, & Gavin, 2016) as "a subset of FinTech that focuses on technologies that may facilitate the delivery of regulatory requirements more efficiently and effectively than existing capabilities". It addresses both banking institutions' need to produce reports for regulatory requests as quickly as possible and the creation of a new framework between Regulator and financial institution, driven by collaboration and mutual efficiency.

RegTech is becoming the tool for greater information sharing between the parties, reducing the time spent producing and verifying data and enabling joint analyses, both current and prospective, through shared skills.

 

Implications for banking institutions

RegTech will allow institutions to develop a new way of communicating their data to the control authorities and to the whole financial system, by exploiting more efficient Risk Management and advanced Compliance management.

Digital Compliance

The RegTech Council (Groenfeldt, 2018) has estimated that banking institutions spend on average 4% of their revenues on regulatory compliance activities and that, by 2022, this share will increase by around 10%. In this area, the transition to advanced, digital compliance management would bring cost savings of around 5%. As for the new regulations introduced in the trading and post-trading areas, RegTech would help manage the huge amount of data on transactions, KPIs, market data and personal data related to customer profiling.

In particular, a Thomson Reuters survey (Thomson Reuters, n.d.) estimated that in 2017 the process of checking new customers' information took around 26 days. The cost of Customer Due Diligence for intermediaries averages around $40 million per year. This is due to the inefficiency of current processes, to the growth in FTEs required by the increasing controls defined by Regulators, and to the losses incurred when customer profiling practices, which are extremely slow and complex, come to a halt.

Risk Management 2.0

In the last few years, the role of Risk Management has significantly changed, from a static supervision of Front Office activities to a dynamic and integrated framework involving all the Bank's divisions.

However, the evolution of Risk Management involves not only a change of approach but also a substantial revolution of the IT architecture.

The main trend is the creation of a fully integrated risk ecosystem able to feed itself from the many banking systems, performing checks and regular data monitoring through cognitive learning.

The shift from independent risk management to a completely centralized one yields advantages such as:

  • cost reduction through a single architecture;
  • greater flexibility in updates and evolution;
  • reduced effort (FTE) for intra-system reconciliation;
  • greater uniformity in compliance checks;
  • standardization of information sources for all disclosures, maintaining consistency.

Nowadays, considering the state of the RegTech area, populated by many start-ups and diverse solutions without a consolidated best practice, there are some barriers to its full implementation:

  • the financial industry's preference for business investments over innovation;
  • the required investments cannot build on previous investments in Compliance;
  • continuous, still ongoing changes in regulatory requirements;
  • the challenge for RegTech start-ups of interfacing with IT structures that may not be adequate for the existing systems.

Implications for the Regulators

The financial system increasingly focuses its attention on the supervisory authorities, demanding a reduction in costs and complexity together with higher quality of the controls put in place.

If, on the one hand, institutions try to apply FinTech innovations to their own disclosure activity, on the other hand it is advisable that Regulators invest in similar technological innovations in order to manage the considerable amount of data required by the new regulations.

The potential benefits are:

  • creation of a preventive compliance system designed to anticipate any breach;
  • real-time analyses and checks rather than only historical ones;
  • the possibility of carrying out more complete analyses on a wider data panel, not only on pre-aggregated figures lacking the underlying details;
  • simple tools that increase the information level while reducing the overall effort (e.g. fingerprint access to trading platforms).

All of this also brings benefits to the entire financial system:

  • defining frameworks at national and, especially, international level so as to progressively reduce potential arbitrage between markets;
  • increasing the flexibility (for both banking institutions and Regulators) to analyze different sets of data, avoiding the development costs and implementation time that such changes would otherwise entail;
  • making the compliance architecture a useful analysis tool to monitor the impact of new regulation through ad-hoc scenario analyses.

Conclusions

RegTech represents one of the most important challenges for the financial system, capable of structurally changing global financial markets.

Despite the entry barriers to the use of technology in a regulatory environment, banking institutions will necessarily have to evolve the way they relate to Regulators, making the process less costly and more efficient while maintaining their competitiveness on the market.

In this regard, the approach suggested by the literature is to set up pilot or "sandbox" cases in order to adopt the innovation gradually without causing potentially negative impacts.

In particular, institutions need such applications in the trading field, where the new regulations require a quantity of data and a computational speed that cannot be sustained manually.

In addition, the Supervisory Authorities want to avoid the situation in which they hold the information needed for supervisory purposes but are unable to analyze it promptly and correctly. On this front, the Supervisory Authorities are somewhat more static, still anchored to the old paradigm and to the review of the main regulations.

 

Alberto Capizzano – Director Deloitte Consulting

Silvia Manera – Manager Deloitte Consulting

Bibliography

Douglas, W. A., Janos, N. B., & Ross, P. B. (2016, October 1). FinTech, RegTech and the Reconceptualization of Financial Regulation. Northwestern Journal of International Law & Business, Forthcoming, p. 51.

Groenfeldt, T. (2018, April 4). Understanding MiFID II’s 30,000 Pages Of Regulation Requires RegTech. Forbes, p. 1.

Imogen, G., Gray, J., & Gavin, P. (2016). FCA outlines approach to RegTech. Fintech, United Kingdom, 1.

Thomson Reuters (n.d.). KYC onboarding still a pain point for financial institutions. https://blogs.thomsonreuters.com/financial-risk/risk-management-and-compliance/kyc-onboarding-still-a-pain-point-for-financial-institutions/

 

Set 082018
 

Under Solvency II (SII), insurance and reinsurance companies have to value their assets and liabilities following harmonized principles, among which the discounting of the liability cash flows with a risk-free interest rate yield curve that EIOPA has been publishing on a monthly basis since February 2015. The Authority says nothing further about the future evolution of the risk-free yield curve or the volatility it should exhibit, which matter a great deal in the determination of the liability value.

If the liabilities had no optionality, their present value could easily be calculated by discounting the cash flows along the deterministic yield curve provided by EIOPA; in truth, insurance products usually offer a guarantee on a minimum level of return credited to the policyholder fund. This optionality must be priced in a risk-neutral framework, typically by means of Monte Carlo simulations. Whatever model is chosen to project the shape of interest rates in the future, it must satisfy the risk-neutrality principle (i.e. all assets are expected to earn, on average, the risk-free rate). Roughly speaking, this means that if we take an asset worth X euros in t=0, capitalize it and discount it back over the N paths of projected interest rates, the average of the N valuations should be X.

However, liability cash flows are not that simple, and the volatility of the projected rates plays an important role in defining the value of the optionality they embed. The Interest Rate (IR) model adopted is calibrated so that the projected rates are Market Consistent (MC), that is, capable of replicating some Implied Volatilities (IVs) quoted in the market. The question is: which IV should be targeted? Note that we move from the volatility of the interest rates used to price the optionality (and therefore the IV) of the liabilities to a different IV, which comes from the market, from completely different types of options. Another important point about the market is that the price of an instrument is unique, while its IV depends on certain assumptions: it is implied by a specific formula. A final key element is that prices are the only quantity obtainable via Monte Carlo simulation, while the IV must be derived from them.
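
A minimal sketch of the risk-neutrality check described above, under illustrative assumptions (a flat risk-free rate, a single lognormal asset, annual steps), not any undertaking's actual scenario generator:

```python
import numpy as np

# Martingale test: project an asset at the risk-free drift over N scenarios,
# discount the terminal value back, and check that the average equals X.
rng = np.random.default_rng(0)
N, T = 100_000, 30        # number of scenarios, projection horizon in years
r = 0.01                  # flat risk-free rate (illustrative assumption)
sigma = 0.20              # asset volatility (illustrative assumption)
X = 100.0                 # value of the asset in t = 0

z = rng.standard_normal((N, T))
log_returns = (r - 0.5 * sigma**2) + sigma * z      # annual risk-neutral log-returns
S_T = X * np.exp(log_returns.sum(axis=1))           # terminal values per scenario
discounted = S_T * np.exp(-r * T)

print(f"average discounted value: {discounted.mean():.2f} (should be close to {X})")
```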

The SII regulation does not provide any answer, and the choice is left to the insurance and reinsurance undertakings, which can select the IR model they prefer and its market target for calibration purposes. In fact, the Standard Formula does not even consider IR volatility as a risk.

The IVs quoted in the market are those of Cap, Floor and Swaption contracts, all three being defined over an Interest Rate Swap (IRS). The IRS is a derivative instrument in which two parties agree to exchange interest rate cash flows, from a fixed rate K to a floating rate L, or vice versa. The IRS is called payer/receiver when the fixed rate is paid/received. The fixed rate that makes the contract fair is called the forward swap rate and, assuming annual payments, is equal to:

$$ S(T, T+n) = \frac{P(0,T) - P(0,T+n)}{\sum_{i=1}^{n} P(0, T+i)} $$

where T is the time when the contract starts, n the number of years it lasts and P(0, t) the discount factor for maturity t.
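
For illustration only, the forward swap rate above can be computed from a discount curve as in the following sketch, assuming annual fixed-leg payments and a toy flat zero curve:

```python
import numpy as np

def discount(t, zero_rate=0.015):
    """Toy discount factor P(0, t) from a flat zero curve (illustrative)."""
    return np.exp(-zero_rate * t)

def forward_swap_rate(T, n):
    """S(T, T+n): the fixed rate making a swap starting in T with tenor n fair."""
    annuity = sum(discount(T + i) for i in range(1, n + 1))
    return (discount(T) - discount(T + n)) / annuity

print(forward_swap_rate(T=5, n=10))   # e.g. the 5y-into-10y forward swap rate
```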

Caps and Floors can be seen respectively as options on a payer/receiver IRS, where the money exchange takes place in favorable circumstances only. The present value of the payoff of a Cap (with each expectation taken under the appropriate forward measure) is:

$$ \mathrm{Cap}(0) = \sum_{i=1}^{n} P(0, T_i)\, \mathbb{E}\big[\max\big(0,\, L(T_{i-1}, T_i) - K\big)\big] $$

Caps can be used for hedging purposes when one is a debtor of the floating rate L. Indeed, when holding such a contract, the overall exposure becomes L − max(0, L − K) = min(L, K), which does not exceed the fixed rate K. Caps and Floors are made up of sequences of Caplets and Floorlets, each defined over a certain period and referring to a certain forward rate. A Cap contract is said to be ITM (In), ATM (At) or OTM (Out of The Money) when K is respectively <, = or > the fair (ATM) level, and the difference between K and the fair value is called moneyness, so

$$ \text{moneyness} = K - S(T, T+n) $$

A payer/receiver Swaption contract is an option granting its owner the right, but not the obligation, to enter into an underlying payer/receiver IRS of tenor n. The value of a payer Swaption (with the expectation taken under the swap measure associated with the annuity) is:

$$ \mathrm{Swaption}(0) = \Big(\sum_{i=1}^{n} P(0, T+i)\Big)\, \mathbb{E}\big[\max\big(0,\, S(T, T+n) - K\big)\big] $$

Because of Jensen's inequality, this value is never greater than the corresponding Cap value. Like the Cap, a payer Swaption is said to be ITM, ATM or OTM when the strike K is respectively <, = or > the forward swap rate S(T, T+n).

Until the recent past (say, before negative rates appeared), both Caps/Floors (seen as sums of Caplets and Floorlets) and Swaptions were priced with the Black formula, assuming a lognormal distribution for the forward rates and the forward swap rate respectively. Although widely used in the market, the two pricings were inconsistent: the forward swap rate is defined as a weighted average of the forward rates, and a sum of lognormally distributed random variables is not itself lognormal (i.e. the two distributional assumptions cannot hold true together). Since the price of a Swaption depends on the forward swap rate, knowing the volatilities of the individual forward rates is no longer sufficient for its valuation; information about the joint distribution of the forward rates and the correlation between different maturities is needed. Because their price embeds more information, Swaption contracts are often preferred to Caps when setting the market target for the calibration of IR models.

Given a unique price of a Swaption, there are three types of IV quoted in the market, based on different assumptions on the distribution (Lognormal, Displaced Lognormal or Normal) of the forward swap rate:

  • LN-SIV are defined as a relative change of the forward swap rate (bigger changes for higher YC)

  • DLN-SIV are defined as a relative change of the displaced forward swap rate

  • N-SIV are defined as an absolute change of the forward swap rate, independent of its level

All three are linked through the value of the underlying forward swap rate and, roughly speaking, there are two orders of magnitude between the former and the latter. Even though the relationship is not a simple rescaling and is not linear, a good approximation to keep in mind is:

$$ \sigma_{N} \approx \sigma_{LN} \cdot S(T, T+n) $$
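
The approximation can be checked numerically: the sketch below (illustrative inputs, scipy assumed) prices an ATM payer Swaption with the Black formula, backs out the normal (Bachelier) IV that reproduces the same price, and compares it with sigma_LN * S(T, T+n).

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

S, T, annuity = 0.02, 5.0, 8.0   # ATM forward swap rate, option maturity, annuity (toy values)
sigma_ln = 0.30                  # lognormal (Black) implied volatility

def black_atm(sigma):
    # ATM payer Swaption under Black: A * S * (2*Phi(sigma*sqrt(T)/2) - 1)
    return annuity * S * (2.0 * norm.cdf(0.5 * sigma * np.sqrt(T)) - 1.0)

def bachelier_atm(sigma_n):
    # ATM payer Swaption under Bachelier: A * sigma_n * sqrt(T) * phi(0)
    return annuity * sigma_n * np.sqrt(T) * norm.pdf(0.0)

price = black_atm(sigma_ln)
sigma_n = brentq(lambda s: bachelier_atm(s) - price, 1e-8, 1.0)  # matching normal IV

print(f"Black price         : {price:.6f}")
print(f"exact normal IV     : {sigma_n:.6f}")
print(f"approx sigma_LN * S : {sigma_ln * S:.6f}")
```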

The recent market environment, characterized by a number of tenors with negative rates, has called the use of LN-IV into question: negative or close-to-zero forward swap rates turn into undefined or extreme IVs, which makes the calibration of any IR model very challenging and potentially inaccurate.

N-IVs have started to catch on, with the additional benefit of mitigating the distortion that comes from applying a market IV to a Yield Curve (YC) different from the market one. Indeed, the IVs quoted in the market refer to the market swap YC, while the IR models used for the liability valuation are based on the published EIOPA YC (much higher in the long term, due to the convergence to the UFR). The misalignment of applying a certain IV (which refers to the market YC) to a higher YC (the EIOPA one) is exacerbated when LN-IVs are adopted, as they are proportional to the level of the rates. Using N-IVs in place of LN ones helps tame the volatility embedded in the projected rates, which affects the TVOG and normally increases the BEL value. The following picture clarifies the point by comparing the market ATM LN Swaption IVs (implicit in the market rates) with those derived from the N-IVs when the EIOPA YC is used: the latter are smaller because, given the same N-IV, a higher rate appears in the denominator.

Having concluded that Swaptions are more informative than Caps and that, in this context, N-IVs are more appropriate than LN-IVs, one question remains: when calibrating an IR model, which data should be taken as target among all those available in the Swaption N-IV (N-SIV) cube (option maturity T, IRS tenor n, moneyness) at a given reference date?

Let us not forget the original goal: determining a value for the liabilities, which embed a degree of optionality and are priced via Monte Carlo methods. The market data chosen as target drive the calibration of the chosen IR model and, in turn, the projected rates and their distribution. Different sets of projected rates give rise to different liability values. Setting a certain target is nothing other than deciding on the properties the projected rates should have, including their volatility, keeping in mind that the IV of the liabilities will not equal the IV of the Swaptions, since the liabilities are not Swaption contracts.

Since the N-SIVs are quite smooth, there is potentially no contraindication to setting the whole SIV cube as target, other than the runtime needed to carry out the IR model calibration: it is a matter of balancing calibration speed against fitting accuracy, which also depends on the IR model adopted. Deciding which data to consider introduces a degree of subjectivity and exposes one to the risk/benefit of choosing some points rather than others. What if the others matter more? On the other hand, besides reducing the runtime, selecting a subset of data may help discard unrepresentative or inaccurate data points that are not consistent with the surrounding ones.

Once the relevant data have been chosen, the last question is: should they all be weighted the same way, or should some count more than others? The simplest option is to assign uniform weights to all the points considered, while a more subjective one is to decide which SIV triples should count more. There is no contraindication to the first choice, while the latter again introduces a degree of subjectivity. One possibility for defining a non-uniform weighting scheme would be to derive it from the liability profile, but this would be a more complex approach, less stable over time and liable to introduce sensitivity between rate levels and volatilities (the liability profile depends on the level of the rates, and so do the derived weights, which in turn drive the next calibration, like the tail wagging the dog). In addition, one should remember that there is no direct link between liabilities and Swaption contracts: the IVs quoted in the market for Swaptions do not become IVs for the liabilities.
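
Whatever the choice, the calibration target boils down to a weighted least-squares objective over the selected points of the cube; a minimal sketch, with hypothetical target triples and a placeholder model IV function, could look like this:

```python
# Hypothetical target points of the N-SIV cube:
# (option maturity T, tenor n, moneyness) -> (market normal IV, weight)
targets = {
    (5, 10, 0.00):   (0.0065, 1.0),
    (10, 10, 0.00):  (0.0070, 1.0),
    (10, 20, -0.01): (0.0072, 0.5),
}

def objective(params, model_iv):
    """Weighted squared distance between model and market IVs.

    `model_iv(params, T, n, m)` is a placeholder for the IR model's implied
    volatility at the given point; `params` are the model parameters.
    """
    return sum(
        w * (model_iv(params, T, n, m) - iv) ** 2
        for (T, n, m), (iv, w) in targets.items()
    )

# Example with a dummy flat model IV of 0.0068:
print(objective(params=None, model_iv=lambda p, T, n, m: 0.0068))
```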

Having said that, should liability-driven weights still be the preferred option, an insurance company would have to draw a parallel between financial optionality and that embedded in insurance products, in order to derive the option maturities, IRS tenors and moneyness it is most exposed to. To this aim, only contracts with a minimum guaranteed rate shall be considered. A "with profit" contract, where the policyholder has the right to receive a minimum guaranteed rate in case of lapse/death/maturity, can be compared to a long European Swaption, as the policyholder may choose a fixed payout at the end of his contract or a variable benefit if he takes the money and invests it at the risk-free rate; from the perspective of the insurance company, this contract would be comparable to a short European Receiver Swaption. One can think of:

  • the option maturity (T) as the time at which the policyholder can decide to leave the contract;
  • the IRS tenor (n) as the residual time from when the policyholder leaves the contract until its original maturity;
  • the moneyness of the option as the difference between the minimum guaranteed rate (r_mg) and the forward swap rate S(T,T+n). In a market environment characterized by low yields and high guarantees, the liabilities would be more exposed to OTM Swaptions in the short term (a rational policyholder would rather stay in the contract and get the minimum guaranteed rate than reinvest the benefit in the market at the lower risk-free rate) and to ITM Swaptions in the long term (when the market rates are higher than the guaranteed one).

To put this parallelism between insurance policies and financial contracts into effect, one needs to derive a 3D matrix (T, T+n, moneyness) starting from a 1D projection of the mathematical reserves V(t) and of the maturity-only outgo cash flows CF(t,MAT.OUT). The mathematical reserves are used as the best proxy of the amount of money the policyholder would get in case of lapse (exercise of the option), while maturity outgoes are considered the only outgo cash flows on which the policyholder can exercise an option (death cash flows are not considered, as the policyholder cannot decide to die or survive; nor are annuity cash flows, because, when paid, they correspond to a decision the policyholder has already made – staying in the contract rather than leaving with a lump sum).

  • both V(t) and CF(t,MAT.OUT) can be split by r_mg, from which the moneyness can be derived (this gives the third dimension);

  • given a certain r_mg, to transform the 1D projection into a 2D matrix one has to "recycle" the data by "moving in time";
  • the first dimension (T) is naturally given by this move in time;
  • the exposure of the liabilities to interest rate implied volatility risk is given by the amount of mathematical reserves that policyholders can withdraw at that time (T), taking the base lapse rate as given;

  • at each (T), the second dimension (T+n) is given by the amount of contracts in place at time T that expire at (T+n) – one moves in time by considering, for each row (i+1), a subset of the data used in row (i), removing the first element.

As the SIV cube does not include all possible triples, data without a direct correspondence have to be assigned to the nearest existing labels (e.g. tenor 6 is split equally between tenor 5 and tenor 7), in a "condensation" process. Even though it is possible to identify the whole cube, the entries may be further condensed into a limited number of triples, which are then defined as the target.
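
A simplified sketch of the construction described above (toy 1D projections, a single minimum-guaranteed-rate bucket, annual steps and a toy flat curve for the forward swap rate; all names are illustrative, not an actual production procedure):

```python
import numpy as np

years = 30
V = np.linspace(1000.0, 0.0, years)    # mathematical reserves per year (toy run-off)
CF_mat = np.full(years, 30.0)          # maturity outgo cash flows (toy)
r_mg = 0.02                            # one minimum-guaranteed-rate bucket
lapse = 0.05                           # base lapse rate

def forward_swap_rate(T, n, zero_rate=0.015):
    P = lambda t: np.exp(-zero_rate * t)          # toy flat discount curve
    return (P(T) - P(T + n)) / sum(P(T + i) for i in range(1, n + 1))

cube = {}   # (T, n, moneyness) -> exposure
for T in range(1, years):
    exposure_T = lapse * V[T]          # reserves policyholders can withdraw at T
    future = CF_mat[T + 1:]            # contracts still in place, maturing after T
    if future.sum() == 0:
        continue
    for n, cf in enumerate(future, start=1):      # second dimension: T + n
        moneyness = r_mg - forward_swap_rate(T, n)
        cube[(T, n, round(moneyness, 4))] = exposure_T * cf / future.sum()

# `cube` would then be condensed onto the nearest available SIV labels.
```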

As already stated, the parallelism between financial and insurance options is just that, a parallelism, subject to a number of simplifications, among which:

  • additional payments on top of the minimum guaranteed amounts (arising from profit sharing) are ignored, as are fiscal benefits or penalties in case of early surrender;
  • the surrender date is not restricted to maturity: policyholders can surrender at any time until maturity (which would make the contract closer to a Bermudan option than to a European Swaption);
  • it is not entirely clear whether an insurance contract should be treated as a payer or a receiver option: looking at the payoff, the payer label fits better (max(0, L − K)), but by definition the policyholder actually receives a fixed amount from the insurance company, which he then exchanges with a third party (the market).