Counterparty credit risk, collateral and funding - an interview with Professor Damiano Brigo

Mon, 13 Jan 2014 06:27:00 GMT

Prof. Damiano Brigo is Chair and co-Head of Mathematical Finance at Imperial College London, consistently ranked among the top 10 world universities, and Director of the Capco Research Institute in the industry. Damiano's previous roles include Gilbart Professor and Head of Group at King's College London, Managing Director and Global Head of Quantitative Innovation at Fitch, Head of Credit Models at Banca IMI, and Fixed Income Professor at Bocconi.

Damiano published more than 70 works in top journals for Mathematical Finance, Systems Theory, Probability and Statistics, and books for Springer and Wiley that became field references in stochastic interest rate and credit modelling. Damiano is Managing Editor of the International Journal of Theoretical and Applied Finance.

Damiano's interests include pricing, risk measurement, counterparty credit risk, collateral and funding, stochastic commodity and inflation modelling, exponential and mixture manifolds, and nonlinear filtering.

Damiano holds a PhD in stochastic filtering with differential geometry.

To get 30% off your copy of this book, simply enter promotion code MON30 at the checkout when you buy direct from www.wiley.com

Jacob Bettany: Thanks very much for joining us Professor Brigo. You will be very well known to our community already, I'm sure, but for the benefit of those who don't know you so well, would you mind telling us a little bit about yourself, and how you came to be involved in this field?

Damiano Brigo: Thank you Jacob. I may introduce myself by saying that I have worked in the financial industry for 13 years as a quantitative analyst (quant) at different levels, from simple analyst to managing director, in different institutions, from investment banks to consulting firms and rating agencies, all the while producing original research in quantitative finance and cooperating with academic institutions in a visiting role. My specialties include pricing and hedging of derivatives, especially on interest rates and credit risk, volatility smile modeling, credit valuation adjustments and funding costs, liquidity modeling, risk management, and more recently optimal execution, algorithmic trading and dependence modeling. On the more abstract side I work in stochastic analysis and signal processing outside finance, also using differential geometric methods, but financial modeling is taking most of my energies. I joined full-time academia in 2010, first as Gilbart Professor at King's College London, and I am now a full professor and head of a research group in Mathematics at Imperial College London, but I also work as Director of the Capco Institute in the industry. I think it's fair to say I'm one of the few people who really lives in the two worlds of full-fledged academia and the industry. Sometimes living in two worlds is exciting, sometimes it is frustrating.

JB: Counterparty Credit Risk is a topic which has received a great deal of attention since the financial crisis, both from the practitioner and academic communities. Would you mind explaining how the crisis impacted the field and, in the process, telling us why you think that prior to the crisis it didn't receive quite the attention it deserved?

DB: It's actually funny, because I started my work on valuation of CCR back in 2002 in the investment bank where I was working, but at the time people were not very interested in the topic in general. This was because credit spreads were relatively low, financial institutions were considered safe as counterparties, and overall the market was not worrying much about what is now called CVA. I presented several results in several places, including MIT, and in fact I was among the first to publish works proposing a rigorous arbitrage-free valuation of counterparty risk (what is now called CVA). Interest was mild and I put part of this work in a drawer, to pull it back out in 2008. Sometimes it does not pay to be early. More generally, I think there is a historical/cultural reason why CVA risk was not recognized earlier. Many people who worked on CVA initially were coming from credit risk measurement and were using rough credit risk techniques even in delicate valuation problems. This did not help in appreciating some of the key risks properly, like wrong way risk for example.
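For readers new to the acronym, the unilateral CVA Brigo refers to is usually written as a risk-neutral expectation of the discounted positive exposure at default. The following is the standard textbook form, not a quotation from his papers:

\[
\mathrm{CVA}_0 \;=\; \mathrm{LGD}\cdot\mathbb{E}\left[\mathbf{1}_{\{\tau \le T\}}\, D(0,\tau)\, \left(\mathrm{NPV}(\tau)\right)^{+}\right],
\]

where \(\tau\) is the counterparty default time, \(D(0,\tau)\) the stochastic discount factor, \(\mathrm{NPV}(\tau)\) the residual value of the deal at default, and \(\mathrm{LGD}\) the loss given default. Wrong way risk is precisely the dependence between \(\tau\) and \(\mathrm{NPV}(\tau)\) that rough credit risk techniques tended to miss.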


JB: It seems that there have been a great many books published on these topics in the last few years. What makes your book stand out?

DB: There are probably five or six relatively technical books out there on CVA, I'd say. I think our book is the most technical and rigorous, and the one that tries hardest to stay within arbitrage-free valuation. This is not always possible due to scarcity of data, data proxying, lack of liquid instruments for hedges, etc., but we have tried to save as much of the pricing theory as possible. I also think our book is the first to present a comprehensive framework for the valuation of funding costs together with credit risk and collateral modeling, what some people call Credit Valuation Adjustment (CVA), Debit Valuation Adjustment (DVA) and Funding Valuation Adjustment (FVA). There are no other books on this at the moment at the same level of rigor.

Finally, we look at some details of counterparty risk valuation that no other author has examined before us, such as closeout modeling, first-to-default risk, detailed and precise analysis of wrong way risk, gap risk in collateral modeling, re-hypothecation of collateral, and a number of other subtleties in CVA and FVA.

JB: How is your book structured, and why did you make those decisions?

DB: The book is structured in three parts, mostly following the appearance of new aspects of valuation in chronological order. Part 1 opens with a first chapter that is somewhat an exception, as it is a colloquial introduction to all the themes of the book, in dialogue form and with no formulas. Part 1 continues with two more chapters giving basic definitions and results on models for pure credit and default risk. We look at both structural and reduced form models. Part 2 is formed by six chapters where we look at unilateral CVA, namely CVA valued in the case where only the counterparty may default and not the calculating party (hence no DVA seen by the calculating party). This is the type of CVA that used to be computed up to 2008, and we had been doing research on this since 2002, with advanced arbitrage-free models and wrong way risk. Well, advanced compared to what the rest of the industry was doing. We report this research and some more recent results, across a variety of asset classes including rates, commodities, equity, FX, and credit itself, and with detailed numerical examples, also studying closed-form approximations based on LIBOR market model type analysis to account for netting.
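To make the flavour of Part 2 concrete, here is a minimal Monte Carlo sketch of unilateral CVA with a simple form of wrong way risk, obtained by correlating the exposure driver with the default intensity. All dynamics and parameters below are illustrative toy assumptions, not the models of the book:

import numpy as np

# Illustrative unilateral CVA by Monte Carlo with simple wrong-way risk:
# the exposure driver and the default intensity share a common Brownian shock.
rng = np.random.default_rng(42)
n_paths, n_steps, T = 50_000, 50, 5.0
dt = T / n_steps
r, lgd = 0.02, 0.6           # flat risk-free rate, loss given default
rho = 0.7                    # exposure/credit correlation; rho = 0 removes wrong way risk

# Correlated Brownian increments for exposure and intensity
z1 = rng.standard_normal((n_paths, n_steps))
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal((n_paths, n_steps))

# Exposure driver: the mark-to-market of a netting set, modelled as an
# arithmetic Brownian motion starting at zero (a deliberately crude proxy)
mtm = np.cumsum(0.15 * np.sqrt(dt) * z1, axis=1)

# Stochastic default intensity: lognormal shocks around a 2% base hazard
lam = 0.02 * np.exp(0.5 * np.sqrt(dt) * np.cumsum(z2, axis=1))

# Default time per path: first step where the cumulative hazard exceeds an
# independent exponential threshold (standard intensity-based simulation)
cum_haz = np.cumsum(lam * dt, axis=1)
threshold = rng.exponential(size=(n_paths, 1))
default_idx = np.argmax(cum_haz > threshold, axis=1)
defaulted = cum_haz[:, -1] > threshold[:, 0]

# CVA = LGD * E[ 1{tau <= T} * D(0, tau) * (MtM(tau))^+ ]
tau = (default_idx + 1) * dt
exposure_at_default = np.maximum(mtm[np.arange(n_paths), default_idx], 0.0)
cva = lgd * np.mean(defaulted * np.exp(-r * tau) * exposure_at_default)
print(f"Monte Carlo unilateral CVA: {cva:.5f}")

With rho > 0, defaults cluster on paths where the exposure is high, so the CVA comes out larger than under independence; that dependence is the wrong way risk the rough techniques missed.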

"...we had different opinions sometimes on some aspects, and this is visible in the initial and final colloquial dialogues in the book. I consider this a plus. You see, Massimo has been my PhD student and I helped recruiting Andrea in Banca IMI to work with me on credit modeling back in the early 2000's, but now I am learning a lot from both of them."

Part 3 consists of 10 chapters that look at advanced counterparty risk and funding costs modeling. We look at both CVA and DVA, collateral, gap risk, margining costs and funding costs, including advanced features such as closeout analysis, first-to-default risk, re-hypothecation of collateral, and related issues. We also explore the possible relationship between DVA and FVA. The part on funding costs highlights the new type of problems and nonlinearities these pose. We also added a bonus chapter on longevity swaps valuation under CVA and FVA to show how these issues are also relevant for new types of asset classes. The book concludes with a final dialogue that puts the whole book in perspective and hints at further developments. So the structure is rather linear in time, with some summaries every now and then.
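Schematically, the bilateral adjustments of Part 3 enter the price as follows; this is a stylized first-to-default form, ignoring the collateral, funding and closeout conventions the book treats in detail:

\[
V_0 \;=\; V_0^{\mathrm{risk\text{-}free}} \;+\; \mathrm{DVA}_0 \;-\; \mathrm{CVA}_0,
\]
\[
\mathrm{CVA}_0 \;=\; \mathrm{LGD}_C\,\mathbb{E}\left[\mathbf{1}_{\{\tau_C = \min(\tau_C,\tau_I)\le T\}}\, D(0,\tau_C)\,\left(V(\tau_C)\right)^{+}\right],
\]
\[
\mathrm{DVA}_0 \;=\; \mathrm{LGD}_I\,\mathbb{E}\left[\mathbf{1}_{\{\tau_I = \min(\tau_C,\tau_I)\le T\}}\, D(0,\tau_I)\,\left(-V(\tau_I)\right)^{+}\right],
\]

with \(C\) the counterparty and \(I\) the investor (the calculating party). The first-to-default indicators inside the expectations are exactly the feature mentioned above that simpler treatments tend to drop.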

JB: The book is co-authored with Massimo Morini and Andrea Pallavicini. Did collaborating present any particular challenges? What other challenges did you face writing the book?

DB: Part of the challenge was keeping the team working together. Both Massimo and Andrea were quite busy at the bank and with other projects, so despite being even busier myself I had to act as a sort of central coordinator for the book's development, keeping communications going, chasing deadlines, etc., which is funny if you think that basically Andrea and Massimo were working in the same building in Milan, while I was in London. On the other hand, a lot of material was ready in the form of previous papers; we just had to connect them, make the notation consistent and create a narrative. Another challenge is that the discipline is evolving very rapidly, and by the time we were done with a version of the book some new important problem we had not addressed would pop up. This prompted us to rewrite parts of the book a few times and delayed the publication date a few times. Finally, we sometimes had different opinions on some aspects, and this is visible in the initial and final colloquial dialogues in the book. I consider this a plus. You see, Massimo was my PhD student and I helped recruit Andrea in Banca IMI to work with me on credit modeling back in the early 2000s, but now I am learning a lot from both of them.

"I think the debate on funding costs needs to continue, are these rightly charged to clients or just a tool for internal cost analysis and profitability?"

JB: These are difficult topics, involving complex mathematics. Who is the book aimed at, and what level of expertise is required to understand the book?

DB: In fact the book starts with quite a long chapter, in the form of a dialogue, that does not contain a single formula, and it closes in a similar way. I'd say the first two thirds of the book are relatively standard in terms of methodology, and any market player with a basic quantitative education in option pricing and risk management should be able to follow. The third part is indeed more advanced, but we provide several colloquial pointers to the relevance of our results and we do not hide behind formulas or mathematics. More generally, I'd say that anyone with a standard quantitative education should be able to follow, supplementing with some text on financial engineering or risk management if needed, although we tried to keep the book as self-contained as possible.

JB: In an evolving regulatory environment, it might seem that the goalposts for financial institutions are always changing. Is this true, and if so, how were you able to accommodate this in the book?


DB: Absolutely. As I hinted earlier, we had to rewrite parts of the book a few times because of the changing environment. In fact the book could have come out in 2008 with a good analysis of CVA and wrong way risk, but then DVA became an issue, then collateral, closeout, re-hypothecation, gap risk, then funding costs, now CCPs and the standard CSA with margining implications, especially initial margins, and risk measurement of CVA rather than just pricing/hedging, and perhaps optimization of counterparty exposure. At some point we said: let's go out with what we have or we'll never publish it. We'll probably have to update it with a second edition soon, including initial margining and CCPs, but that is understandable; as you say, regulation is evolving quite quickly and this will need updating. However, the type of analysis, tools and reasoning that are already in the book will remain quite relevant for these new problems too, so it will be natural to add a chapter that is in line with the previous developments, while the core of the current book stays relevant.

JB: What are the 'Open Questions' in Counterparty Credit Risk, and what efforts are being made to address them?


DB: I think that the proper risk measurement of CVA, something like Value at Risk or Expected Shortfall of CVA, poses formidable computational challenges if done properly. We haven't seen any attempt to do this properly, but rather a set of simplifying techniques, including a number of suggestions from Basel, that leave a number of risks poorly addressed. There are also what some smart traders call zero order risks: lack of reliable data to deduce the default probability of a counterparty, lack of data to estimate recovery properly, the wrong practice of identifying the CVA recovery with the CDS recovery, missing data on credit volatility, difficulty in estimating credit correlation properly, taking into account portfolio rebalancing, and so on. Data proxying will have to be developed, but mapping data from different situations is very challenging and often dangerous. Furthermore, I think the debate on funding costs needs to continue: are these rightly charged to clients, or just a tool for internal cost analysis and profitability? And how about not just valuing credit risk and funding costs, but optimizing them in some sense, perhaps across the bank's operations? How can the role of the bank treasury and the funding policies be taken into account more precisely? These questions make it necessary for quantitative analysts to become more eclectic and to study not only maths and statistics but also economics, policy, the functioning of the institutions where they work, etc. This will help.
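To illustrate the kind of calculation involved in risk-measuring CVA, the sketch below simulates market and credit scenarios at a one-year horizon, revalues a deliberately crude closed-form CVA proxy in each scenario, and reads VaR and Expected Shortfall off the resulting loss distribution. In practice the per-scenario revaluation is itself a full simulation, which is exactly the computational challenge described above; the toy_cva proxy and the shock sizes here are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)

def toy_cva(hazard, exposure_vol, T=5.0, lgd=0.6):
    """Crude CVA proxy: default probability times an exposure scale.
    Stands in for the full revaluation a real system would run per scenario."""
    pd = 1.0 - np.exp(-hazard * T)
    # Expected positive part of a driftless Brownian exposure at maturity T
    epe = exposure_vol * np.sqrt(T / (2.0 * np.pi))
    return lgd * pd * epe

# Base CVA today
cva_0 = toy_cva(hazard=0.02, exposure_vol=0.15)

# Scenarios at a 1-year horizon: lognormally shocked hazards and exposure vols
n_scen = 100_000
hazard_1y = 0.02 * np.exp(0.8 * rng.standard_normal(n_scen) - 0.32)
vol_1y = 0.15 * np.exp(0.3 * rng.standard_normal(n_scen) - 0.045)
cva_1y = toy_cva(hazard_1y, vol_1y)

# An increase in CVA is a loss; VaR and ES are read off the tail
losses = cva_1y - cva_0
var_99 = np.quantile(losses, 0.99)
es_99 = losses[losses >= var_99].mean()
print(f"CVA VaR 99%: {var_99:.5f}, CVA ES 99%: {es_99:.5f}")

Replacing the closed-form proxy with a nested simulation (one inner CVA Monte Carlo per outer scenario) is what makes the problem computationally formidable.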


JB: To what extent do you feel that advances in Counterparty Credit Risk can mitigate more general systemic risk in the financial markets?


DB: Some of the key risks, such as wrong way risk and gap risk stemming from contagion, are very naturally related to systemic risk. Hence understanding a proper way to model such risks across large portfolios of netting sets may help illuminate the role of systemic risk. I should also add that the inclusion of funding costs and credit risk brings a fundamental nonlinearity even into valuation, rather than just risk measurement, and therefore the aggregation level for valuation has to be decided a priori. I think it will be necessary, among other things, to run an analysis at the whole-bank level, which brings us closer to issues such as systemic risk. One of the key problems is that the type of models used for precise analysis of small deals with a few risk factors do not scale to large parts of the financial system. The models are very different in nature, and at large system level they are rather simplistic. There is not much understanding of whether the large scale models are in any way consistent with the small scale ones.
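A stylized way to see the nonlinearity mentioned here: with asymmetric funding (borrowing) and investing (lending) rates \(f^{+} \neq f^{-}\), the value solves a recursive equation of roughly the following shape, where \(\Pi(t,T)\) denotes the discounted cashflows and the sign conventions are illustrative:

\[
V_t \;=\; \mathbb{E}_t\left[\Pi(t,T) \;-\; \int_t^T D(t,s)\left(f_s^{+}\,(V_s)^{+} - f_s^{-}\,(V_s)^{-}\right)\mathrm{d}s\right].
\]

Because \(V\) appears inside its own expectation through the nonlinear maps \(x \mapsto x^{\pm}\), the value of a portfolio is no longer the sum of the values of its deals, so the aggregation level (netting set, desk, whole bank) must be chosen before valuing.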


JB: How optimistic do you feel that the new regulatory environments and Basel III adequately address the issues raised by the financial crisis? What more can be done?

DB: I am not pessimistic, but my optimism is quite limited. Basel still clings to a number of basic assumptions that are not very realistic, related to the whole risk weighted assets framework. Moreover, regulators are keen to standardize procedures, but complex risks are impossible to standardize with a one-size-fits-all simple formula, multipliers, coefficients and tables. We need proper and possibly diverse modeling, and regulators would probably be better off inspecting models developed by the industry or researchers rather than dictating a standard form for them. Basel did have some good ideas, including addressing leverage and trading and funding liquidity ratios, but the basic premise is somehow flawed. One cannot standardize complex nonlinear and interconnected risks with simple formulas and tables.

One needs to do the hard work of modeling risk factors properly, connecting them, running a good calibration and a complex simulation to assess risk, for example. Will Basel acknowledge this, rather than trying to impose simplistic standards? I am not sure. However, a better and more extensive interaction between regulators and industry, and perhaps academic researchers, could help here.


JB: I'm quite interested in the way that new technology solutions are being brought to bear on these issues. Do you have a view on where the future lies for this field in terms of computational finance and methodologies?


DB: I think we are looking at methods for global valuation and analysis of larger and larger portfolios, but with precise models for a huge number of underlying risk factors. This seems to suggest that analytical solutions will have to be replaced by numerical methods, especially simulation methods. Some see this as the end of stochastic calculus and quantitative models; of course I don't agree. Model calibration is an estimation and inverse problem that will be impossible to implement without analytic shortcuts. Moreover, one still needs to understand the properties and the nature of the processes one is simulating, and to do this a degree of mathematical analysis will always be needed, even if the final calculation is numerical. Finally, the study of convergence of methods will still require strong mathematics and asymptotics, again requiring research. It's definitely not the end of calculus and stochastic calculus, but it is a new era where we have a duty to understand how the traditional models scale in the big picture and how to deal with nonlinearities and contagion. Increases in computing power will be key. It is also important to point out that some risks are not parallelizable, so the paradigm of parallel grids will not always be a good solution. We may instead need more and more powerful CPUs, which are luckily being developed. A key point may be that we will need to design models by taking into account, from the start, the type of computing machines we will have at our disposal, and their power. This is only partially done today, as many researchers first do the maths and then try to accommodate it into a machine, one way or another.
Overall, interesting times I'd say!
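As a toy illustration of the parallel-grid point: plain Monte Carlo over independent paths is embarrassingly parallel and splits naturally across worker processes, whereas recursive, nonlinear valuations of the kind discussed above do not decompose this way. A minimal sketch with an illustrative European call payoff (all parameters assumed):

import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Embarrassingly parallel Monte Carlo: independent path batches are farmed
# out to worker processes and the batch estimates are averaged.
def price_batch(seed, n_paths=250_000, s0=100.0, k=100.0,
                r=0.02, sigma=0.2, t=1.0):
    """Price a European call on one batch of GBM paths (illustrative payoff)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    return np.exp(-r * t) * np.maximum(st - k, 0.0).mean()

if __name__ == "__main__":
    seeds = range(8)  # one independent batch per worker
    with ProcessPoolExecutor() as pool:
        estimates = list(pool.map(price_batch, seeds))
    print(f"Parallel MC price estimate: {np.mean(estimates):.4f}")

A globally coupled computation, where the value on each path depends on the value on all others, cannot be split into independent batches like this, which is the sense in which some risks are not parallelizable.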
