Friday, April 30, 2010

Hopf algebras, BSDEs, booms, crashes, and all that

I spent most of the week preparing for my trip to Europe today, so I only kept an eye on the thematic program from a distance, hence this all-encompassing but not too detailed blog post.

Anke Wiese spoke at the visitors seminar on Tuesday about approximations of nonlinear systems of SDEs using more general schemes than the ones based on a stochastic Taylor expansion. She showed how a sinh-log series leads to truncation errors that are always smaller than those of a conventional Taylor series, as well as providing an opportunity to use Hopf algebras in finance, which is altogether very cool.

Nizar Touzi started his concentrated graduate course on stochastic control and BSDEs on Wednesday. He is going to teach for six hours every Wednesday for the next six weeks, so this promises to be one for the books (incidentally, he is typing his lecture notes, so it might even turn into a book...)

The April edition of the Quantitative Finance Seminar series had Jorge Sobehart from Citigroup talking about booms, crashes and market behavior. I had to miss it because of a faculty meeting at McMaster, which is a pity, since his talk fits right into my recent obsession with modeling bubbles in financial markets. But I guess that is where the audio recordings provided by Fields will come in handy.

Tuesday, April 27, 2010

Econometrics workshop

The third workshop of our thematic program took place at Fields last week. The focus was Financial Econometrics, and the organizers, led by Yacine Ait-Sahalia, treated us to a stellar line-up of 32 speakers spread over two days of talks, including the third lecture in the Distinguished Lecture Series by Darrell Duffie.

As a result, each speaker (apart from Darrell) had to give a 20-minute talk, which forced them to focus on the important results and cut back on introductory remarks or generalities. I can't think of a better way to provide a broad overview of a very dynamic area of research. The downside, of course, is that there is no way I can review every single talk as I did for the other two workshops, so I'll restrict myself to the following highlights:

- Robert Engle described a measure of the systemic risk associated with an individual bank. This was developed at NYU and is updated regularly and made available on the web for investors and regulators (a toy sketch of how such a measure can be estimated appears after this list). I pointed out to him that he should change the name of the measure from Marginal Expected Shortfall to Marginal Expected Systemic Shortfall, since MESS is a much better description of what goes on in the financial system. He can thank me for that when he gets his next Nobel Prize.

- Andy Lo proposed different levels of uncertainty, extending the well-known distinction made by Knight between risk and uncertainty to a finer classification. His application of the idea to a physical problem (the harmonic oscillator) was entertaining, since ideas usually flow from physics to economics, but rarely the other way around.

- Jean Jacod gave an impeccable presentation using only chalk and blackboard and got a round of applause when he said that he never managed to learn PowerPoint (that alone was worth attending the entire workshop).
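Coming back to the first highlight: here is a minimal sketch of how a marginal expected shortfall could be estimated from daily return series. It is not the NYU methodology (which is considerably more elaborate and feeds into the published systemic risk rankings); the 5% threshold and the simulated data are purely illustrative assumptions.

```python
import numpy as np

def marginal_expected_shortfall(firm_returns, market_returns, alpha=0.05):
    """Average firm return on the market's worst alpha-fraction of days.

    A crude stand-in for a marginal expected shortfall: the more negative
    the value, the more the firm tends to lose exactly when the market
    as a whole is tanking.
    """
    firm_returns = np.asarray(firm_returns)
    market_returns = np.asarray(market_returns)
    cutoff = np.quantile(market_returns, alpha)   # threshold defining the worst days
    bad_days = market_returns <= cutoff
    return firm_returns[bad_days].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    market = rng.normal(0.0, 0.01, size=2500)            # ~10 years of daily returns
    bank = 1.5 * market + rng.normal(0.0, 0.01, 2500)    # a hypothetical bank with beta 1.5
    print(f"MES over the worst 5% of market days: {marginal_expected_shortfall(bank, market):.4f}")
```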

Monday, April 26, 2010

Darrell Duffie, the Dark Lord

And by that I don't mean that Darrell replaced Lord Voldemort as the most powerful dark wizard of all time. What I do mean is that he is the person who knows the most about Dark Markets, as he demonstrated during the Distinguished Lecture Series at Fields last week.

In the first lecture Darrell described how over-the-counter markets differ from centralized ones, in particular with respect to the transfer of capital, which tends to be slow in the former, resulting in asset prices that can show a persistent deviation from "fundamentals". He also remarked that prices for the same asset at the same time can show a large dispersion, since agents trade bilaterally, with no access to information that could reveal a unique "fair" price at the time of trade. By way of example, he showed intriguing evidence from the time signature of prices for treasury bonds (that is, how they vary in time near the moment of issuance), as well as cross-sectional dispersion in prices. Towards the end of the lecture, he commented on the benefits of clearing houses for derivative contracts.

Having laid out the intuition for OTC markets, Darrell used his second lecture to explain an idealized mathematical model for a continuum of agents meeting for bilateral trades at random times according to a given intensity. Through heavy use of the infinite-population assumption, the law of large numbers, and independence, he was able to derive an evolution equation (a version of the Boltzmann equation) for the "types" of agents in the population. Since at equilibrium bids and types are in a one-to-one correspondence, this evolution equation describes how information "percolates" through the population via an infinite series of double auctions.
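From what I recall of this line of work (so treat the following as my gloss rather than a transcript of the lecture), if $\mu_t$ denotes the cross-sectional distribution of information "types" and $\lambda$ the meeting intensity, the evolution equation has the schematic form

$$\frac{\partial \mu_t}{\partial t} \;=\; \lambda \left( \mu_t * \mu_t - \mu_t \right),$$

where the convolution captures the fact that when two agents meet, their types (essentially log-likelihood ratios) add up, so the post-meeting type is distributed as the sum of two independent draws from $\mu_t$.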

In the third and final lecture, which took place during the Financial Econometrics workshop, Darrell focused on the interbank market for Fed Funds, using a logit model to describe the probability of a transaction occurring between two banks and fitting it to a data set comprising 225 million observations for 8000 banks in 2005.
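Purely for orientation (the covariates below are generic placeholders, not the ones actually used in the talk), a logit specification for the probability that banks $i$ and $j$ transact on a given day reads

$$\mathbb{P}(\text{trade}_{ij} = 1 \mid x_{ij}) \;=\; \frac{1}{1 + e^{-\beta^\top x_{ij}}},$$

where $x_{ij}$ collects pairwise covariates (for instance, past lending relationships or balance-sheet characteristics) and $\beta$ is estimated by maximum likelihood over the full data set.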

Friday, April 23, 2010

The Frittelli lectures

Marco Frittelli was here this week and gave two guest lectures in my graduate course. In his first lecture he reviewed the theory of risk measures, with particular emphasis on their representation in terms of the bi-conjugate functional given by the Fenchel-Moreau theorem. This leads naturally to the study of dual systems (other than the classical setting with bounded random variables and their dual space "ba"), and in particular Orlicz spaces. From there it was just a small leap into Banach lattices and the study of order continuous functionals. In the end it all came back to risk measures, for example by showing that the Fatou property is nothing but continuity from below, etc.
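To fix notation (this is the textbook statement for bounded random variables, not the more general Orlicz-space version Marco discussed), a convex risk measure $\rho$ with the Fatou property admits, via Fenchel-Moreau, the dual representation

$$\rho(X) \;=\; \sup_{Q} \Big( \mathbb{E}_Q[-X] - \alpha(Q) \Big),$$

where the supremum is over probability measures absolutely continuous with respect to the reference measure and $\alpha$ is the penalty function obtained as the convex conjugate of $\rho$.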

The focus of his second lecture was utility maximization, where he showed that allowing for generally unbounded price processes and contingent claims naturally leads again to Orlicz spaces, in the sense that admissible portfolios have losses controlled by random variables that are compatible with the given utility function, and therefore belong to a specific Orlicz space associated with it. From there it doesn't take long to realize that the indifference price of a claim gives rise to a risk measure on an Orlicz space.
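Schematically (this is the standard definition, with notation of my choosing), the seller's indifference price $\pi(C)$ of a claim $C$ is pinned down by

$$\sup_{H} \mathbb{E}\Big[U\big(x + \pi(C) + (H \cdot S)_T - C\big)\Big] \;=\; \sup_{H} \mathbb{E}\Big[U\big(x + (H \cdot S)_T\big)\Big],$$

where the suprema are over admissible strategies $H$; the map $C \mapsto \pi(C)$ is convex and monotone, which (up to sign conventions) is exactly what is needed to read it as a risk measure on the Orlicz space determined by $U$.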

In short, the two lectures were a prime example of elegance and internal consistency, and a great way to conclude the graduate course. Apart from that, Marco also discovered the best cappuccino in Toronto. But since this information is valuable, you have to contact me personally to obtain it :)

Thursday, April 22, 2010

Goldman in my mind

Xianhua Peng spoke about default clustering and CDO valuation in the visitors seminar this week. I was able to follow the part of the presentation in which he reviewed the key literature on the subject, pointing out the main disadvantages of many approaches, in particular the difficulty of fitting the marginal default rates of each name in a CDO while at the same time generating correlations strong enough to match the observed tranche prices. He then introduced his own model, developed in collaboration with Steve Kou, who was his PhD advisor at Columbia, based on cumulative default intensities.
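For readers (like me) who needed to be reminded of the baseline he was arguing against, here is a minimal one-factor Gaussian copula simulation of portfolio defaults; this is the standard textbook construction, not Xianhua's cumulative-intensity model, and all parameters below are made up. With a single correlation parameter the marginals are matched by construction, but there is little freedom left to reproduce observed tranche prices, which is exactly the tension mentioned above.

```python
import numpy as np
from scipy.stats import norm

def simulate_portfolio_defaults(p, rho, n_scenarios=100_000, seed=0):
    """One-factor Gaussian copula: name i defaults iff X_i < Phi^{-1}(p_i),
    where X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i shares a common factor M.

    Returns the simulated distribution of the number of defaults, which is
    what CDO tranche losses are ultimately computed from.
    """
    rng = np.random.default_rng(seed)
    p = np.asarray(p)
    thresholds = norm.ppf(p)                        # per-name default thresholds
    M = rng.standard_normal((n_scenarios, 1))       # common factor
    Z = rng.standard_normal((n_scenarios, len(p)))  # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * Z
    return (X < thresholds).sum(axis=1)

if __name__ == "__main__":
    # 125 names, 2% default probability each, 30% asset correlation (all assumptions)
    n_defaults = simulate_portfolio_defaults(p=[0.02] * 125, rho=0.3)
    print("mean defaults:", n_defaults.mean(), " 99th percentile:", np.percentile(n_defaults, 99))
```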

At that point the talk got a bit too technical for me, so when he mentioned Goldman Sachs I let my mind drift. For the record, I think the SEC case is weak and was pushed out in a hurry to create momentum for FinReg. While not necessarily a bad thing if it achieves the purpose of passing reform, I think this strategy creates the wrong type of outrage.

Sunday, April 18, 2010

Credit-hybrid risk forum

The third in our series of Industrial-Academic forums took place at the end of last week at Fields. I didn't attend many of the talks, because of a combination of train delays, family appointments and being busy looking after Robert Merton, so this post is not as detailed as I would normally have liked.

A quick glance at the program reveals that counterparty risk in general, and CVA in particular, was the prominent theme of the forum, followed closely by game options. This made for a very diverse forum, since the former is mostly grounded in practical day-to-day considerations for both banks and regulators (case in point: how to deal with so-called "wrong-way risk", which doesn't strike me as a groundbreaking theoretical question, but somehow gets everybody else excited), whereas the latter is as theoretical as it gets (case in point: as Jan Kallsen showed, game options incorporate both European and American options as special cases, and pricing and hedging them quickly leads one to think deeply about the limitations of arbitrage and replication, utility-based approaches, and so on).
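For reference, the usual schematic form of (unilateral) CVA, with recovery rate $R$, counterparty default time $\tau$, discount factor $D(0,t)$ and mark-to-market exposure $V_t$, is

$$\mathrm{CVA} \;=\; (1 - R)\, \mathbb{E}\Big[ D(0,\tau)\, (V_\tau)^+ \, \mathbf{1}_{\{\tau \le T\}} \Big],$$

and wrong-way risk is precisely the observation that $(V_\tau)^+$ and $\tau$ are typically not independent, so the expectation cannot be split into an exposure profile times a default probability.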

A third pillar of the forum was the joint modeling of equity and credit markets, which sits between the other two in the theory-practice spectrum. For example, I missed Claudio Albanese's talk but understood from the comments and discussion that he is agnostic about models, but extremely concerned about the computational paradigms (third- and fourth-level BLAS, whatever that means) that are necessary to calibrate them to all available credit and equity derivatives. On the other hand, Tom Hurd, Julien Turc and Rafael Mendoza-Arriaga all have subtly different ways to jointly model credit and equity, ultimately relying on a deeper understanding of the capital structure of a company.

Saturday, April 17, 2010

Merton and me

Robert Merton delivered the Nathan and Beatrice Keyfitz Lectures in Mathematics and the Social Sciences this past Thursday to a delighted audience of about 300 people who packed the large auditorium at the Bahen Center. He touched on many important aspects of the financial crisis, including the structural challenges posed by the intrinsic "put option" embedded in any risky loan, the role of composition when seemingly prudent actions taken by individuals in isolation lead to large systemic risks, and the limitations of mathematical models, which in his view cannot be judged separately from their users and applications. Contrary to populist clamors for "common sense", he said that the crisis accentuates the need for more quantitative research in finance, since none of these problems will go away by a magical return to "simplicity" in the financial world. Needless to say, I agree with all that, otherwise we would not have gone ahead with a thematic program on this subject.

Apart from the lecture itself, I had the privilege of hosting him at Fields during the afternoon and of taking him out for dinner afterward together with other distinguished guests associated with the Institute. We were unanimously impressed by how engaging he was, not only by being ready to share personal experiences from the vantage point he has occupied for the past 40 years, but also by being genuinely interested in the research ideas that we timidly put forward for discussion.

At least for me, it was undoubtedly the high point of the program so far, the kind of stuff that makes it all worthwhile.

Wednesday, April 14, 2010

Monte Carlo for PDEs on steroids

Fresh from his PhD defense - itself a memorable event that took place at Fields last week, with a star-filled examining committee consisting of two professors transplanted from France, one from Iran and two from Canada - Arash Fahim used the visitors seminar this week to describe the contents of his thesis to the rest of us.

Together with Nizar Touzi and others, he developed a probabilistic scheme to numerically approximate the solution of a certain type of fully nonlinear PDE. This generalizes the well-known use of Monte Carlo to solve linear parabolic PDEs through the Feynman-Kac formula, as well as the semi-linear case, which corresponds to approximating the solution of a BSDE. In the fully nonlinear case one is led to consider the so-called 2BSDEs, as well as clever ways to approximate derivatives inside expectations, leading to what Nizar likes to call a Monte Carlo/Finite Differences scheme.
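As a reminder of the linear baseline that this generalizes (plain Feynman-Kac Monte Carlo, not Arash's Monte Carlo/Finite Differences scheme for the fully nonlinear case), here is a sketch that approximates $u(t,x) = \mathbb{E}[g(X_T) \mid X_t = x]$, the probabilistic representation of the solution of a linear parabolic PDE, with constant coefficients and a made-up terminal condition.

```python
import numpy as np

def feynman_kac_mc(x0, t, T, b, sigma, g, n_paths=200_000, n_steps=100, seed=0):
    """Monte Carlo approximation of u(t, x0) = E[g(X_T) | X_t = x0] for the
    linear parabolic PDE u_t + b*u_x + 0.5*sigma^2*u_xx = 0 with u(T, .) = g,
    where dX = b dt + sigma dW (Euler scheme, constant coefficients for simplicity).
    """
    rng = np.random.default_rng(seed)
    dt = (T - t) / n_steps
    X = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        X += b * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    payoffs = g(X)
    return payoffs.mean(), payoffs.std(ddof=1) / np.sqrt(n_paths)

if __name__ == "__main__":
    # Toy example: terminal condition g(x) = max(x, 0); all parameters are made up.
    estimate, stderr = feynman_kac_mc(x0=0.1, t=0.0, T=1.0, b=0.0, sigma=0.3,
                                      g=lambda x: np.maximum(x, 0.0))
    print(f"u(0, 0.1) is approximately {estimate:.4f} +/- {1.96 * stderr:.4f}")
```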

Arash showed several convincing numerical examples, but it is clear that this is a vast area with plenty of room for a lot more work.

Tuesday, April 13, 2010

Energy forum

I'm a bit late in commenting on the forum on Commodities, energy markets and emissions trading that took place at Fields last week. Like the Operational Risk forum last month, this was another great learning opportunity for an outsider like myself. So in the same spirit as my blog post on that forum, here is a summary of what I learned from this one:

- commodity markets are fertile ground for exotic derivatives, because physical constraints (say the rate at which you can pump oil in and out of a tanker) influence the design and valuation of even the most basic contracts;

- most things (contracts, hedging strategies, etc) depend on spreads, so correlations (between dates, locations, types of fuel, etc) play an essential role. It becomes important to know when one can "Margrabe the world away" (see the sketch after this list);

- different players in the market (say producers and retailers) have different likes and dislikes (say towards spikes in prices and their persistence), so the story behind the change of measure from physical to risk-neutral probabilities is an elaborate one (for example, both the long-term level and the speed of mean reversion might change from one measure to another);

- enforcing consistency constraints while modeling and fitting implied volatility curves and surfaces across time is as difficult a problem for commodities as it is for interest rates, and perhaps might benefit from common techniques;

- there exist reasonably advanced models for carbon emission markets and they perform relatively well when compared to actual data for the first phase of the EU-ETS market, whereas a lot remains to be done to model newer features of these markets (multi-periods, banking of allowances, CERs, etc);

- electricity markets are very developed and well understood in both theory and practice. As a consequence, linking them to carbon emission markets in a unified way might shed some light on the latter.
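Coming back to the second bullet: "Margrabe-ing the world away" refers to the fact that an option to exchange one asset for another (a spread option with zero strike) has a closed-form price in which the two volatilities and their correlation enter only through a single combined volatility. Here is a sketch under the usual lognormal assumptions, with zero dividend yields and purely illustrative inputs.

```python
import numpy as np
from scipy.stats import norm

def margrabe(S1, S2, sigma1, sigma2, rho, T):
    """Price of the option to exchange asset 2 for asset 1 at time T,
    i.e. a spread option with zero strike and payoff max(S1_T - S2_T, 0).
    Assumes two lognormal assets with correlation rho and no dividend
    yields (Margrabe, 1978).
    """
    sigma = np.sqrt(sigma1**2 + sigma2**2 - 2.0 * rho * sigma1 * sigma2)
    d1 = (np.log(S1 / S2) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S1 * norm.cdf(d1) - S2 * norm.cdf(d2)

if __name__ == "__main__":
    # e.g. a toy spark-spread-like exchange option (all numbers made up)
    price = margrabe(S1=100, S2=95, sigma1=0.35, sigma2=0.30, rho=0.6, T=0.5)
    print(f"exchange option value: {price:.2f}")
```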

The forum concluded with an extremely lively panel (sadly not available as an audio recording). I tried my best to steer the discussion towards the benefits and flaws of carbon emission markets, but what really caught the attention of most panelists and people in the audience was the relation between academia and industry. The highlight for me was Nicole El Karoui standing up at the back of the room and delivering a passionate defense (in French) of the independence of academic research. She said that although she has learned important lessons from the markets (for example the importance of model robustness, something that academics seldom think about), we need to keep a healthy distance and reject the notion that markets are always right. Bravo, Nicole!

Friday, April 9, 2010

Coxeter Lectures

Nicole El Karoui delivered the Coxeter Lecture Series for our thematic program this week. The theme of the lectures was backward stochastic differential equations (BSDEs), a vast and deep topic to which she has made groundbreaking contributions over the past couple of decades.

The first lecture was an overview of the main definitions, results and classical applications to finance, along the lines of her well-known 1997 paper with Peng and Quenez. It went about half an hour overtime, but she is so passionate about the subject that nobody minded much.
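For those who were not there, the object in its simplest form is a pair of adapted processes $(Y, Z)$ solving

$$Y_t \;=\; \xi + \int_t^T f(s, Y_s, Z_s)\, ds - \int_t^T Z_s\, dW_s, \qquad 0 \le t \le T,$$

with terminal condition $\xi$ and driver $f$; in the classical financial reading emphasized in the 1997 paper, $Y$ is the value of a hedging portfolio and $Z$ encodes the trading strategy.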

I could not attend the second lecture (I was organizing a "help your child with math" event at my son's school the same evening), but was told by other participants that it covered the more mathematical aspects of the theory, showing what goes on under the hood when one tries to prove things like existence, uniqueness and stability of solutions of BSDEs.

In the third and final lecture, Nicole concluded with a discussion of numerical schemes for approximating solutions of BSDEs, showing what kinds of errors need to be controlled, together with a few convergence results. She then switched gears to finance again and described how certain BSDEs can be viewed as dynamic risk measures - a challenging new focus of intense research in the area.

Wednesday, April 7, 2010

All you ever wanted to know about Levy processes

Well, maybe not all, but pretty close...

Alexey Kuznetsov gave the visitors seminar this week on the theme above. He showed how to efficiently compute the probability distributions for all sorts of functionals (minimum, maximum, first passage time, last maximum before a first passage time, time of the last maximum before a first passage time, etc...) for a class of Levy processes that he says is as natural as CGMY, but with much nicer analytic properties.
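For context (my own gloss, not a summary of Alexey's specific results), the classical tool behind such computations is the Wiener-Hopf factorization: writing $\mathbb{E}[e^{i\theta X_t}] = e^{-t\Psi(\theta)}$ for the characteristic exponent $\Psi$ and letting $e_q$ be an independent exponential time with rate $q$,

$$\frac{q}{q + \Psi(\theta)} \;=\; \mathbb{E}\big[e^{i\theta \overline{X}_{e_q}}\big]\, \mathbb{E}\big[e^{i\theta \underline{X}_{e_q}}\big],$$

where $\overline{X}$ and $\underline{X}$ denote the running supremum and infimum; the game is to find classes of processes for which both factors are explicit enough to invert, which is exactly where nice analytic properties pay off.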

I have known of Alexey's fascination with Levy processes (among other things) ever since his first postdoc at McMaster several years ago. He has become a world expert on computing the kind of functionals mentioned above, collaborating with probability heavyweights such as Andreas E. Kyprianou. A pleasure to watch.

Thursday, April 1, 2010

Quantitative Finance Seminars - March edition

We celebrated the end of the first half of our thematic program with the March installment of the Quantitative Finance Seminar Series.


Dilip Madan is thinking hard about the unintended consequences of the limited liability paradigm, the very foundation of publicly traded firms, in the presence of financial practices that lead to unbounded exposures to risk (think naked short sales by hedge funds). As he explained, this implicitly leads to the appearance of a "taxpayer put", which can be extremely valuable (of the order of hundreds of billions of dollars for major US banks) and yet is entirely absent from traditional valuations of financial institutions. His proposal is to introduce a capital requirement that offsets the perverse incentives of this implicit put option. As FinReg marches through the US Congress, nothing could be more topical.


Stan Uryasev based his entire talk on what he called a "quadrangle", with risk, deviation, regret and error at its four corners and the arrows linking them representing specific procedures to go from one corner to another (things like minimization, taking expectations, etc). Just in case you find this too abstract, he showed how it all works for VaR and CVaR, and also advertised his own company, which runs funds based on the algorithm he described. Later, during dinner, I observed that one of his funds had over 100% return in 2008, better than Renaissance, which posted about an 80% return that year. Dilip then remarked that this means very little, since everyone made a lot of money in 2008. Ok...
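To make the quadrangle a little less abstract, the one instance I can reproduce from memory is the Rockafellar-Uryasev characterization of CVaR of a loss $L$ at level $\alpha$:

$$\mathrm{CVaR}_\alpha(L) \;=\; \min_{c \in \mathbb{R}} \Big\{ c + \frac{1}{1-\alpha}\, \mathbb{E}\big[(L - c)^+\big] \Big\},$$

with the minimizing $c$ being (a) $\mathrm{VaR}_\alpha(L)$; it is this passage from a "regret"-type functional to a risk measure by minimization that the quadrangle formalizes, and it is also what makes CVaR so convenient in optimization.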