Talks in Financial and Insurance Mathematics

This is the regular weekly research seminar on Insurance Mathematics and Stochastic Finance.

Autumn Semester 2010

Note: The highlighted event is the next upcoming event, and events marked with an asterisk (*) take place at a time and/or location different from the usual one.

Date / Time Speaker Title Location
16 September 2010
Thorsten Schmidt
TU Chemnitz
Existence and positivity in CDO term structures  HG G 43 
Abstract: Starting from a model for dynamical term structures of CDOs, conditions for absence of arbitrage reveal nested stochastic partial differential equations which respect a particular monotonicity. In the first part we study conditions for the existence of such models, and in the second part we derive general conditions under which the required monotonicity holds. Besides being applicable to CDOs, similar questions also arise in models with ratings. This is joint work with Stefan Tappe.
30 September 2010
Josef Teichmann
ETH Zürich
A semigroup point of view on splitting schemes for S(P)DEs with applications to interest rate theory  HG G 43 
Abstract: We present a functional analytic framework which allows us to prove precise rates of convergence for splitting schemes applied to unbounded payoffs in a general SPDE framework. Applications to HJM equations, general Da Prato-Zabczyk SPDEs and, to some extent, to 2D stochastic Navier-Stokes equations are given.
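The order gain from symmetric splitting can be illustrated already for matrix ODEs, far from the SPDE setting of the talk. The following sketch (a toy example with our own names, not the framework of the talk) compares first-order Lie splitting with second-order Strang splitting against the exact propagator exp((A+B)h):

```python
import numpy as np

def expm(M, terms=40):
    # Matrix exponential via truncated Taylor series; adequate for the
    # small, small-norm matrices used in this toy example.
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out += term
    return out

def lie_step(A, B, h):
    # First-order (Lie) splitting: exp(Bh) exp(Ah)
    return expm(B * h) @ expm(A * h)

def strang_step(A, B, h):
    # Second-order (Strang) splitting: exp(Ah/2) exp(Bh) exp(Ah/2)
    return expm(A * h / 2) @ expm(B * h) @ expm(A * h / 2)

# Two non-commuting nilpotent matrices, so splitting is not exact.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

for h in (0.1, 0.05):
    exact = expm((A + B) * h)
    err_lie = np.linalg.norm(lie_step(A, B, h) - exact)
    err_strang = np.linalg.norm(strang_step(A, B, h) - exact)
    print(f"h={h}: Lie error {err_lie:.2e}, Strang error {err_strang:.2e}")
```

Halving h roughly quarters the Lie one-step error but divides the Strang error by about eight, reflecting the local orders O(h^2) and O(h^3).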
7 October 2010
Keita Owari
Hitotsubashi University, Japan
On the duality in robust utility maximization with unbounded claim  HG G 43 
Abstract: We address robust utility maximization with a contingent claim as a random endowment, with emphasis on the fundamental duality equality. When the underlying price process is a locally bounded semimartingale under the reference probability, we show that the duality holds true for a wide class of utility functions and unbounded claims. After some reductions, the key step in our proof is the analysis of the robust utility functional defined by a random utility function and its conjugate. We do this by (partially) extending the classical Rockafellar theorem on convex integral functionals in a robust way. The duality in question then follows from a well-known theorem of Fenchel.
* 14 October 2010
Richard Verrall
Cass Business School, London, England
Reversible jump Markov chain Monte Carlo methods for claims reserving in general insurance  HG G 19.2 
Abstract: Stochastic methods for standard claims reserving models have become quite familiar to the insurance industry. These are useful for solvency requirements and capital modelling. However, when setting reserves, these are often adjusted in an ad hoc way, which presents a challenge for the formal statistical methods. RJMCMC methods provide a way of replacing ad hoc procedures with automatic methods to produce the desired results within a coherent statistical framework.
14 October 2010
Glenn Meyers
ISO Innovative Analytics
What EU Insurers Could Do if They Had Schedule P  HG G 19.2 
Abstract: The goal of this paper is to demonstrate how publicly available data can be used to calculate the technical provisions in Solvency II. This is a purely hypothetical exercise, since the publicly available data is American, while Solvency II applies to the European Union. Using American Schedule P data, this paper:
1. Develops "prior information" to be used in an empirical Bayesian loss reserving method.
2. Uses the Metropolis-Hastings algorithm to develop a posterior distribution of parameters for a Bayesian analysis.
3. Develops a series of diagnostics to assess the applicability of the Bayesian model.
4. Uses the results to calculate the best estimate and the risk margin in accordance with the principles underlying Solvency II.
5. Develops an ongoing process to regularly compare projected results against experience.
The paper includes analyses of the Schedule P data for four American insurers based on its methodology. The paper can be downloaded from the CAS E-Forum at: My talk at ETH will contain some updates to the analysis contained in the paper.
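Step 2 of the outline, Metropolis-Hastings sampling of a posterior, can be sketched in miniature. The target below is a toy normal-mean posterior rather than the paper's loss reserving model, and all names are our own:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta, data):
    # Illustrative log-posterior: normal likelihood with known unit
    # variance and a diffuse N(0, 10^2) prior on the mean theta.
    return -0.5 * np.sum((data - theta) ** 2) - theta ** 2 / (2 * 10.0 ** 2)

def metropolis_hastings(data, n_iter=20000, step=0.5, theta0=0.0):
    theta, lp = theta0, log_post(theta0, data)
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()   # random-walk proposal
        lp_prop = log_post(prop, data)
        if np.log(rng.random()) < lp_prop - lp:       # accept/reject step
            theta, lp = prop, lp_prop
        draws.append(theta)
    return np.array(draws[n_iter // 2:])              # discard burn-in

data = rng.normal(2.0, 1.0, size=200)
posterior = metropolis_hastings(data)
print(f"posterior mean of theta: {posterior.mean():.2f}")
```

With a diffuse prior and 200 observations, the posterior mean lands close to the sample mean of the data; the same accept/reject mechanism scales to the multi-parameter reserving posteriors the paper works with.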
21 October 2010
Wolfgang Arendt
University of Ulm
From Forms to Semigroups: an Efficient Method for Degenerate Elliptic Operators  HG G 43 
Abstract: In the talk we will explain the method of sesquilinear forms for the solution of evolutionary problems. There are some new features and results, obtained in a joint paper with Tom ter Elst (Sectorial forms and degenerate differential operators, J. Oper. Th., to appear), which are particularly efficient for degenerate equations; the Black-Scholes equation is one example. One point is that we do not need the topological condition of closability: a purely algebraic condition on the form suffices in order to associate with it a generator of a holomorphic semigroup. Still, one can establish interesting properties of the induced semigroups, for example positivity, the submarkov property and others. This is done with the help of a simple criterion for the invariance of a convex set under the semigroup. The most surprising example for our method is the Dirichlet-to-Neumann operator. This might not yet have applications to financial mathematics though.
28 October 2010
Walter Schachermayer
University of Vienna
Law invariant risk measures for vector-valued portfolios  HG G 43 
Abstract: Kusuoka (2001) obtained explicit representation theorems for comonotone risk measures and, more generally, for law invariant risk measures. These theorems pertain, as in most of the previous literature, to the case of scalar-valued risks. Burgert-Rüschendorf (2006) and Jouini-Meddeb-Touzi (2004) extended the notion of risk measures to the vector-valued case. Recently Ekeland-Galichon-Henry (2009) obtained extensions of the above theorems of Kusuoka to this setting. Their results were confined to the regular case. In general, Kusuoka's representation also involves a singular part. In the present work we give a full generalisation of Kusuoka's theorems to the vector-valued case. The proofs are based on some extensions of Brenier's theorem on mass transport in a finite-dimensional Hilbert space. This is joint work with Ivar Ekeland.
* 4 November 2010
Harry Zheng
Imperial College London
Approximate Basket Options Valuation for Jump-Diffusion Models  HG E 23 
Abstract: In this talk we will discuss approximate basket option valuation for jump-diffusion models. We suggest two new approximate pricing methods. One is the weighted sum of Rogers and Shi's lower bound and the conditional second moment adjustments. The other is the smart expansion to approximate the conditional expectation of the stochastic variance associated with the basket value process. Numerical tests show that the suggested methods are fast and accurate in comparison with Monte Carlo and other methods in most cases.
4 November 2010
Aurelien Alfonsi
CERMICS, Paris, France
Exact and high order discretization schemes for Wishart processes and their affine extensions  HG G 43 
Abstract: This work deals with the simulation of Wishart processes and affine diffusions on positive semidefinite matrices. To do so, we focus on splitting the infinitesimal generator, in order to use composition techniques. In doing so, we have found a remarkable splitting for Wishart processes that enables us to sample Wishart distributions exactly, without any restriction on the parameters. It is related to, but extends, existing exact simulation methods based on Bartlett's decomposition. Moreover, we can construct high-order discretization schemes for Wishart processes and second-order schemes for general affine diffusions. In practice, these schemes are faster than exact simulation for sampling entire paths. Numerical results on their convergence are given.
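Bartlett's decomposition, which the abstract cites as the basis of existing exact methods, can be sketched as follows. This is a generic Wishart sampler valid for df > p - 1, not the talk's extended scheme, and the names are our own:

```python
import numpy as np

rng = np.random.default_rng(1)

def wishart_bartlett(df, scale):
    # Exact draw from the Wishart distribution W_p(df, scale) via
    # Bartlett's decomposition (requires df > p - 1).
    p = scale.shape[0]
    L = np.linalg.cholesky(scale)
    A = np.zeros((p, p))
    for i in range(p):
        A[i, i] = np.sqrt(rng.chisquare(df - i))  # chi variates, decreasing df
        A[i, :i] = rng.standard_normal(i)         # N(0,1) below the diagonal
    LA = L @ A
    return LA @ LA.T                              # positive semidefinite draw

scale = np.array([[2.0, 0.3], [0.3, 1.0]])
df = 5.0
draws = [wishart_bartlett(df, scale) for _ in range(20000)]
print(np.mean(draws, axis=0))   # should approach E[W] = df * scale
```

The sample mean of the draws converges to df * scale, the Wishart mean, which is an easy sanity check on any exact sampler of this kind.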
11 November 2010
Andrea Macrina
King's College London
Filtering Models for Asset Pricing  HG G 43 
Abstract: Information-based asset pricing is formulated in the standard language of stochastic filtering theory. The X-factors used to model the cash flows in the information-based framework are interpreted as time-independent system processes, and the associated information processes play the role of observation processes. Given that the information processes generate the market filtration, it turns out that the conditional probability of the X-factors satisfies the Kushner-Stratonovich equation. In this presentation, apart from considering Brownian bridge information, we also show how other types of observation processes can be applied to develop information-based asset pricing models in a unified way. (Joint work with Jun Sekine, Osaka University)
18 November 2010
Jan Beran
Universitaet Konstanz
Estimating Strong Dependence in Volatility - Problems and Solutions, Exemplified by ML-type Estimation for LARCH Processes  HG G 43 
Abstract: One of the well-known "stylized facts" of financial time series is strong (or long-range) dependence in conditional volatilities. In spite of an extensive literature, the question of how to capture this type of heteroskedasticity (as well as related leverage effects) by statistically feasible models is still widely unexplored. Difficulties, and possible solutions, that emerge in this context will be exemplified by considering (quasi) maximum likelihood estimation (QMLE) for LARCH processes. In the course of developing an asymptotic theory, questions regarding generalized differentiability and ergodicity have to be answered. To ensure consistency, asymptotic normality and computability, the initial conditional QMLE has to be modified, at the cost of a slower rate of convergence. This is joint work with Martin Schützner.
* 25 November 2010
Apostolos Fertis
ETH Zürich
Robust Risk Management  HG G 19.2 
Abstract: By the representation theorem, coherent risks can be expressed as the worst-case expectation as the probability distribution varies in some uncertainty set. Very often, randomness can be divided into two stages, and there is additional information about the possible first-stage scenarios. Traditional coherent risks, such as CVaR, fail to make use of this information. In this talk, we introduce a new class of risk measures, called robust risk measures, which combine the uncertainty set of a traditional risk measure with the additional information about the first-stage scenarios. We state and prove a representation theorem for robust risk measures, which facilitates their computation. We define and show how to compute the Robust CVaR, the robust risk measure constructed from CVaR. We compare the optimal Robust CVaR and optimal CVaR portfolios under diverse scenarios constructed using real New York Stock Exchange (NYSE) and NASDAQ data from 2005 to 2010.
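For reference, the classical CVaR (expected shortfall) on which the robust construction builds can be estimated from a loss sample in a few lines. This is a sketch with our own names; the Robust CVaR of the talk additionally incorporates the first-stage scenario information:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    # Empirical CVaR at level alpha: the average loss beyond the
    # empirical alpha-quantile (VaR) of the loss sample.
    losses = np.sort(np.asarray(losses))
    k = int(np.ceil(alpha * len(losses)))
    return losses[k:].mean()

rng = np.random.default_rng(7)
losses = rng.standard_normal(100_000)   # toy loss sample, not market data
print(f"95% CVaR: {cvar(losses):.3f}")  # theory for N(0,1): about 2.06
```

For a standard normal loss, CVaR at level alpha equals phi(z_alpha)/(1 - alpha), about 2.06 at alpha = 0.95, so the empirical estimate above provides a quick check of the implementation.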
25 November 2010
Georg Mainik
ETH Zürich
Diversification Effects for Extreme Risks: Characterization, Estimation, and Ordering  HG G 43 
Abstract: The central topic of this talk is the diversification of catastrophic losses. Under the assumption of multivariate regular variation, the asymptotic portfolio loss distribution is characterized by a functional of the portfolio weights, the tail index, and the so-called spectral measure representing the dependence structure in the tail region. Further results encompass the general properties of the optimization problem, the estimation of the portfolio risk functional, and the ordering of models with respect to the asymptotic behaviour of portfolio losses. Particular interest is paid to the possibility of negative diversification effects, to the uniform convergence of estimates, and to the ordering criteria in terms of canonical spectral measures and copulas.
2 December 2010
Thorsten Hens
Swiss Banking Institute, University of Zurich
Three Solutions of the Pricing Kernel Puzzle  HG G 43 
Abstract: The pricing kernel is an important link between economics and finance. Standard models of financial economics are based on three assumptions: markets are complete, and investors are risk-averse and have common and true beliefs. Consequently, the pricing kernel is a decreasing function of aggregate resources. However, there is ample empirical evidence that the pricing kernel has some increasing parts: the so-called pricing kernel puzzle. We first show that each of the three assumptions is needed for the pricing kernel to be generally decreasing, and then show that if at least one of the three assumptions is violated, the pricing kernel can have increasing parts. Thus, incomplete markets, risk-seeking behaviour and incorrect beliefs can be seen as three potential solutions to the pricing kernel puzzle. We verify the robustness of the explanations under aggregation and compare the phenomena with the findings in the empirical literature. The results are used to reveal strengths and weaknesses of the three solutions.
* 9 December 2010
Archil Gulisashvili
Ohio University
Asymptotic formulas for the stock price distribution density and the implied volatility  HG G 19.2 
Abstract: In the talk, sharp asymptotic formulas for the stock price density will be discussed for several classical stochastic volatility models (Hull-White, Stein-Stein, and Heston). It will also be explained how the Black-Scholes implied volatility behaves in general stochastic stock price models. Sharp asymptotic formulas for the implied volatility, obtained by the author, imply various known results, for instance, Lee's moment formulas and the tail-wing formulas due to Benaim and Friz. Another topic that will be presented in the talk is Piterbarg's conjecture. This conjecture concerns the exceptional cases in Lee's moment formulas. It will be shown that Piterbarg's conjecture holds in a modified form. We will also provide a necessary and sufficient condition for the existence of the limit in Lee's moment formulas.
9 December 2010
Matheus Grasselli
McMaster University, Hamilton, Canada
The priority option: the value of being a leader in complete and incomplete markets  HG G 43 
Abstract: In a recent paper, Bensoussan, Diltz and Hoe (2010) provide a comprehensive analysis of optimal investment strategies under uncertainty and competition. They consider two firms competing for a project whose payoff can be either a lump sum or a series of cash flows, in both complete and incomplete markets. Despite its generality, the analysis is restricted to a Stackelberg game, where the roles of leader and follower are predetermined. In this talk, I'll extend the analysis to the case where these roles emerge as the result of a symmetric, Markov, subgame-perfect equilibrium, extending the seminal work of Grenadier (1996, 2000) to incomplete markets. As a result, one can calculate the amount of money a firm would be willing to spend in advance (either by paying for a license or acquiring market power) for the right to be the leader in a subsequent game - what we call the priority option. (This is joint work with Vincent Leclère.)
16 December 2010
Michael Merz
University of Hamburg
Credibility Prediction in Continuous Time  HG G 43 
Abstract: Credibility theory is a well-known method for deriving linear predictors for the calculation of fair insurance premiums based on individual and collective claims information. Recently, however, credibility theory has also found applications in other areas of risk management, such as stochastic claims reserving and operational risk. In this talk we introduce credibility models and derive credibility predictors in a continuous-time setting by means of stochastic linear filtering theory. This approach leads to interesting credibility predictors. Some of these can be understood as the continuous counterparts of credibility predictors in classical discrete-time credibility models, such as the Hachemeister, Bühlmann-Straub and Bühlmann models, while the others have no discrete counterpart but possess some attractive additional features.
20 December 2010
Kay Giesecke
Stanford University
Exploring the sources of default clustering  HG G 19.2 
Abstract: We develop filtered maximum likelihood estimators and goodness-of-fit tests for event timing models in which the arrival intensity is influenced by past events and time-varying explanatory covariates, some of which cannot be measured. Applying these tools to default events of US firms between 1970 and 2010, we find that the response of the intensity to defaults is economically and statistically significant, after controlling for the influence of the macroeconomic covariates that prior studies have identified as predictors of US defaults, and for the role of an unobservable frailty risk factor whose importance for US default timing was recently established. Both frailty and contagion, by which the default of one firm has a direct impact on the health of other firms, are significant sources of default clustering, over and above any correlation caused by firms' joint exposure to observable risk factors. This is joint work with Shahriar Azizpour and Gustavo Schwenkler.

