Seminar on Financial and Actuarial Mathematics

The seminar is organized by Prof. Biagini, Prof. Czado, Prof. Klüppelberg, Prof. Meyer-Brandis, Prof. Scherer, Prof. Svindland and Prof. Zagst. The venue of the seminar alternates on a regular basis between the TUM (Garching, Business Campus, Parkring 11) and the Mathematical Institute of the LMU (München, Theresienstraße 39).

Currently, the seminar takes place at the Technical University of Munich, Business Campus, Parkring 11, Garching, Room BC1 2.02.01 (Mondays, 14:15 to 17:00).

The dates of the Graduate Seminar in Financial and Actuarial Mathematics (WiSe 2018/2019) are:

  • November 5, 2018
  • December 3, 2018
  • January 28, 2019

Upcoming talks

10.05.2019 10:00 Dr. Ingo Kraus, Prof. Dr. Damir Filipovic, Prof. Dr. Antoon Pelsser, Prof. Dr. Ralf Werner, Dr. Oleksandr Khomenko: Workshop on Replication in Life Insurance (in cooperation with ERGO Group)

Previous talks


Traditionally, public pension schemes, organized in a social security framework, use a pay-as-you-go (PAYG) technique; from the benefit point of view, they are based on a Defined Benefit (DB) or a Defined Contribution (DC) approach. This dichotomy reflects two extreme philosophies of risk spreading between the stakeholders: in DB, the organizer of the plan bears the risks; in DC (including notional defined contribution accounts, NDC), the affiliates must bear the risks. Especially when applied to social security, this traditional polar view can lead to an unfair intergenerational equilibrium in both cases. The purpose of this presentation is to propose, in PAYG, alternative hybrid architectures based on a mix between DB and DC, in order to achieve simultaneously financial sustainability and social adequacy in a stochastic environment. Using different stochastic models for the risk factors, we propose different levels of optimality in terms of the architecture of the pension scheme.

28.01.2019 15:00 Axel Bücher: Extreme Value Analysis of Multivariate Time Series: Multiple Block Sizes and Overlapping Blocks

The core of the classical block maxima method in (multivariate) extreme value statistics consists of fitting an extreme value distribution to a sample of maxima over blocks extracted from an underlying time series. Traditionally, the maxima are taken over disjoint blocks of observations of a fixed size. Alternatively, the blocks can be chosen to be of varying size and to slide through the observation period, yielding a larger number of overlapping blocks. Nonparametric estimation of extreme value copulas based on sliding blocks is found to be more efficient than estimation based on disjoint blocks.
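The difference between the two blocking schemes is easy to illustrate. The sketch below is a generic illustration of disjoint versus sliding block maxima, not the estimators analyzed in the talk:

```python
import numpy as np

def disjoint_block_maxima(x, b):
    """Maxima over consecutive non-overlapping blocks of length b."""
    n = (len(x) // b) * b                      # drop an incomplete last block
    return x[:n].reshape(-1, b).max(axis=1)

def sliding_block_maxima(x, b):
    """Maxima over all length-b windows sliding through the sample."""
    return np.array([x[i:i + b].max() for i in range(len(x) - b + 1)])

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
dj = disjoint_block_maxima(x, 50)   # 20 disjoint block maxima
sl = sliding_block_maxima(x, 50)    # 951 overlapping block maxima
```

The sliding scheme produces many more (dependent) maxima from the same sample, which is the source of the efficiency gain mentioned in the abstract; the disjoint maxima are the subset of sliding maxima whose windows start at multiples of the block size.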

28.01.2019 16:00 Jean-David Fermanian: On Kendall's regression

Conditional Kendall's tau is a measure of dependence between two random variables, conditionally on some covariates. We assume a regression-type relationship between conditional Kendall's tau and some covariates, in a parametric setting with a large number of transformations of a small number of regressors. This model may be sparse, and the underlying parameter is estimated through a penalized criterion. We prove non-asymptotic bounds with explicit constants that hold with high probabilities. We derive the consistency of a two-step estimator, its asymptotic law and some oracle properties. We show how the problem of estimating conditional Kendall's tau can be rewritten as a classification task. We detail specific algorithms adapting usual machine learning techniques, including nearest neighbors, decision trees, random forests and neural networks, to the setting of the estimation of conditional Kendall's tau. Finite sample properties of these estimators and their sensitivities to each component of the data-generating process are assessed in a simulation study. Finally, we apply all these estimators to a dataset of European stock indices.
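The classification reformulation rests on the identity tau = 2 P(concordant) - 1: each pair of observations is labeled concordant or discordant, and any classifier estimating the concordance probability from covariates yields an estimate of conditional Kendall's tau. A minimal sketch of the unconditional version (illustrative only, not the paper's penalized estimator):

```python
import numpy as np
from itertools import combinations

def kendall_tau(x, y):
    """Empirical Kendall's tau: average sign of (x_i - x_j)(y_i - y_j)
    over all pairs i < j (no ties assumed)."""
    n = len(x)
    s = sum(np.sign((x[i] - x[j]) * (y[i] - y[j]))
            for i, j in combinations(range(n), 2))
    return s / (n * (n - 1) / 2)

# Classification view: pair (i, j) gets label 1 if concordant, 0 if
# discordant; tau = 2 * P(concordant) - 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 2.0, 4.0])
tau = kendall_tau(x, y)   # 5 concordant, 1 discordant pair -> 4/6
```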

03.12.2018 14:15 Miguel de Carvalho: Nonstationary Joint Extremes

In this talk, I will discuss key ideas on time-changing extremal dependence structures. Extremal dependence between international stock markets is of particular interest in today's global financial landscape. However, previous studies have shown that this dependence is not necessarily stationary over time. We concern ourselves with modeling extreme value dependence when that dependence changes over time or another suitable covariate. Working within a framework of asymptotic dependence, we introduce a regression model for the angular density of a bivariate extreme value distribution that allows us to assess how extremal dependence evolves over a covariate. We apply the proposed model to assess the dynamics governing extremal dependence of some leading European stock markets over the last three decades, and find evidence of an increase in extremal dependence over recent years.
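A simple diagnostic for time-varying extremal dependence is the empirical tail-dependence coefficient computed over rolling windows. This is only a rough sketch of the idea, not the angular-density regression model of the talk:

```python
import numpy as np

def empirical_chi(x, y, u=0.95):
    """Empirical coefficient of tail dependence:
    P(X > q_x(u), Y > q_y(u)) / (1 - u); values near 1 suggest strong
    extremal dependence, values near 0 suggest asymptotic independence."""
    qx, qy = np.quantile(x, u), np.quantile(y, u)
    return np.mean((x > qx) & (y > qy)) / (1 - u)

rng = np.random.default_rng(0)
z = rng.standard_normal(1000)
chi_dep = empirical_chi(z, z)                          # comonotone: near 1
chi_ind = empirical_chi(z, rng.standard_normal(1000))  # independent: near 0
```

In practice one would evaluate such a statistic over rolling windows (or smooth it over a covariate) to see how extremal dependence evolves.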

03.12.2018 15:00 Matti Kiiski: Pathwise Pricing-Hedging Duality

We discuss weak topologies on the Skorokhod space of cadlag functions. In particular, we study the weak* topology they induce on the family of probability measures on the canonical space and give applications to the pathwise pricing-hedging duality. We also discuss related open problems.

03.12.2018 16:00 Miriam Isabel Seifert: Financial risk measures for a network of individual agents holding portfolios of light-tailed objects

In this talk we consider a financial network of agents holding portfolios of independent light-tailed risky objects with losses assumed to be asymptotically exponentially distributed with distinct tail parameters. The derived asymptotic distributions of portfolio losses refer to the class of functional exponential mixtures. We also provide statements for Value-at-Risk and Expected Shortfall measures as well as for their conditional counterparts. We establish important qualitative differences in the asymptotic behavior of portfolio risks under light tail assumption compared to heavy tail settings which should be accounted for in practical risk management. (joint work with Claudia Klüppelberg)
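As a toy illustration of the light-tailed setting, the Monte Carlo sketch below evaluates Value-at-Risk and Expected Shortfall for a portfolio of independent exponential losses with distinct rates (an illustrative stand-in, not the paper's asymptotic formulas):

```python
import numpy as np

rng = np.random.default_rng(1)
rates = np.array([1.0, 2.0, 5.0])        # distinct tail parameters
# Portfolio loss: sum of independent exponentials with means 1/rate.
losses = rng.exponential(1 / rates, size=(100_000, 3)).sum(axis=1)

alpha = 0.99
var_99 = np.quantile(losses, alpha)       # Value-at-Risk at level 99%
es_99 = losses[losses >= var_99].mean()   # Expected Shortfall at level 99%
```

The sum's tail is again asymptotically exponential with the smallest rate (the heaviest of the light tails dominates), so high quantiles grow only logarithmically in 1/(1 - alpha), in sharp contrast to the power growth seen under heavy tails.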

05.11.2018 14:15 Thorsten Rheinländer: On the stochastic heat equation with multiplicative noise

We study a parsimonious but non-trivial model of the latent limit order book where orders get placed with a fixed displacement from a center price process, i.e. some process between the best bid and best ask, and get executed whenever this center price reaches their level. This mechanism corresponds to the fundamental solution of the stochastic heat equation with multiplicative noise for the relative order volume distribution, for which we provide a solution via a local time functional. Moreover, we classify various types of trades and introduce the trading excursion process, which is a Poisson point process. This allows us to derive the Laplace transforms of the times to various trading events under the corresponding intensity measure.

05.11.2018 15:00 Gonçalo dos Reis: Large Deviations for McKean-Vlasov Equations and Importance Sampling

We discuss two Freidlin-Wentzell large deviation principles for McKean-Vlasov equations (MV-SDEs) in certain path-space topologies. The equations have a drift of polynomial growth, and an existence/uniqueness result is provided. We then apply Monte Carlo methods to evaluate expectations of functionals of solutions to MV-SDEs with drifts of super-linear growth. We assume that the MV-SDE is approximated in the standard manner by an interacting particle system and propose two importance sampling (IS) techniques to reduce the variance of the resulting Monte Carlo estimator. In the "complete measure change" approach, the IS measure change is applied simultaneously in the coefficients and in the expectation to be evaluated. In the "decoupling" approach, we first estimate the law of the solution in a first set of simulations without measure change and then perform a second set of simulations under the importance sampling measure, using the approximate solution law computed in the first step.
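The standard interacting-particle approximation replaces the law of the solution by the empirical measure of the particles. The sketch below runs an Euler scheme for a simple mean-field equation (the model and parameters are illustrative, not those of the talk):

```python
import numpy as np

def mv_sde_particles(n_particles=500, n_steps=200, T=1.0, seed=0):
    """Euler scheme for the mean-field SDE
        dX_t = -(X_t - E[X_t]) dt + dW_t,
    with E[X_t] replaced by the empirical mean of the particle system,
    the standard interacting-particle approximation of an MV-SDE."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = rng.standard_normal(n_particles)            # X_0 ~ N(0, 1)
    for _ in range(n_steps):
        drift = -(x - x.mean())                     # mean-field interaction
        x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(n_particles)
    return x

x_T = mv_sde_particles()   # particle positions at time T
```

For this linear example the variance relaxes towards its stationary level 1/2 while the mean stays (up to noise) at its initial value; importance sampling would enter by changing the driving noise in a second simulation pass.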

02.07.2018 15:00 Birgit Rudolf, Wien: Time consistency of the mean-risk problem

Multivariate risk measures appear naturally in markets with transaction costs or when measuring the systemic risk of a network of banks. Recent research suggests that time consistency of these multivariate risk measures can be interpreted as a set-valued Bellman principle. Moreover, a more general structure emerges that might also be important for other applications and is interesting in itself. In this talk I will show that this set-valued Bellman principle also holds for the dynamic mean-risk portfolio optimization problem. In most of the literature, the Markowitz problem is scalarized, and it is well known that this scalarized problem does not satisfy the (scalar) Bellman principle. However, when we do not scalarize the problem but leave it in its original form as a vector optimization problem, the upper images, whose boundary is the efficient frontier, recurse backwards in time under very mild assumptions. I will present conditions under which this recursion can be exploited directly to compute a solution in the spirit of dynamic programming, and will state some open problems and challenges for the general case. (Joint work with Gabriela Kováčová)

02.07.2018 16:00 Dr. Nils Detering (UCSB): An Integrated Model of Fire Sales and Default Contagion in Financial Systems

In (Detering et al., 2016) and (Detering et al., 2018) we developed a random graph model for 'default contagion' in financial networks and, using 'law of large numbers' effects, we were able to compute the size of the final default cluster induced by an arbitrarily given initial shock in a large system. Further, we were able to derive sufficient and necessary criteria for resilience of a system to small shocks. In that sense our model provides better insights than the popular Eisenberg-Noe model, which is concerned with existence and uniqueness of a clearing vector (and hence the final state of the system) but gives no indication of favorable network structures and sufficient capital requirements to ensure resilience. The Eisenberg-Noe model, however, has proven to be flexible enough to be extended by contagion channels other than default contagion, the most important being 'fire sales', which describes contagion effects due to falling asset prices as institutions sell off their assets. In this article, we first propose a model for fire sales that uses an Eisenberg-Noe like description for finite networks but makes it possible to describe the final state of the system (size of the default cluster and final price impact) asymptotically. In particular, we are able to provide sufficient capital requirements that ensure resilience of the system. Furthermore, we integrate the channel of default contagion into our model, applying results from (Detering et al., 2018) and extending them to the non-continuous case induced by the fire sales. Finally, for this integrated setting, we provide criteria that determine whether a certain financial system is resilient or prone to small initial shocks, and furthermore give sufficient capital requirements for financial systems to be resilient. This is joint work with Thilo Meyer-Brandis, Konstantinos Panagiotou and Daniel Ritter.
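On a finite network, the default-contagion channel alone can be sketched as a simple threshold cascade over an exposure matrix (a toy fixed-point iteration, not the asymptotic random-graph analysis of the papers, and without the fire-sales channel):

```python
import numpy as np

def default_cascade(exposure, capital, initially_defaulted):
    """Iterate the contagion map: bank i defaults once its losses
    sum_j exposure[i, j] * 1{j defaulted} reach its capital buffer.
    Returns the final default indicator vector (the fixed point)."""
    defaulted = np.asarray(initially_defaulted, dtype=bool)
    while True:
        losses = exposure[:, defaulted].sum(axis=1)
        new = (losses >= capital) & ~defaulted
        if not new.any():
            return defaulted
        defaulted = defaulted | new

# Toy chain: bank 1 is exposed to bank 0, bank 2 to bank 1; shocking
# bank 0 wipes out the whole chain when capital buffers are thin.
E = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
cascade = default_cascade(E, np.array([0.5, 0.5, 0.5]), [True, False, False])
```

With larger capital buffers (e.g. capital 2.0 for every bank) the same shock stays confined to the initially defaulted bank, which is the finite-network analogue of the resilience criteria discussed in the abstract.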

11.06.2018 14:15 Erick Trevino Aguilar, Universidad de Guanajuato: Integral functionals of cadlag processes and partial superhedging of American options

In this talk we present advances in convex analysis and obtain a novel interchange rule for convex functionals defined over cadlag processes. This interchange rule allows us to develop convex duality for a rich class of convex problems in general stochastic settings and requires a careful analysis of set-valued mappings and their cadlag selections. As an application, we develop the dual problem of partial hedging of American options.

07.05.2018 14:15 Prof. Michael Ludkowski: Marrying Stochastic Control and Machine Learning: from Bermudan Options to Natural Gas Storage and Microgrids

Simulation-based strategies bring the machine learning toolbox to the numerical resolution of stochastic control models. I will begin by reviewing the history of this idea, starting with the seminal work by Longstaff-Schwartz and through the popularized Regression Monte Carlo framework. I will then describe the Dynamic Emulation Algorithm (DEA) that we developed, which unifies the different existing approaches in a single modular template and emphasizes the two central aspects of regression architecture and experimental design. Among novel DEA implementations, I will discuss Gaussian process regression, as well as numerous simulation designs (space-filling, sequential, adaptive, batched). The overall DEA template is illustrated with multiple examples drawing from Bermudan option pricing, natural gas storage valuation, and optimal control of a back-up generator in a power microgrid. This is partly joint work with Aditya Maheshwari (UCSB).
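The Regression Monte Carlo idea, regressing continuation values on basis functions of the state and exercising when the immediate payoff beats the fit, can be sketched for a Bermudan put with a plain polynomial basis (the DEA framework of the talk generalizes exactly this regression architecture and the simulation design):

```python
import numpy as np

def lsm_bermudan_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_steps=50, n_paths=20_000, seed=0):
    """Longstaff-Schwartz pricing of a Bermudan put under Black-Scholes
    dynamics: backward induction with a quadratic regression of the
    discounted continuation value on the asset price."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    payoff = np.maximum(K - S[:, -1], 0.0)           # value at maturity
    for t in range(n_steps - 2, -1, -1):
        payoff *= np.exp(-r * dt)                    # discount one step back
        itm = K - S[:, t] > 0                        # regress on ITM paths only
        if itm.sum() > 3:
            basis = np.vander(S[itm, t], 3)          # columns: S^2, S, 1
            coef, *_ = np.linalg.lstsq(basis, payoff[itm], rcond=None)
            cont = basis @ coef                      # fitted continuation value
            exercise = (K - S[itm, t]) > cont
            payoff[itm] = np.where(exercise, K - S[itm, t], payoff[itm])
    return np.exp(-r * dt) * payoff.mean()

price = lsm_bermudan_put()   # roughly the American put value, ~6 here
```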

07.05.2018 15:00 Dr. Tobias Kley, Berlin: Quantile-Based Spectral Analysis of Time Series

Classical methods for the spectral analysis of time series account for covariance-related serial dependencies. This talk will begin with a brief introduction to these traditional procedures. Then, an alternative method is presented where, instead of covariances, differences between copulas of pairs of observations and the independence copula are used to quantify serial dependencies. The Fourier transformation of these copulas is considered and used to define quantile-based spectral quantities. They allow one to separate marginal and serial aspects of a time series and intrinsically provide more information about the conditional distribution than the classical location-scale model. Thus, quantile-based spectral analysis is more informative than the traditional spectral analysis based on covariances. For an observed time series, the new spectral quantities are then estimated. The asymptotic properties, including the order of the bias and process convergence, of the estimator (a function of two quantile levels) are established. The results are applicable without restrictive distributional assumptions, such as the existence of finite moments; only a weak form of mixing, such as alpha-mixing, is required.
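The building block of the quantile-based approach can be sketched for a single quantile level: replace the series by the indicator of its ranks falling below a threshold, and Fourier-transform the resulting autocovariances (a simplified illustration; the talk's estimator of the two-quantile-level spectrum is considerably more refined):

```python
import numpy as np

def quantile_autocov(x, tau, max_lag):
    """Autocovariances of the indicator series 1{rank(x_t)/(n-1) <= tau},
    the copula-based analogue of the classical autocovariance."""
    n = len(x)
    u = np.argsort(np.argsort(x)) / (n - 1)        # normalized ranks in [0, 1]
    a = (u <= tau).astype(float)
    return np.array([np.cov(a[:n - k], a[k:])[0, 1]
                     for k in range(max_lag + 1)])

def quantile_spectrum(gamma, freq):
    """Truncated Fourier transform
    (gamma_0 + 2 * sum_k gamma_k cos(freq * k)) / (2 pi)."""
    k = np.arange(1, len(gamma))
    return (gamma[0] + 2 * (gamma[1:] * np.cos(freq * k)).sum()) / (2 * np.pi)

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)              # iid: the spectrum is flat
g = quantile_autocov(x, 0.5, 10)           # g[0] ~ tau(1 - tau) = 0.25
f0 = quantile_spectrum(g, 0.0)             # ~ tau(1 - tau) / (2 pi)
```

No moments of x are used anywhere: everything is computed from ranks, which is the source of the robustness claimed in the abstract.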

07.05.2018 16:00 Dr. Gregor Kastner, Wien: Bayesian Time-Varying Covariance Estimation in Many Dimensions using Sparse Factor Stochastic Volatility Models

We address the curse of dimensionality in dynamic covariance estimation by modeling the underlying co-volatility dynamics of a time series vector through latent time-varying stochastic factors. The use of a global-local shrinkage prior for the elements of the factor loadings matrix pulls loadings on superfluous factors towards zero. To demonstrate the merits of the proposed framework, the model is applied to simulated data as well as to daily log-returns of 300 S&P 500 members. Our approach yields precise correlation estimates, strong implied minimum variance portfolio performance and superior forecasting accuracy in terms of log predictive scores when compared to typical benchmarks. Furthermore, we discuss the applicability of the method to capture conditional heteroskedasticity in large vector autoregressions.
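The dimension reduction at the heart of the model is that an m x m covariance matrix is driven by r << m latent factors. The sketch below reconstructs a time-t covariance from loadings and factor log-variances (numbers are made up; the Bayesian estimation of these quantities is the subject of the talk):

```python
import numpy as np

def factor_covariance(loadings, factor_logvar, idio_var):
    """Time-t covariance implied by a factor stochastic volatility model:
    Sigma_t = Lambda diag(exp(h_t)) Lambda' + diag(idiosyncratic vars).
    With r factors, an m x m covariance needs only O(m r) parameters."""
    L = np.asarray(loadings)                   # m x r loadings matrix
    D = np.diag(np.exp(factor_logvar))         # r factor variances at time t
    return L @ D @ L.T + np.diag(idio_var)

# 3 series, 2 factors; a shrinkage prior would pull small loadings to 0.
L = np.array([[1.0, 0.0],
              [0.8, 0.3],
              [0.1, 1.0]])
Sigma = factor_covariance(L, np.array([-0.2, 0.1]), np.array([0.1, 0.1, 0.1]))
```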