My research lies at the intersection of actuarial science and quantitative finance, ranging from the study of stochastic processes to the statistical analysis of real-world problems.
Current and recent projects
Selecting an appropriate longevity model and detecting abrupt changes in longevity dynamics are critical for the insurance industry.
In the first part of this project, we investigate the model selection problem using regularization and Bayesian statistics.
- In the first paper, we propose to smooth and forecast the mortality surface using regularization, which reduces overfitting; we show that this approach outperforms the P-splines method.
- In the second paper, we propose two Bayesian model averaging techniques based on leave-future-out validation. We show that such approaches tend to outperform standard BMA in terms of robustness and accuracy.
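The regularized smoothing idea in the first paper can be illustrated with a toy sketch (synthetic data and a simple Whittaker-style second-difference penalty, not the paper's actual estimator): roughness in noisy log-mortality rates is shrunk by solving a penalized least-squares problem.

```python
import numpy as np

# Toy sketch of smoothing by regularization (synthetic Gompertz-like data):
# min_z ||y - z||^2 + lam * ||D z||^2, where D takes second differences.
rng = np.random.default_rng(0)
ages = np.arange(60, 90)
true_log_mx = -9.0 + 0.09 * (ages - 60)                    # linear trend in age
y = true_log_mx + rng.normal(scale=0.05, size=ages.size)   # noisy observations

n = y.size
D = np.diff(np.eye(n), n=2, axis=0)      # second-difference operator, (n-2) x n
lam = 10.0                               # regularization strength
z = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)  # penalized least squares

# The penalty shrinks roughness: the smoothed curve has smaller second
# differences than the raw observations.
assert np.abs(np.diff(z, 2)).sum() < np.abs(np.diff(y, 2)).sum()
```

The penalty weight `lam` plays the role of the regularization parameter that would be chosen by cross-validation or a Bayesian prior in practice.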
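The leave-future-out weighting in the second paper can be sketched as follows (the scores below are invented for illustration; the paper's techniques differ in detail): each model receives a weight proportional to the exponential of its out-of-sample log predictive score on held-out future years, a pseudo-BMA style scheme.

```python
import math

# Hypothetical leave-future-out (LFO) expected log predictive densities
# for three mortality models; larger is better.
lfo_elpd = {"LC": -210.4, "APC": -208.9, "CBD": -215.1}

m = max(lfo_elpd.values())                       # stabilise the exponentials
raw = {k: math.exp(v - m) for k, v in lfo_elpd.items()}
total = sum(raw.values())
weights = {k: r / total for k, r in raw.items()}

assert abs(sum(weights.values()) - 1.0) < 1e-9
assert max(weights, key=weights.get) == "APC"    # best out-of-sample score wins
```

Unlike standard BMA, the scores here are computed on future (out-of-sample) data rather than in-sample marginal likelihoods, which is what drives the robustness gains described above.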
In the second part of the project, we tackle the change-point detection problem for longevity trends and for multiple sensors (corresponding to the monitoring of different age groups or different classes of policyholders). A sudden change in the longevity trend can have serious financial consequences, and it is necessary to react as soon as the data suggest it. The specific context of COVID-19 and its potential impact on the longevity trend will also be covered. We therefore aim to provide optimal surveillance strategies suited to longevity risk.
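As an illustration of sequential surveillance of this kind, here is a plain one-sided CUSUM on synthetic annual mortality-improvement rates (a textbook scheme, not necessarily the procedure developed in the project): an alarm is raised once the cumulative deviation from the target improvement exceeds a threshold.

```python
# Synthetic improvement rates: the 2% annual trend breaks at year 20.
improvements = [0.02] * 20 + [0.00] * 10
target, slack, h = 0.02, 0.005, 0.03    # in-control mean, allowance, threshold

s, alarm_at = 0.0, None
for t, x in enumerate(improvements):
    s = max(0.0, s + (target - x) - slack)  # accumulate downward deviations
    if s > h and alarm_at is None:
        alarm_at = t                         # first year the alarm fires

# The alarm fires shortly after the true change at year 20, not before.
assert alarm_at is not None and alarm_at >= 20
```

The `slack` parameter trades off detection delay against false alarms, the central tension in any surveillance strategy for longevity risk.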
Pricing life insurance contracts with financial death and survival guarantees boils down to solving a high-dimensional pricing problem. In this project, we show that the price is the solution of a multi-dimensional partial differential equation (PDE) in continuous time. Using the connection between PDEs and backward stochastic differential equations (BSDEs), we solve the problem numerically with deep neural networks.
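The PDE/probability connection that BSDE solvers exploit can be illustrated in one dimension (a toy Black-Scholes example, not the paper's high-dimensional deep scheme): by Feynman-Kac, the solution of the pricing PDE equals a discounted expectation under the risk-neutral measure, which Monte Carlo simulation can approximate.

```python
import math
import numpy as np

# Toy parameters for a European call (all values illustrative).
S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.2, 1.0

# Closed-form Black-Scholes price, i.e. the solution of the pricing PDE.
d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
pde_price = S0 * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

# Monte Carlo price via the probabilistic (Feynman-Kac) representation.
rng = np.random.default_rng(1)
Z = rng.standard_normal(400_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * Z)
mc_price = math.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

assert abs(mc_price - pde_price) < 0.1   # the two representations agree
```

Deep BSDE methods push this same correspondence into high dimensions, replacing the closed form with neural networks trained on simulated paths.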
Collaborator: Lukasz Delong
Current approaches to fair valuation in insurance often follow a two-step procedure, combining quadratic hedging with the application of a risk measure on the residual liability to obtain a cost-of-capital margin. In such approaches, the preferences represented by the regulatory risk measure are not reflected in the hedging process.

We address this issue with an alternative two-step hedging procedure, based on generalised regression arguments, which leads to portfolios that are neutral with respect to a risk measure such as Value-at-Risk or the expectile. First, a portfolio of traded assets aimed at replicating the liability is determined by local quadratic hedging. Second, the residual liability is hedged using an alternative objective function. The risk margin is then defined as the cost of the capital required to hedge the residual liability. When quantile regression is used in the second step, yearly solvency constraints are naturally satisfied; furthermore, the portfolio is a risk minimiser among all hedging portfolios that satisfy such constraints.

We present a neural network algorithm for the valuation and hedging of insurance liabilities based on a backward iteration scheme. The algorithm is fairly general and easily applicable, as it only requires simulated paths of the risk drivers.
Published paper on Insurance valuation: a two-step generalized regression approach
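A stylised one-period sketch of the two-step idea (all numbers synthetic; the actual method works dynamically with neural networks): step 1 finds a quadratic, least-squares hedge of the liability with a traded asset; step 2 applies a quantile (Value-at-Risk) to the residual, playing the role of the capital held against unhedgeable risk.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
asset = rng.normal(1.0, 0.2, n)                    # traded asset payoff
liability = 2.0 * asset + rng.normal(0.0, 0.5, n)  # partly hedgeable claim

# Step 1: quadratic hedge -- regress the liability on the asset payoff.
X = np.column_stack([np.ones(n), asset])
coef, *_ = np.linalg.lstsq(X, liability, rcond=None)
residual = liability - X @ coef

# Step 2: risk-measure step -- capital = 99% quantile of the residual loss.
capital = np.quantile(residual, 0.99)

# The hedge removes the tradeable part; the capital covers the residual
# risk at the chosen confidence level.
assert abs(coef[1] - 2.0) < 0.05
assert np.mean(residual <= capital) >= 0.99
```

Replacing the quantile with an expectile, or the least-squares step with quantile regression, changes the risk measure the final portfolio is neutral with respect to.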
- Estimation and forecasting of popular mortality models (Lee-Carter, APC, CBD and extensions) in a full Bayesian framework.
- Model averaging techniques based on leave-future-out validation (out-of-sample validation).
- Plotting functions (ggplot).
- Functions to conduct simulation studies.
- Data extraction from the Human Mortality Database.
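As a minimal non-Bayesian illustration of the Lee-Carter structure the package estimates, log m_{x,t} = a_x + b_x k_t, here is a classical SVD fit on synthetic data (the package itself performs full Bayesian estimation):

```python
import numpy as np

# Synthetic mortality surface following the Lee-Carter structure.
rng = np.random.default_rng(3)
ages, years = 40, 30
a = np.linspace(-8.0, -2.0, ages)        # age profile a_x
b = np.full(ages, 1.0 / ages)            # age sensitivities b_x (sum to 1)
k = -0.5 * np.arange(years)              # declining period index k_t
log_m = a[:, None] + b[:, None] * k[None, :] + rng.normal(0, 0.01, (ages, years))

# Classical estimation: a_x = row means, (b_x, k_t) from the first
# singular component of the centred matrix.
a_hat = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()          # identifiability: sum of b_x = 1
k_hat = s[0] * Vt[0] * U[:, 0].sum()     # rescale k_t accordingly

# The first singular component recovers the latent period effect.
assert np.corrcoef(k_hat, k)[0, 1] ** 2 > 0.99
```

The Bayesian versions in the package replace this point estimate with full posterior distributions, which is what makes coherent forecast intervals and model averaging possible.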
Peer reviewer for Insurance: Mathematics and Economics, Scandinavian Actuarial Journal, North American Actuarial Journal, ASTIN Bulletin, Annals of Actuarial Science, European Actuarial Journal, Forecasting, Acta Applicandae Mathematicae, and International Journal of Environmental Research and Public Health (all verified by Publons).