At the heart of actuarial science research
“Thought is only a flash in the middle of a long night. But this flash means everything.”
– Henri Poincaré (1854-1912)
I am currently a postdoctoral researcher at ISFA (Institut de Science Financière et d’Assurances) in Lyon, France. My research interests lie at the intersection of life insurance and quantitative finance. My current research projects include longevity modelling under model uncertainty, pricing and hedging equity-linked insurance products in incomplete markets, and data science applications for life insurance.
Prior to my postdoctoral position, I received my Ph.D. in Business Economics from KU Leuven (September 2019). During my Ph.D., under the supervision of Jan Dhaene, I worked on the market-consistent valuation of life insurance liabilities and was mainly involved in developing new valuation methods that merge market-consistency and actuarial judgement.
StanMoMo: An R package for Bayesian Mortality Modelling with Stan
Glad to share a new R package for Bayesian mortality modelling. The package provides functions to fit and forecast standard mortality models (Lee-Carter, APC, CBD, etc.) in a fully Bayesian setting. It also includes functions for model selection and model averaging based on leave-future-out validation.
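For reference, the standard models mentioned above can be written as follows, where $m_{x,t}$ is the central death rate and $q_{x,t}$ the death probability at age $x$ in year $t$ (a common textbook parameterization; identifiability constraints vary across implementations):

```latex
% Lee-Carter: age profile \alpha_x modulated by a period index \kappa_t
\log m_{x,t} = \alpha_x + \beta_x \,\kappa_t

% Age-Period-Cohort (APC): additive age, period and cohort effects
\log m_{x,t} = \alpha_x + \kappa_t + \gamma_{t-x}

% Cairns-Blake-Dowd (CBD): two period indices, linear in age
\operatorname{logit} q_{x,t} = \kappa_t^{(1)} + \kappa_t^{(2)}\,(x - \bar{x})
```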
Actuarial-consistency and two-step actuarial valuations: a new paradigm to insurance valuation
Motivated by solvency regulations, recent work has focused on financial risks and so-called market-consistent valuations, in which the financial market is the main driver and actuarial risks only appear in a second step. This paper goes against the tide and introduces the concept of actuarial-consistent valuations, where actuarial risks are at the core of the valuation. We propose a two-step actuarial valuation that is driven first by actuarial information and is actuarial-consistent. We also provide a detailed comparison between actuarial-consistent and market-consistent valuations.
Relying on one specific model can be too restrictive and lead to well-documented drawbacks, including model misspecification, parameter uncertainty and overfitting. In this paper, we consider a Bayesian Negative-Binomial model to account for parameter uncertainty and overdispersion. Moreover, model averaging based on out-of-sample performance is considered as a response to model misspecification and overfitting. Overall, we find that our approach outperforms standard Bayesian model averaging in terms of predictive performance and robustness.
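As a sketch of how a Negative-Binomial likelihood captures overdispersion (using the standard mean-dispersion parameterization; the paper's exact priors and constraints are not reproduced here), death counts $D_{x,t}$ with exposures $E_{x,t}$ and death rates $m_{x,t}$ satisfy:

```latex
D_{x,t} \sim \operatorname{NegBin}\!\left(E_{x,t}\, m_{x,t},\; \phi\right),
\qquad
\mathbb{E}[D_{x,t}] = E_{x,t}\, m_{x,t},
\qquad
\operatorname{Var}(D_{x,t}) = E_{x,t}\, m_{x,t}\left(1 + \frac{E_{x,t}\, m_{x,t}}{\phi}\right)
```

The dispersion parameter $\phi > 0$ lets the variance exceed the mean; the Poisson model is recovered in the limit $\phi \to \infty$.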
Parsimonious Predictive Mortality Modeling by Regularization and Cross-Validation with and without Covid-Type Effect
Most mortality models are very sensitive to the sample size and to perturbations in the data. In this paper, we show how regularization and cross-validation can be used to smooth and forecast the mortality surface. In particular, our approach outperforms the P-spline model in terms of prediction and is much more robust when a Covid-type effect is included.
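A generic form of such a regularized fit (illustrative only; the paper's specific penalty and cross-validation scheme are not reproduced here) minimizes a penalized Poisson deviance over the log-mortality surface $\theta = (\theta_{x,t})$:

```latex
\hat{\theta} = \arg\min_{\theta}\; \sum_{x,t} \left( E_{x,t}\, e^{\theta_{x,t}} - D_{x,t}\, \theta_{x,t} \right) + \lambda\, P(\theta)
```

where $P(\theta)$ penalizes roughness of the surface and the tuning parameter $\lambda$ is selected by cross-validation, trading goodness of fit against smoothness.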
Insurance valuation: A two-step generalized regression approach
There is still an open debate on how to appropriately define a "fair" value and a risk margin for long-term insurance liabilities. In this paper, we discuss how solvency constraints can be efficiently included in the hedging process and the risk margin.
Pricing equity-linked insurance by neural networks
In this paper, we price a portfolio of equity-linked contracts in a general incomplete actuarial-financial market. We show that the pricing problem can be handled efficiently using neural networks.