Department of Mathematical Sciences

Universitetsparken 5

2100 Copenhagen

Denmark

I am a PhD student in Statistics at the Copenhagen Causality Lab, working with Prof. Jonas Peters.

I hold two bachelor's degrees, in Mathematics and in Mathematical Economics, and a master's degree in Statistics, all from the University of Copenhagen. During my studies, I spent semesters at Aarhus University, the University of Hong Kong, ETH Zürich and the Massachusetts Institute of Technology.

I am interested in Mathematical Statistics, in particular in causal inference, graphical models, machine learning, optimization and dynamical systems. Feel free to reach out!

## news

Jun 2, 2022 My research visit at the ClinicalML lab is coming to an end, and I have had a great time here! I'm very grateful to David Sontag for hosting me! Together with Michael Oberst, I have been working on an interesting project: Evaluating Robustness to Dataset Shift via Parametric Robustness Sets. We argue that considering bounded parametric shifts allows for interpreting the worst-case performance of a model, and we propose a second-order approximation of the loss under shift, which has a particularly pretty interpretation when the shift occurs in an exponential family model. [PDF] [Code]

Great news: our paper Invariant Ancestry Search, with Phillip Mogensen and Jonas Peters, was accepted at ICML 2022! We look forward to presenting our work and to seeing many other interesting talks.

New paper Identifying Causal Effects using Instrumental Time Series: Nuisance IV and Correcting for the Past is online now! In this paper, we provide a structured treatment of instrumental variable (IV) methods in time series data by showing that valid IV models can be identified by representing the time series as a finite graph. We emphasize the need to adjust for past states of the time series to obtain valid instruments, and develop Nuisance IV, a modification of standard IV estimators. [PDF] [Code]

In our new paper Invariant Ancestry Search, we learn causal ancestors by exploiting heterogeneity in data. We test sets for minimal invariance and show that the union of minimally invariant sets is ancestral. Our method retains level guarantees even when the dimensionality is too large to search all sets exhaustively, and we apply the method to a genetic dataset with 6000 genes. [PDF] [Code]

In our new work, Local Independence Testing in Point Processes, we develop methods for learning causal structure in complex event data, such as the firing of neurons in the brain. [PDF] [Code]

When dealing with policy data from several environments, some predictors may perform well in some environments, but not in others.
In our new paper Invariant Policy Learning, we show that learning invariant sets of predictors allows us to learn policies that generalize well to new environments. [PDF]

What do you do if you observe data from a distribution Q but want to test a hypothesis about a different, but related, distribution P? In our recent paper, Statistical Testing under Distributional Shifts, we show how to test hypotheses about P by applying a test to an auxiliary dataset drawn by weighted resampling. [PDF]

Predictive models are great, but models that predict well when environments change are even better! Our new work on generalization from proxy variables, Regularizing towards Causal Invariance, with Michael Oberst, Jonas Peters and David Sontag, is now on arXiv. [PDF]

Are p-values better than raw regression coefficients for detecting causal links? Not always, as we demonstrate in the paper Causal Structure Learning from Time Series, which is now on arXiv. [PDF]

In October, we participated in the Causality4Climate NeurIPS challenge, and it was just announced that we won! Thanks to the organizers for hosting, and for letting us present our winning solution at NeurIPS! [Code solutions] [Slides] [University news]
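As a rough illustration of the invariance testing that underlies work like Invariant Ancestry Search and Invariant Policy Learning, the toy sketch below checks whether the residuals of a regression of Y on a candidate predictor set look the same across two environments. The data-generating model, variable names, and the specific residual tests are my own illustrative choices, not the papers' implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy two-environment SCM (hypothetical illustration, not the papers' setup):
# the environment shifts X1; X1 -> Y -> X2, so X1 is an ancestor of Y and
# X2 is a descendant. An invariant predictor set should contain only ancestors.
def sample(shift, n=500):
    x1 = shift + rng.normal(0.0, 1.0, n)   # directly affected by the environment
    y = x1 + rng.normal(0.0, 1.0, n)       # Y is caused by X1
    x2 = y + rng.normal(0.0, 1.0, n)       # X2 is a child of Y
    return x1, x2, y

x1_a, x2_a, y_a = sample(shift=0.0)
x1_b, x2_b, y_b = sample(shift=2.0)
env = np.repeat([0, 1], 500)
x1 = np.concatenate([x1_a, x1_b])
x2 = np.concatenate([x2_a, x2_b])
y = np.concatenate([y_a, y_b])

def invariance_pvalue(X, y, env):
    """Crude invariance check: if Y | X is invariant, pooled OLS residuals
    should be indistinguishable across environments."""
    design = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    r0, r1 = resid[env == 0], resid[env == 1]
    p_mean = stats.ttest_ind(r0, r1, equal_var=False).pvalue  # equal means?
    p_var = stats.levene(r0, r1).pvalue                       # equal spread?
    return min(1.0, 2.0 * min(p_mean, p_var))                 # Bonferroni combination

p_anc = invariance_pvalue(x1, y, env)   # {X1} is invariant: should not reject
p_desc = invariance_pvalue(x2, y, env)  # {X2} is not invariant: should reject
```

In this toy example, regressing on the descendant X2 gives residuals whose distribution shifts with the environment, so the test rejects, while the ancestor X1 passes.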
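The resampling idea behind Statistical Testing under Distributional Shifts can be sketched as follows: given a large sample from Q, draw a much smaller auxiliary dataset with importance weights proportional to p/q, and run an off-the-shelf test on it as if it came from P. The Gaussian setup and the choice of auxiliary sample size here are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Observed distribution Q = N(1, 1); target distribution P = N(0, 1).
# (A hypothetical toy setup; the paper treats general shifts and tests.)
n = 10_000
x = rng.normal(loc=1.0, scale=1.0, size=n)  # data from Q

# Importance weights w(x) proportional to p(x) / q(x)
w = stats.norm.pdf(x, loc=0.0, scale=1.0) / stats.norm.pdf(x, loc=1.0, scale=1.0)
w /= w.sum()

# Draw a much smaller auxiliary dataset by weighted resampling; taking
# m much smaller than n keeps the resample close to an i.i.d. draw from P.
m = int(np.sqrt(n))
x_aux = rng.choice(x, size=m, replace=True, p=w)

# Apply an ordinary test to the auxiliary data, as if it were drawn from P:
# here, a one-sample t-test of "the mean of P is 0".
t_stat, p_value = stats.ttest_1samp(x_aux, popmean=0.0)
```

Since the null hypothesis holds for P in this toy setup, the resulting p-value should typically be unremarkable, even though the raw data from Q would strongly reject a zero mean.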