### 10th World Congress in Probability and Statistics

## Contributed Session (live Q&A at Track 2, 9:30 PM KST)

## Financial Mathematics and Probabilistic Modeling

### Solving the selection-recombination equation: ancestral lines and duality

Frederic Alberti (Bielefeld University)

We consider the case of an arbitrary number of neutral loci, linked to a single selected locus. In this setting, we investigate how the (random) genealogical structure of the problem can be succinctly encoded by a novel "ancestral initiation graph", and how it gives rise to a recursive integral representation of the solution with a clear, probabilistic interpretation.
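The duality in the title can be read in the standard Markov-process sense; as a generic reminder (the notation $H$, $X$, $Y$ is ours, not the authors'), two processes $X$ and $Y$ are dual with respect to a function $H$ if

```latex
\[
  \mathbb{E}\bigl[ H(X_t, y) \mid X_0 = x \bigr]
  \;=\;
  \mathbb{E}\bigl[ H(x, Y_t) \mid Y_0 = y \bigr]
  \qquad \text{for all } x,\, y \text{ and } t \ge 0 .
\]
```

Relations of this type link a forward-in-time solution to a backward-in-time genealogical process, which is the role played here by the ancestral initiation graph.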

References:

- F. Alberti and E. Baake, "Solving the selection-recombination equation: ancestral lines under selection and recombination", https://arxiv.org/abs/2003.06831

- F. Alberti, E. Baake and C. Herrmann, "Selection, recombination, and the ancestral initiation graph", https://arxiv.org/abs/2101.10080

### Short time asymptotics for modulated rough stochastic volatility models

Barbara Pacchiarotti (Università degli studi di Roma "Tor Vergata")

### How to detect a salami slicer: a stochastic controller-stopper game with unknown competition

Kristoffer Lindensjö (Stockholm University)

### Q&A for Contributed Session 02

###### Session Chair

Hyungbin Park (Seoul National University)

## SDEs and Fractional Brownian Motions

### Weak rough-path type solutions for singular Lévy SDEs

Helena Katharina Kremp (Freie Universität Berlin)

### Functional limit theorems for approximating irregular SDEs, general diffusions and their exit times

Mikhail Urusov (University of Duisburg-Essen)

We present two results:

(1) A functional limit theorem (FLT) for weak approximation of the paths of arbitrary continuous Markov processes;

(2) An FLT for weak approximation of the paths and exit times.

The second FLT has a stronger conclusion but requires a stronger assumption, which is essential. We propose a new scheme, called EMCEL, which satisfies the assumption of the second FLT and thus allows us to approximate every one-dimensional continuous Markov process together with its exit times. The approach is illustrated by several examples with peculiar behavior: an irregular SDE for which the corresponding Euler scheme does not converge even weakly, a sticky Brownian motion, and a Brownian motion slowed down on the Cantor set.
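As context for why exit times are delicate: a discretization only observes the path at grid points, so it systematically overshoots the exit time when the path crosses the boundary between grid points. The following minimal Monte Carlo sketch (a generic Euler scheme for standard Brownian motion, not the EMCEL scheme; all names and parameters are ours) illustrates this for the exit of Brownian motion from $(-1, 1)$, whose exact mean exit time from the origin is $1$:

```python
import math
import random


def mean_exit_time_euler(x0=0.0, a=1.0, h=0.01, n_paths=2000,
                         t_max=20.0, seed=1):
    """Monte Carlo estimate of the mean exit time of standard Brownian
    motion from (-a, a), using an Euler scheme with step h.  For x0 = 0
    the exact value is E[tau] = a**2; discrete monitoring biases the
    estimate upward by O(sqrt(h))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, t = x0, 0.0
        # Step the path on the grid until it leaves (-a, a);
        # t_max caps the (exponentially unlikely) long excursions.
        while abs(x) < a and t < t_max:
            x += math.sqrt(h) * rng.gauss(0.0, 1.0)
            t += h
        total += t
    return total / n_paths


est = mean_exit_time_euler()
```

Shrinking `h` reduces the overshoot bias but raises the cost per path, which is one reason schemes tailored to exit times (such as EMCEL) are of interest.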

This is a joint work with Stefan Ankirchner and Thomas Kruse.

### Q&A for Contributed Session 07

###### Session Chair

Ildoo Kim (Korea University)

## Neural Networks and Deep Learning

### Simulated Annealing-Backpropagation Algorithm on Parallel Trained Maxout Networks (SABPMAX) in detecting credit card fraud

Sheila Mae Golingay (University of the Philippines-Diliman)

### The smoking gun: statistical theory improves neural network estimates

Sophie Langer (Technische Universität Darmstadt)

### Stochastic block model for multiple networks

Tabea Rebafka (Sorbonne Université)

### Deep neural networks for faster nonparametric regression models

Mehmet Ali Kaygusuz (The Middle East Technical University)

[1] B. Bauer and M. Kohler, "On deep learning as a remedy for the curse of dimensionality in nonparametric regression", The Annals of Statistics, 47(4), 2019, 2261-2285.

[2] B. Efron, "Bootstrap methods: another look at the jackknife", The Annals of Statistics, 7(1), 1979, 1-26.

[3] H. Bozdogan, "Model selection and Akaike's information criterion (AIC): the general theory and its analytical extensions", Psychometrika, 52(3), 1987, 345-370.

[4] B. Sen, M. Banerjee and M. Woodroofe, "Inconsistency of bootstrap: the Grenander estimator", The Annals of Statistics, 38(4), 2010, 1953-1977.

[5] J. Schmidt-Hieber, "Nonparametric regression using deep neural networks with ReLU activation function", The Annals of Statistics, 48(4), 2020, 1875-1897.

### Generative model for fBm with deep ReLU neural networks

Michael Allouche (Ecole Polytechnique)

### Q&A for Contributed Session 28

###### Session Chair

Jong-June Jeon (University of Seoul)