8–13 Aug 2022
Hörsaalzentrum Poppelsdorf
Europe/Berlin timezone

MLMC++ as a variance reduction method

11 Aug 2022, 10:00
20m
CP1-HSZ/1.004 (CP1-HSZ) - HS7 (CP1-HSZ)

Oral Presentation — Algorithms (including Machine Learning, Quantum Computing, Tensor Networks)

Speaker

Mr Mostafa Khalil (Wuppertal University)

Description

The trace of a matrix function $f(A)$, in our case the matrix inverse $A^{-1}$, can be estimated stochastically using samples $\tau^*A^{-1}\tau$, provided the components of the random vectors $\tau$ obey an appropriate probability distribution. For example, when $\tau$ is an i.i.d. random vector with each component taking the value $\pm 1$ with equal probability $0.5$, this is known as the Hutchinson estimator. This Hutchinson Monte Carlo sampling, however, suffers from the fact that the number of samples needed grows quadratically with the inverse of the target accuracy, making high-precision estimates very expensive. Since the variance of this approach depends roughly on $\|A^{-1}\|_F$, the challenge is to reduce that variance in some way.
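As a minimal illustration of the Hutchinson estimator described above (the function name and the small test matrix are our own choices, not part of the talk), one can average samples $\tau^* A^{-1} \tau$ with Rademacher vectors $\tau$:

```python
import numpy as np

def hutchinson_trace(apply_inv, n, num_samples, rng=None):
    """Estimate tr(A^{-1}) by averaging tau^* A^{-1} tau over Rademacher vectors tau."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(num_samples):
        tau = rng.choice([-1.0, 1.0], size=n)  # i.i.d. components, +1 or -1 with prob. 0.5
        total += tau @ apply_inv(tau)          # one stochastic sample of the trace
    return total / num_samples

# Usage: small SPD example, A = [[2, 1], [1, 2]], so tr(A^{-1}) = 4/3
A = np.array([[2.0, 1.0], [1.0, 2.0]])
est = hutchinson_trace(lambda v: np.linalg.solve(A, v), n=2, num_samples=20000, rng=0)
```

The slow convergence is visible here: the sample standard deviation shrinks only like the inverse square root of the number of samples, so each extra digit of accuracy costs a hundredfold more work.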

Recently, an enhancement of Hutchinson's method has been proposed, termed $\texttt{Hutch++}$, in which the sample space is enriched by several vectors of the form $A^{-1}x$, with $x$ a random vector as in Hutchinson's method. Theoretical analyses show that under certain circumstances the number of these added sample vectors can be chosen so as to reduce the dependence of the sample cost on the target accuracy $\epsilon$ from $\mathcal{O}(\epsilon^{-2})$ to $\mathcal{O}(\epsilon^{-1})$.
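A sketch of the $\texttt{Hutch++}$ idea, under our own simplifying assumptions (equal budgets $k$ for the sketch and the Hutchinson vectors; the function name and test matrix are hypothetical): the enrichment vectors $A^{-1}x$ span a subspace on which the trace is computed exactly, and plain Hutchinson is applied only to the deflated remainder, whose Frobenius norm — and hence variance — is much smaller:

```python
import numpy as np

def hutch_pp_trace(apply_inv, n, k, rng=None):
    """Hutch++-style estimate of tr(A^{-1}): exact trace on a sketched subspace
    plus a Hutchinson estimate on the deflated remainder.
    apply_inv(V) applies A^{-1} to each column of V."""
    rng = np.random.default_rng(rng)
    S = rng.choice([-1.0, 1.0], size=(n, k))  # sketch vectors, enriched as A^{-1} S
    G = rng.choice([-1.0, 1.0], size=(n, k))  # Hutchinson vectors for the remainder
    Q, _ = np.linalg.qr(apply_inv(S))         # orthonormal basis of span(A^{-1} S)
    t_low = np.trace(Q.T @ apply_inv(Q))      # exact trace on the captured subspace
    G_perp = G - Q @ (Q.T @ G)                # deflate: project G onto the complement
    t_rem = np.trace(G_perp.T @ apply_inv(G_perp)) / k
    return t_low + t_rem

# Usage: A^{-1} has 5 dominant eigenvalues, which the sketch captures;
# tr(A^{-1}) = 5 * 100 + 35 * 0.01 = 500.35
d = np.array([0.01] * 5 + [100.0] * 35)
A = np.diag(d)
est = hutch_pp_trace(lambda V: np.linalg.solve(A, V), n=40, k=20, rng=0)
```

When $A^{-1}$ has a few dominant singular values, as in this toy example, the deflated remainder is tiny and the estimate is far more accurate than plain Hutchinson with the same number of applications of $A^{-1}$.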

In this talk, we combine $\texttt{Hutch++}$ with our recently suggested multigrid multilevel Monte Carlo approach. We will present results showing that the two approaches act additively, yielding an overall variance reduction that cannot be obtained by either approach alone.

Primary author

Mr Mostafa Khalil (Wuppertal University)

Co-author

Prof. Andreas Frommer (Bergische Universität Wuppertal)

Presentation materials