Multilevel particle system methods to sample target distributions
MOSSINELLI, GIACOMO
2023/2024
Abstract
A popular task in many areas of statistics, computational physics and data science is to generate samples from a target distribution $\pi$. This has numerous applications in various contexts, such as Bayesian inference and Machine Learning. In addition to the classic Markov Chain Monte Carlo (MCMC), new methods have recently emerged. In particular, several stochastic models have been proposed to transform a set of particles from an initial distribution $\rho_0$ to the target $\pi$. In this context, the need to make the best possible use of one's computational resources becomes increasingly important as the complexity and size of the systems to be studied increase. For this reason, in this master project we aim to develop alternative versions of some recently developed algorithms in order to reduce their computational cost. To optimise the performance of these methods, this thesis proposes multilevel structures which, by exploiting different levels of approximation, can reduce the overall computational cost while preserving the same accuracy as single-level methods. More precisely, multilevel versions of consensus-based dynamics and of Langevin dynamics are developed, together with a multifidelity formulation of Stein Variational Gradient Descent (SVGD), a very popular Variational Inference algorithm.
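To make the setting concrete, the particle dynamics the abstract refers to can be illustrated by unadjusted Langevin dynamics: particles drawn from an initial distribution $\rho_0$ are evolved with drift $\nabla \log \pi$, so that their law approaches the target $\pi$. The sketch below is a hypothetical single-level illustration under a simple Gaussian target, not code from the thesis.

```python
# Minimal sketch (not the thesis's implementation): unadjusted Langevin
# dynamics via Euler-Maruyama discretisation of
#   dX_t = grad log pi(X_t) dt + sqrt(2) dW_t,
# whose stationary distribution is (approximately) the target pi.
import numpy as np

def langevin_sample(grad_log_pi, x0, step=0.01, n_steps=5000, seed=None):
    """Evolve a set of particles x0 (drawn from rho_0) towards pi."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * noise
    return x

# Example target pi = N(2, 1), so grad log pi(x) = -(x - 2);
# rho_0 is a point mass at 0, represented by 1000 particles.
particles = langevin_sample(lambda x: -(x - 2.0), x0=np.zeros(1000), seed=0)
```

A multilevel variant, as proposed in the thesis, would couple such chains across discretisation levels (e.g. different `step` sizes) to reduce the cost of reaching a given accuracy; the snippet above only shows the single-level building block.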
| File | Description | Access | Size | Format |
|---|---|---|---|---|
| Giacomo_Mossinelli_executive_summary.pdf | executive summary | authorised users only, from 18/09/2025 | 1.72 MB | Adobe PDF |
| Giacomo_Mossinelli_thesis_polimi.pdf | thesis | not accessible | 8.93 MB | Adobe PDF |
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/227573