by Mohammad Reza Karimi, Ya-Ping Hsieh, Andreas Krause
Abstract:
Interacting particle systems have proven highly successful in various machine learning tasks, including approximate Bayesian inference and neural network optimization. However, the analysis of these systems often relies on the simplifying assumption of the mean-field limit, where particle numbers approach infinity and infinitesimal step sizes are used. In practice, discrete time steps, finite particle numbers, and complex integration schemes are employed, creating a theoretical gap between continuous-time and discrete-time processes. In this paper, we present a novel framework that establishes a precise connection between these discrete-time schemes and their corresponding mean-field limits in terms of convergence properties and asymptotic behavior. By adopting a dynamical system perspective, our framework seamlessly integrates various numerical schemes that are typically analyzed independently. For example, our framework provides a unified treatment of optimizing an infinite-width two-layer neural network and sampling via Stein Variational Gradient Descent, which were previously studied in isolation.
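To make the discrete-time, finite-particle regime concrete, the following is a minimal illustrative sketch (not code from the paper) of one Stein Variational Gradient Descent update with a finite number of particles and a fixed step size; the function name `svgd_step`, the RBF bandwidth `h`, and the step size `eps` are assumptions chosen for the example.

```python
import numpy as np

def svgd_step(x, grad_log_p, eps=0.1, h=1.0):
    """One discrete-time SVGD update for n particles (rows of x).

    This is exactly the practical regime the abstract contrasts with the
    mean-field limit: finitely many particles and a non-infinitesimal step.
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]        # diffs[i, j] = x_i - x_j, shape (n, n, d)
    sq = np.sum(diffs**2, axis=-1)               # pairwise squared distances, (n, n)
    k = np.exp(-sq / (2 * h))                    # RBF kernel matrix, (n, n)
    # Driving term: kernel-weighted average of the score at each particle.
    drive = k @ grad_log_p(x)                    # (n, d)
    # Repulsive term from the kernel gradient keeps particles spread out.
    repulse = np.sum(k[:, :, None] * diffs, axis=1) / h  # (n, d)
    return x + eps * (drive + repulse) / n

# Sampling from a 1-D Gaussian N(2, 1): the score is -(x - 2).
rng = np.random.default_rng(0)
particles = rng.standard_normal((50, 1))
for _ in range(1000):
    particles = svgd_step(particles, lambda x: -(x - 2.0))
```

Iterating the update moves the empirical particle distribution toward the target; the paper's framework addresses how such discrete iterations relate to the continuous-time mean-field dynamics they approximate.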
Reference:
Stochastic Approximation Algorithms for Systems of Interacting Particles. M. R. Karimi, Y. P. Hsieh, A. Krause. In Proc. of the Thirty-Seventh Conference on Neural Information Processing Systems (NeurIPS), 2023.
Bibtex Entry:
@inproceedings{karimi2022interacting,
	abstract = {Interacting particle systems have proven highly successful in various machine learning tasks, including approximate Bayesian inference and neural network optimization. However, the analysis of these systems often relies on the simplifying assumption of the mean-field limit, where particle numbers approach infinity and infinitesimal step sizes are used. In practice, discrete time steps, finite particle numbers, and complex integration schemes are employed, creating a theoretical gap between continuous-time and discrete-time processes. In this paper, we present a novel framework that establishes a precise connection between these discrete-time schemes and their corresponding mean-field limits in terms of convergence properties and asymptotic behavior. By adopting a dynamical system perspective, our framework seamlessly integrates various numerical schemes that are typically analyzed independently. For example, our framework provides a unified treatment of optimizing an infinite-width two-layer neural network and sampling via Stein Variational Gradient Descent, which were previously studied in isolation.},
	author = {Karimi, Mohammad Reza and Hsieh, Ya-Ping and Krause, Andreas},
	booktitle = {Proc. of Thirty Seventh Conference on Neural Information Processing Systems (NeurIPS)},
	month = {December},
	title = {Stochastic Approximation Algorithms for Systems of Interacting Particles},
	year = {2023}}