Training nonlinear parametrizations such as deep neural networks to nume...
Energy-based models (EBMs) are generative models inspired by statistical...
We introduce a class of generative models based on the stochastic interp...
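As background for the entry above (the abstract is truncated, so this is my gloss, not the paper's notation): a stochastic interpolant connects two densities $\rho_0$ and $\rho_1$ through a time-indexed family of random variables, for instance

$$
x_t = \alpha(t)\,x_0 + \beta(t)\,x_1, \qquad x_0 \sim \rho_0,\;\; x_1 \sim \rho_1,
$$

with boundary conditions $\alpha(0)=\beta(1)=1$ and $\alpha(1)=\beta(0)=0$, so that the law of $x_t$ matches $\rho_0$ at $t=0$ and $\rho_1$ at $t=1$.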
To understand the training dynamics of neural networks (NNs), prior stud...
A simple generative model based on a continuous-time normalizing flow be...
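A continuous-time normalizing flow, as mentioned in the entry above, generates samples by pushing base noise through an ODE $\dot x = v(x,t)$. The sketch below uses a hand-picked linear velocity field as a stand-in for the trained network (the velocity, the target mean `mu`, and the Euler discretization are my illustrative assumptions, not the paper's construction):

```python
import numpy as np

def flow_sample(velocity, x0, t0=0.0, t1=1.0, n_steps=100):
    """Push a base sample x0 through the ODE dx/dt = velocity(x, t) by Euler steps."""
    x, dt = np.array(x0, dtype=float), (t1 - t0) / n_steps
    for i in range(n_steps):
        x = x + dt * velocity(x, t0 + i * dt)
    return x

# Stand-in velocity that contracts samples toward a target mean mu = 2;
# in practice velocity(x, t) would be a trained neural network.
mu = 2.0
velocity = lambda x, t: mu - x
rng = np.random.default_rng(0)
z = rng.standard_normal(1000)                      # base (Gaussian) samples
x1 = np.array([flow_sample(velocity, zi) for zi in z])
print(x1.mean(), x1.std())
```

For this linear field the ODE is solvable in closed form, $x(1) = \mu + (x_0-\mu)e^{-1}$, which gives a quick sanity check on the Euler integrator.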
It is widely believed that the success of deep networks lies in their ab...
Many applications in computational sciences and statistical inference re...
The method of choice for integrating the time-dependent Fokker-Planck eq...
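For reference, the time-dependent Fokker-Planck equation discussed in the entry above has the standard form (my notation, since the abstract is truncated): for a density $\rho(x,t)$ evolving under drift $b(x,t)$ and diffusion coefficient $D$,

$$
\partial_t \rho(x,t) = -\nabla \cdot \big(b(x,t)\,\rho(x,t)\big) + \nabla \cdot \big(D\,\nabla \rho(x,t)\big).
$$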
We study the optimization of wide neural networks (NNs) via gradient flo...
Machine learning methods have been shown to give accurate predictions in...
Given a distribution of earthquake-induced seafloor elevations, we prese...
Normalizing flows can generate complex target distributions and thus sho...
Energy-based models (EBMs) are generative models that are usually traine...
Energy-based models (EBMs) are a simple yet powerful framework for gener...
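Several entries above concern energy-based models, which define $p(x) \propto e^{-E(x)}$ through an energy function $E$. A common sampling primitive for such models is unadjusted Langevin dynamics; the quadratic energy below is a stand-in (the step size, chain length, and energy are my illustrative choices, not any paper's setup):

```python
import numpy as np

def langevin_sample(grad_E, x0, step=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin dynamics: x <- x - step * grad_E(x) + sqrt(2 * step) * noise."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_E(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Stand-in energy E(x) = |x|^2 / 2, so the target p(x) is a standard Gaussian.
grad_E = lambda x: x
samples = np.array([langevin_sample(grad_E, np.zeros(2), rng=i) for i in range(200)])
print(samples.mean(), samples.std())
```

With the Gaussian target the chain's stationary mean and standard deviation should be close to 0 and 1, which makes the sampler easy to check.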
Recent theoretical work has characterized the dynamics of wide shallow n...
Deep neural networks, when optimized with sufficient data, provide accur...
We propose and compare methods for the estimation of extreme event proba...
We study the dynamics of optimization and the generalization properties ...
Neural networks with a large number of parameters admit a mean-field des...
We investigate the theoretical foundations of the simulated tempering me...
Neural networks, a central tool in machine learning, have demonstrated r...
A variant of the parallel tempering method is proposed in terms of a sto...