Neural networks trained on distilled data often produce over-confident o...
Climate change is the defining issue of our time, and we are at a defini...
Bayesian inference provides a principled framework for learning from com...
Long-tailed classification poses a challenge due to its heavy imbalance ...
Gradients have been exploited in proposal distributions to accelerate th...
Although sparse training has been successfully used in various resource-...
Despite impressive performance on a wide variety of tasks, deep neural n...
Sampling methods, as important inference and learning techniques, are ty...
We propose discrete Langevin proposal (DLP), a simple and scalable gradi...
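For reference, the coordinate-wise form in which a discrete Langevin-style proposal is usually written is sketched below; the target log-probability f, step size \alpha, and dimension D are notation assumed here rather than taken from the truncated snippet:

    q(\theta' \mid \theta) \;=\; \prod_{d=1}^{D} \mathrm{Categorical}\!\left(\theta'_d;\ \mathrm{softmax}\Big(\tfrac{1}{2}\,\nabla f(\theta)_d\,(\theta'_d - \theta_d) \;-\; \tfrac{(\theta'_d - \theta_d)^2}{2\alpha}\Big)\right)

Because the proposal factorizes over coordinates, all D categorical draws can be made in parallel from a single gradient evaluation, which is where the scalability comes from.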
While low-precision optimization has been widely used to accelerate deep...
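For context on what low-precision arithmetic usually involves: values are kept on a grid with spacing \Delta, and an unbiased stochastic-rounding quantizer is a common building block. The sketch below is generic; Q_\Delta and \Delta are notation assumed here, not taken from the snippet:

    Q_\Delta(x) \;=\; \begin{cases} \Delta\lfloor x/\Delta\rfloor & \text{with probability } 1 - \big(x/\Delta - \lfloor x/\Delta\rfloor\big)\\ \Delta\lfloor x/\Delta\rfloor + \Delta & \text{with probability } x/\Delta - \lfloor x/\Delta\rfloor \end{cases} \qquad\text{so that}\qquad \mathbb{E}\big[Q_\Delta(x)\big] = x.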
Variational inference (VI) plays an essential role in approximate Bayesi...
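Concretely, VI turns posterior inference into optimization by maximizing the evidence lower bound (ELBO) over a tractable family q(z); the identity below is the standard one, with notation assumed here:

    \log p(x) \;\ge\; \mathrm{ELBO}(q) \;=\; \mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big] \;=\; \log p(x) - \mathrm{KL}\big(q(z)\,\big\|\,p(z \mid x)\big).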
Metropolis-Hastings (MH) is a commonly-used MCMC algorithm, but it can b...
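For readers who want the mechanics behind the entry above, here is a minimal sketch of the standard MH loop; log_prob, propose, and log_q are hypothetical user-supplied callables, not an API from any particular paper or library.

    import math
    import random

    def metropolis_hastings(log_prob, propose, log_q, x0, n_steps):
        # Standard Metropolis-Hastings: propose y ~ q(. | x) and accept with
        # probability min(1, pi(y) q(x | y) / (pi(x) q(y | x))).
        # Convention: log_q(a, b) returns log q(a | b).
        x = x0
        samples = []
        for _ in range(n_steps):
            y = propose(x)
            log_alpha = (log_prob(y) - log_prob(x)
                         + log_q(x, y) - log_q(y, x))
            if random.random() < math.exp(min(0.0, log_alpha)):
                x = y          # accept the candidate
            samples.append(x)  # on rejection the current state is repeated
        return samples

Note that each step evaluates log_prob on the full target, which is the usual cost concern when the target involves a large dataset.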
Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is an efficient meth...
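The discretized update usually meant by SGHMC (with an identity mass matrix) is sketched below; \eta is a step size, \alpha a friction coefficient, \nabla\tilde{U} a stochastic gradient of the negative log-posterior, and \hat{\beta} an estimate of the gradient-noise scale, all of which are notation assumed here:

    \theta_{t+1} = \theta_t + v_t, \qquad v_{t+1} = v_t - \eta\,\nabla\tilde{U}(\theta_t) - \alpha\, v_t + \mathcal{N}\!\big(0,\ 2(\alpha - \hat{\beta})\,\eta\big).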
Gibbs sampling is a Markov chain Monte Carlo method that is often used f...
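As a concrete reference for the entry above, a systematic-scan Gibbs sampler simply cycles through the full conditionals; conditionals[i] below is a hypothetical callable that draws coordinate i given the current values of all the others.

    def gibbs_sampler(conditionals, x0, n_steps):
        # Systematic-scan Gibbs sampling: resample each coordinate in turn
        # from its full conditional distribution given the rest of the state.
        x = list(x0)
        samples = []
        for _ in range(n_steps):
            for i, sample_coord in enumerate(conditionals):
                x[i] = sample_coord(x)  # draw x_i | x_{-i}
            samples.append(list(x))
        return samples

Every coordinate update leaves the target distribution invariant and needs no accept/reject step, which is why Gibbs sampling is attractive whenever the full conditionals are easy to sample.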
The posteriors over neural network weights are high dimensional and mult...