We present a meta-method for initializing (seeding) the k-means clustering ...
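The entry above names a seeding method for k-means, but its description is cut off. For orientation only, here is a minimal sketch of the standard k-means++ seeding heuristic that initialization schemes of this kind are usually compared against; it is a generic baseline, not the meta-method the paper proposes, and the function name is illustrative.

import numpy as np

def kmeans_pp_seed(X, k, rng=None):
    # Standard k-means++ seeding (Arthur & Vassilvitskii, 2007):
    # pick the first center uniformly at random, then each further
    # center with probability proportional to the squared distance
    # to its nearest already-chosen center.
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]
    for _ in range(1, k):
        d2 = np.min(((X[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.asarray(centers)

The returned centers would then be refined by ordinary Lloyd iterations; the paper's contribution concerns how such seeds are chosen in the first place.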
We systematize the approach to the investigation of deep neural network ...
We apply digitized Quantum Annealing (QA) and Quantum Approximate Optimization ...
We analyze the connection between minimizers with good generalizing properties ...
Simulated Annealing is the crowning glory of Markov Chain Monte Carlo Methods ...
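As background for the entry above, the following is a textbook Metropolis-style simulated annealing loop on a generic energy function with a geometric cooling schedule; it illustrates the basic MCMC mechanism only and is not the specific scheme analyzed in the paper.

import math, random

def simulated_annealing(energy, propose, x0, T0=1.0, cooling=0.995, steps=10000):
    # energy(x): scalar cost of state x; propose(x): random neighbor of x.
    x, e = x0, energy(x0)
    best, best_e = x, e
    T = T0
    for _ in range(steps):
        x_new = propose(x)
        e_new = energy(x_new)
        # Metropolis rule: accept downhill moves always, and uphill moves
        # with probability exp(-(e_new - e) / T).
        if e_new <= e or random.random() < math.exp(-(e_new - e) / T):
            x, e = x_new, e_new
            if e < best_e:
                best, best_e = x, e
        T *= cooling  # geometric cooling
    return best, best_e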
The properties of flat minima in the empirical risk landscape of neural ...
We introduce an algorithmic decision process for multialternative choice...
The geometrical features of the (non-convex) loss landscape of neural networks ...
Generative processes in biology and other fields often produce data that...
Rectified Linear Units (ReLU) have become the main model for the neural ...
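For reference, the ReLU activation mentioned above is simply max(0, x) applied elementwise; a minimal one-hidden-layer forward pass with ReLU units is sketched below (illustrative only, not the specific architecture studied in the paper).

import numpy as np

def relu(z):
    # Rectified Linear Unit: elementwise max(0, z).
    return np.maximum(z, 0.0)

def forward(x, W1, b1, W2, b2):
    # One hidden layer of ReLU units followed by a linear readout.
    h = relu(x @ W1 + b1)
    return h @ W2 + b2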
Learning in Deep Neural Networks (DNN) takes place by minimizing a non-convex ...
We present a heuristic algorithm, called recombinator-k-means, that can ...
Stochasticity and limited precision of synaptic weights in neural networks ...
We propose a new algorithm called Parle for parallel training of deep networks ...
Quantum annealers aim at solving non-convex optimization problems by exploiting ...
This paper proposes a new optimization algorithm called Entropy-SGD for ...
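Entropy-SGD, named in the entry above, replaces the training loss with a "local entropy" objective that favors wide, flat valleys. The sketch below follows the general recipe of the published algorithm: an inner Langevin (SGLD) loop estimates the mean of a Gibbs measure centered at the current weights, and the outer update moves the weights toward that mean. The step sizes, the scope parameter gamma, and the noise scale here are placeholders, so treat this as an illustration of the idea rather than a reference implementation.

import numpy as np

def entropy_sgd_step(w, grad_loss, gamma=1e-3, eta=0.1, eta_inner=0.01,
                     L=20, eps=1e-4, rng=None):
    # One outer step. grad_loss(x) returns a stochastic gradient of the
    # underlying loss at x; gamma is the coupling of the Gaussian that
    # defines the local entropy; L is the number of inner SGLD steps.
    rng = np.random.default_rng(rng)
    x, mu = w.copy(), w.copy()
    alpha = 0.75  # exponential moving average of the inner iterates
    for _ in range(L):
        # SGLD on the modified energy f(x) + gamma/2 * ||x - w||^2
        dx = grad_loss(x) + gamma * (x - w)
        x = x - eta_inner * dx + np.sqrt(eta_inner) * eps * rng.standard_normal(x.shape)
        mu = (1 - alpha) * mu + alpha * x
    # The gradient of the (negative) local entropy is gamma * (w - mu).
    return w - eta * gamma * (w - mu)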
In artificial neural networks, learning from data is a computationally demanding ...
Learning in neural networks poses peculiar challenges when using discrete ...
We introduce a novel Entropy-driven Monte Carlo (EdMC) strategy to efficiently ...
We show that discrete synaptic weights can be efficiently used for learning ...
We present an efficient learning algorithm for the problem of training n...