Scientific Calendar Event

Starts 13 Nov 2017 11:30
Ends 13 Nov 2017 12:30
Central European Time
ICTP, Central Area, 2nd floor, old SISSA building
Via Beirut
Stochasticity and limited precision of synaptic weights in neural network models are key aspects in both biological and hardware modeling of learning processes.
I'll introduce a neural network model with stochastic binary weights that naturally gives prominence to exponentially rare dense regions of solutions.
Solutions in these dense regions have desirable properties such as robustness and good generalization performance, whereas typical solutions are isolated and hard to find.
Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses.
I'll present analytical and numerical results on the perceptron problem, along with algorithmic extensions aimed at training discrete deep neural networks.
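
To make the gradient-descent procedure concrete, here is a minimal sketch for a toy binary perceptron: real parameters theta parametrize the probability of each binary synapse, plain gradient descent is run on the mean network m = tanh(theta), and the result is projected back onto binary weights. The tanh parametrization, teacher-student data, learning rate, and perceptron-style update are illustrative assumptions, not the actual algorithm of the references below.

import numpy as np

rng = np.random.default_rng(0)

# Toy teacher-student data for a +-1 perceptron (hypothetical setup).
N, P = 101, 60                       # input dimension, number of patterns
teacher = rng.choice([-1, 1], size=N)
X = rng.choice([-1, 1], size=(P, N))
y = np.sign(X @ teacher)             # N is odd, so the teacher output is never zero

# Real parameters theta define p(w_i = +1); the mean synapse is m_i = tanh(theta_i).
theta = 0.01 * rng.standard_normal(N)
lr = 0.1

for epoch in range(1000):
    m = np.tanh(theta)               # expected value of the stochastic binary weights
    margins = y * (X @ m)            # per-pattern stability of the mean network
    viol = margins < 0               # patterns misclassified by the mean network
    if not viol.any():
        break
    # Perceptron-style update on the expected pre-activations,
    # chained through dm/dtheta = 1 - m**2.
    grad = (y[viol, None] * X[viol]).sum(axis=0) * (1 - m**2)
    theta += lr * grad

w = np.sign(theta)                   # project onto binary synapses
print("train accuracy of the binarized net:", (np.sign(X @ w) == y).mean())

Descending on the continuous parameters rather than the binary weights themselves is what makes the optimization tractable; the final sign projection recovers a binary solution.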

References:
Baldassi, Gerace, Kappen, Lucibello, Saglietti, Tartaglione, and Zecchina.
On the role of synaptic stochasticity in training low-precision neural networks. arXiv:1710.09825

Baldassi, Borgs, Chayes, Ingrosso, Lucibello, Saglietti, and Zecchina.
Unreasonable effectiveness of learning neural networks: From accessible states and robust ensembles to basic algorithmic schemes. PNAS, 2016.

Baldassi, Ingrosso, Lucibello, Saglietti, and Zecchina.
Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses. PRL, 2015.