Neuroscience and Machine Learning Seminars - Alberto Pezzotta

Speakers
16:00 - 17:00
Room 3-E4-SR03 (Röntgen)
Wohlfart

[Abstract: The brain is a very noisy place: when a spike arrives at a presynaptic terminal, the neurotransmitter fails to be released between 20% and 80% of the time. These failures appear to be random and independent across synapses. Given the energetic cost of generating and propagating action potentials, and the importance of information transmission across synapses, this seems both wasteful and inefficient. However, synaptic noise arising from variable transmission can, under certain restricted conditions, improve information transmission.
In Artificial Neural Networks (ANNs), noise can be beneficial during training, where it is introduced with the aim of achieving better robustness and generalisation. One such strategy, called DropConnect, a variant of Dropout, can be thought of as an ANN implementation of synaptic failures: weights are randomly “dropped” from the network during training to prevent excessive co-adaptation, which in turn improves generalisability. In machine learning applications, this noise is removed after training; we are interested in the effect of this type of noise on brain computations, so we analyse both training and test performance in the presence of failures.
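The DropConnect-style failure mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration under my own assumptions (function and variable names are mine, not the speaker's): each weight is independently zeroed with probability `p_fail` during training, with a rescaling so the expected drive matches the noise-free network, and the mask is removed at test time.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_forward(x, W, p_fail=0.5, train=True):
    """One linear layer with DropConnect-style synaptic failures.

    During training, each weight is independently dropped (set to 0) with
    probability p_fail, mimicking stochastic release failure at individual
    synapses; the surviving weights are rescaled so the mean output matches
    the noise-free layer. At test time the noise is removed.
    """
    if train:
        mask = rng.random(W.shape) >= p_fail    # 1 = transmission succeeds
        W_eff = W * mask / (1.0 - p_fail)       # rescale to preserve the mean
    else:
        W_eff = W
    return W_eff @ x

N = 100
x = rng.standard_normal(N)
W = rng.standard_normal((10, N)) / np.sqrt(N)   # standard 1/sqrt(N) scaling
y_train = dropconnect_forward(x, W, p_fail=0.5, train=True)  # noisy output
y_test = dropconnect_forward(x, W, train=False)              # clean output
```

Averaged over many independent failure masks, the noisy output converges to the clean one, but any single pass carries substantial multiplicative noise.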

In machine learning, a typical initialisation strategy for ANN weights is to sample them from a distribution with mean 0 and standard deviation ~1/sqrt(N), where N is the number of input neurons. This scaling ensures that the level of activity in the network is independent of N, but upon synaptic failure, this variability results in noise that substantially corrupts the signal, degrading performance.
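A quick numerical check of this point (a sketch under my own assumptions, not the speaker's analysis): with 1/sqrt(N) weights, the failure-induced variance of a single unit's output stays of order one, i.e. comparable to the signal, no matter how large N is, so the noise does not average away with network size.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_noise_variance(N, p_fail=0.5, trials=2000):
    """Empirical variance of one linear unit's output over independent
    failure masks, with weights initialised at standard deviation 1/sqrt(N)."""
    w = rng.standard_normal(N) / np.sqrt(N)     # 1/sqrt(N) initialisation
    x = rng.standard_normal(N)
    outs = np.array([
        (w * (rng.random(N) >= p_fail)) @ x / (1.0 - p_fail)
        for _ in range(trials)
    ])
    return outs.var()

# The failure noise is O(1) regardless of N -- it never becomes negligible
# relative to the O(1) signal.
for N in (100, 1000):
    print(N, failure_noise_variance(N))
```

Analytically, with failure probability p the noise variance is (p/(1-p)) * sum_i w_i^2 x_i^2, which for 1/sqrt(N) weights stays of order one as N grows.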

However, I will show preliminary results on shallow linear networks suggesting that during learning the noise induced by failures forces the weights to acquire a low-rank structure and to shrink in magnitude. I will show how these two properties together optimise the signal-to-noise ratio and improve performance in a feed-forward neural network.]
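To give intuition for why structured, small weights help, here is a toy comparison (entirely my own illustration, not the speaker's result): two weight vectors produce the same mean output, but the one aligned with the input direction, which has a much smaller norm, incurs far less failure-induced noise and hence a much higher signal-to-noise ratio.

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 1000, 0.5

x = rng.standard_normal(N)      # fixed input pattern
target = 1.0                    # desired mean output w @ x

def failure_noise_var(w, x, p):
    """Exact variance of (w * mask) @ x / (1-p) over independent
    Bernoulli failure masks with failure probability p."""
    return (p / (1 - p)) * np.sum(w**2 * x**2)

# Structured weights: aligned with the input, minimal norm for this output
w_aligned = target * x / (x @ x)

# Unstructured weights: random direction, rescaled to the same mean output
w_rand = rng.standard_normal(N)
w_rand *= target / (w_rand @ x)

snr_aligned = target**2 / failure_noise_var(w_aligned, x, p)
snr_rand = target**2 / failure_noise_var(w_rand, x, p)
print(snr_aligned, snr_rand)    # aligned weights win by a wide margin
```

Both vectors transmit the same signal, but the aligned (low-norm, effectively rank-one) solution concentrates its weight where it matters, shrinking the failure noise by roughly a factor of N.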


IMPORTANT: A coffee break will be offered to all participants before the beginning of the seminar.