## Carlo Baldassi

I graduated in Trieste in Theoretical Physics with a thesis on applications to Computer Science and large-scale discrete optimization problems. I then moved to Turin, where I did a PhD in Computational Neuroscience and also became interested in Bioinformatics and inference problems. Throughout, I kept an eye on theoretical aspects of Machine Learning and Neural Networks, which have lately become my main focus.

My research interests center on the application of Statistical Mechanics to machine learning and computational neuroscience, and more generally to large-scale inference and optimization problems. Lately I have been particularly interested in studying the loss landscape of neural networks, both analytically and numerically.

## Unreasonable effectiveness of learning neural networks: from accessible states and robust ensembles to basic algorithmic schemes

Proceedings of the National Academy of Sciences of the United States of America, 2016

## Entropy-SGD: biasing gradient descent into wide valleys

Journal of Statistical Mechanics: Theory and Experiment, 2019

## Role of synaptic stochasticity in training low-precision neural networks

Physical Review Letters, 2018

## Efficiency of quantum vs. classical annealing in nonconvex learning problems

Proceedings of the National Academy of Sciences of the United States of America, 2018

## Shaping the learning landscape in neural networks around wide flat minima

Proceedings of the National Academy of Sciences of the United States of America, 2020

I teach and have taught a number of programming courses, from the undergraduate to the PhD level, using Python and Julia.