Local entropy, a selected topic in the physics of deep learning

Description
Join Zoom Meeting: https://zoom.us/j/475819702
Meeting ID: 475-819-702
If you haven't registered for previous QLS webinars, please contact qls@ictp.it by Friday, 16 July to obtain the PASSWORD for this Zoom meeting.

Insights and methods from physics have long helped to improve our understanding and design of neural networks. The interdisciplinary potential of techniques borrowed from statistical physics and information theory, such as entropic regularizations and renormalization, has not yet been exhausted in machine learning. After a non-technical overview of this panorama, the talk delves into a specific line of research based on local entropy. Local entropy is a refinement of standard entropy and defines a coarse-graining technique helpful in understanding and improving deep neural networks. It admits further refinements, such as anisotropy in synaptic space and time dependence, and thus offers a versatile framework in which to study architecture-aware regularization procedures. Moreover, a scheduling protocol in which the regularization is strong in the early stages of training and then fades progressively away provides an alternative to standard initialization procedures. We conclude with some future perspectives, mainly related to spontaneously/explicitly broken scale invariance and hints of spacetime geometry in deep networks.
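Since the abstract gives no algorithmic details, here is a minimal, hypothetical sketch in the spirit of the Entropy-SGD algorithm of Chaudhari et al., which performs gradient descent on a local-entropy-smoothed loss. The toy loss, the Langevin estimator of the smoothed gradient, and the fading schedule for the coupling gamma are all illustrative assumptions, not the speaker's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(w):
    # Toy 1-D loss with both sharp and wide minima (stand-in for a training loss).
    return 0.5 * w ** 2 + 0.3 * np.cos(8.0 * w)

def grad_f(w):
    return w - 2.4 * np.sin(8.0 * w)

def local_entropy_grad(w, gamma, inner_steps=30, inner_eta=0.01, temp=1e-3):
    # The local entropy F(w) = -log \int exp(-f(w') - gamma/2 (w - w')^2) dw'
    # has gradient gamma * (w - <w'>), where <w'> is the mean of the tilted measure.
    # Estimate <w'> with a short Langevin chain started at w (as in Entropy-SGD).
    wp, mean = w, w
    for _ in range(inner_steps):
        g = grad_f(wp) + gamma * (wp - w)  # gradient of the tilted potential
        wp = wp - inner_eta * g + np.sqrt(2.0 * inner_eta * temp) * rng.standard_normal()
        mean = 0.75 * mean + 0.25 * wp     # running average of the samples
    return gamma * (w - mean)

w, eta = 2.2, 0.05
for t in range(300):
    # Assumed schedule: small gamma early (wide coarse-graining, strong
    # regularization), growing later, so the smoothing fades progressively away.
    gamma = min(0.5 * 1.05 ** t, 20.0)
    w -= eta * local_entropy_grad(w, gamma)
print(f"final w = {w:.3f}, f(w) = {f(w):.3f}")
```

As gamma grows, the Gaussian coarse-graining narrows and the update approaches plain gradient descent on f, which is one plausible reading of a regularization that is strong early in training and then fades away.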