Alessandro Ingrosso
@ai_ngrosso
Theoretical neuroscientist and spin glass enthusiast. Assistant professor @Radboud_Uni.
Back from this workshop, wonderfully organized by F. Mastrogiuseppe, @APalmigiano, @ai_ngrosso & @sebastiangoldt, thank you! Long 90-minute (chalk) talks powered some of the most meaningful scientific exchanges I've ever had. I hope to keep contributing to this community!
Have a look at this English-language master's program in Neurophysics: study the brain, artificial neural networks and complex systems through the lens of mathematical modeling and physics. Radboud University / Donders (Netherlands) is one of Europe's leading universities in Neuroscience. 1/3
Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (@fgerace_) and Parma (P. Rotondo, @rosalbapacelli). journals.aps.org/prl/abstract/1…
Biophysics, Stat Mech and Machine Learning will meet in Trento from July 7th to 11th, 2025 in our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI": indico.ectstar.eu/event/252/. Co-organized with @r_potestio lab.

New paper with @ai_ngrosso @VITAGroupUT @sebastiangoldt, "On How Iterative Magnitude Pruning Discovers Local Receptive Fields in Fully Connected Neural Networks", accepted at the Conference on Parsimony and Learning (@CPALconf). arxiv.org/abs/2412.06545 1/
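(For intuition, here is a bare-bones sketch of the IMP loop the title refers to: train, prune the smallest-magnitude weights, rewind the survivors to their initial values, and repeat. The toy linear regression, pruning rate and step sizes are illustrative assumptions, not the paper's fully connected setup.)

    import numpy as np

    # Iterative magnitude pruning on a toy linear regression:
    # train, prune the 20% smallest surviving weights, rewind the
    # survivors to their initial values, and repeat.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    y = X @ (rng.normal(size=50) * (rng.random(50) < 0.2))  # sparse teacher

    w0 = 0.1 * rng.normal(size=50)       # stored init, used for rewinding
    mask = np.ones(50, dtype=bool)

    for _ in range(5):                   # pruning rounds
        w = w0 * mask                    # rewind surviving weights
        for _ in range(200):             # plain gradient descent
            w -= 0.05 * (X.T @ (X @ w - y) / len(y)) * mask
        alive = np.flatnonzero(mask)
        k = max(1, int(0.2 * alive.size))
        prune = alive[np.argsort(np.abs(w[alive]))[:k]]
        mask[prune] = False              # smallest magnitudes go

    print(f"{mask.sum()}/50 weights survive")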
Our paper on Wang-Landau sampling in neural networks is now published in TMLR. Here's a thread by @r_potestio.
🥳 We are pleased to announce the publication of our paper “Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space” in Trans. on Machine Learning Research. openreview.net/forum?id=BLDtW… @MeleMargherita_ @ai_ngrosso @UniTrento @INFN_ @DondersInst
New paper with @leonlufkin and @ermgrant! Why do we see localized receptive fields so often, even in models without sparsity regularization? We present a theory in the minimal setting from @ai_ngrosso and @sebastiangoldt.
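(To play with the ingredients of that minimal setting: the sketch below generates translation-invariant inputs with non-Gaussian marginals, the kind of higher-order structure this line of work identifies as the driver of localization, plus an inverse participation ratio to quantify how localized a weight profile is. The ring covariance, gain and IPR are illustrative choices, not the paper's exact model.)

    import numpy as np

    # Translation-invariant inputs on a ring: a Gaussian field z with
    # circulant covariance, pushed through a saturating nonlinearity to
    # make the marginals non-Gaussian (higher-order input statistics).
    rng = np.random.default_rng(0)
    D, xi, gain = 100, 3.0, 100.0
    d = np.minimum(np.arange(D), D - np.arange(D))       # ring distances
    row = np.exp(-d**2 / (2 * xi**2))                    # covariance row
    Sigma = np.array([np.roll(row, i) for i in range(D)])
    L = np.linalg.cholesky(Sigma + 1e-6 * np.eye(D))
    z = L @ rng.normal(size=(D, 10000))                  # Gaussian inputs
    x = np.tanh(gain * z)                                # non-Gaussian inputs

    def ipr(w):
        """Inverse participation ratio: ~1/D if w is spread out, ~1 if
        it concentrates on a few pixels (a localized receptive field)."""
        p = w**2 / np.sum(w**2)
        return np.sum(p**2)

    kurt = np.mean(x[0]**4) / np.mean(x[0]**2)**2 - 3
    print(f"excess kurtosis of inputs: {kurt:.2f} (0 for a Gaussian)")
    print(f"IPR, flat profile:   {ipr(np.ones(D)):.3f}")
    print(f"IPR, localized bump: {ipr(np.exp(-d**2 / 4)):.3f}")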
We’re excited to share our paper analyzing how data drives the emergence of localized receptive fields in neural networks! w/ @SaxeLab @ermgrant Come see our #NeurIPS2024 spotlight poster today at 4:30–7:30 in the East Hall! Paper: openreview.net/forum?id=nw9Jm…
In case you missed it at the #NeurIPS2024 posters: work led by @LeonLufkin on the analytical dynamics of localization in simple neural nets, as seen in real and artificial nets and distilled by @ai_ngrosso @sebastiangoldt. Leon is a fantastic collaborator and is looking for PhD positions!
Giving a talk on Stat Mech of Transfer Learning at 3pm EST today at the DeepMath conference in Philadelphia. Here's a link to the live stream:
The yearly tradition is already here. Watch DeepMath 2024 live here: youtube.com/watch?v=A0Oh5s…
Why does #compneuro need new learning methods? ANN models are usually trained with Gradient Descent (GD), which violates biological realities such as Dale's law and the log-normal distribution of synaptic weights. Here we describe a learning algorithm better suited to comp neuro: Exponentiated Gradient (EG)! 1/12
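(The core contrast in one toy snippet: GD updates weights additively, so signs can flip; EG updates them multiplicatively, so signs are preserved, consistent with Dale's law, and log-weights move additively, favoring log-normal-like magnitudes. Step size and numbers below are illustrative, not from the paper.)

    import numpy as np

    # GD adds to a weight, so it can flip its sign; EG multiplies it by
    # exp(-lr * grad), so the sign is preserved (Dale's law) and the
    # log-weight moves additively, favoring log-normal-like magnitudes.
    def gd_step(w, grad, lr=0.5):
        return w - lr * grad

    def eg_step(w, grad, lr=0.5):
        return w * np.exp(-lr * grad)

    rng = np.random.default_rng(0)
    w = rng.lognormal(mean=-1.0, size=5)   # positive (excitatory) weights
    g = rng.normal(size=5)                 # a toy loss gradient
    print(gd_step(w, g))                   # entries may turn negative
    print(eg_step(w, g))                   # entries stay positive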
Brain-like learning with exponentiated gradients biorxiv.org/cgi/content/sh… #biorxiv_neursci
Our work on Wang-Landau in neural network learning (a.k.a. Wang-Learnau) is now on arXiv. We use enhanced sampling to explore how the entire density of states of a loss function is affected by data structure. A collaboration with friends in Trento powered by @MeleMargherita_.
📢 PREPRINT ALERT! 📢 “Density of states in neural networks: an in-depth exploration of learning in parameter space”, by M. Mele, R. Menichetti, A. Ingrosso, R. Potestio arxiv.org/abs/2409.18683 @ai_ngrosso @MeleMargherita_ @UniTrento @Radboud_Uni
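(A bare-bones sketch of the flat-histogram idea behind Wang-Landau, here on a toy binary-weight perceptron rather than the networks in the paper; the 0/1 loss, the binning and the flatness schedule are illustrative assumptions.)

    import numpy as np

    # Wang-Landau: bias moves by the running estimate of the density of
    # states g(E) so all loss levels are visited equally, refining g(E)
    # by a factor f that shrinks whenever the energy histogram is flat.
    rng = np.random.default_rng(0)
    N = 20
    X = rng.normal(size=(100, N))
    teacher = rng.choice([-1.0, 1.0], size=N)
    y = np.sign(X @ teacher)

    def loss(w):                          # 0/1 training error in [0, 1]
        return np.mean(np.sign(X @ w) != y)

    log_g = np.zeros(20)                  # running log g(E), 20 bins
    hist = np.zeros(20)
    log_f = 1.0                           # modification factor

    def b(E):                             # bin index of energy E
        return min(int(E * 20), 19)

    w = rng.choice([-1.0, 1.0], size=N)
    E = loss(w)
    for step in range(100_000):
        i = rng.integers(N)
        w[i] *= -1                        # propose a single weight flip
        E_new = loss(w)
        if np.log(rng.random()) < log_g[b(E)] - log_g[b(E_new)]:
            E = E_new                     # accept with prob g(E)/g(E')
        else:
            w[i] *= -1                    # reject: undo the flip
        log_g[b(E)] += log_f
        hist[b(E)] += 1
        visited = hist[hist > 0]
        if step % 10_000 == 9_999 and visited.min() > 0.8 * visited.mean():
            log_f /= 2                    # histogram flat enough: refine
            hist[:] = 0

    # log_g now estimates the log-density of states up to a constant.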
This mini-review is based on Hugo Cui's PhD thesis: arxiv.org/abs/2409.13904 . My advice to him was: "Write something you would have loved to have when you started your PhD!" He did an outstanding job introducing the rich methods he developed. Enjoy and share widely!
#RockinAI #Roccella Day 4 invited speakers: Alessandro Laio, Sebastian Goldt, Pietro Rotondo, Alessandro Ingrosso
I'll be leaving ICTP in September to start as Assistant Professor at @DondersInst, @Radboud_Uni in Nijmegen. Students interested in pursuing a PhD at the interface of Machine Learning, Neuroscience and Statistical Mechanics, don't hesitate to contact me.