Statistical Mechanics · Neural Networks · Interpretability
Bridging the gap between statistical physics and deep learning. We develop theoretical frameworks to understand, interpret, and explain the emergent behavior of neural networks through the lens of spin-glass theory and statistical mechanics.
Developing supervised and unsupervised Hebbian learning protocols for Hopfield networks and their generalizations, with analytical characterization via statistical mechanics.
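A minimal sketch of the unsupervised Hebbian storage rule at the root of this line of work (an illustration only, not our full protocol; the network size, pattern load, and 20% corruption level are arbitrary choices for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 200, 10                            # neurons, stored patterns
xi = rng.choice([-1, 1], size=(K, N))     # random binary (Rademacher) patterns

# Classical Hebbian couplings: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-interaction
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Retrieve pattern 0 from a cue with ~20% of spins flipped, via zero-temperature dynamics
sigma = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
for _ in range(20):
    sigma = np.where(J @ sigma >= 0, 1, -1)   # synchronous sign updates

print("overlap with stored pattern:", (sigma @ xi[0]) / N)  # Mattis magnetization, ~1 on success
```

Below the critical load the dynamics relaxes onto the stored pattern; supervised and structured variants of the protocol replace the plain outer-product rule above.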
Exploiting the replica trick to analyze the computational capabilities of neural networks, characterizing phase transitions and storage capacity in associative memories.
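The computation starts from the replica identity for the quenched free energy, sketched below in standard notation with ξ denoting the stored patterns:

```latex
\overline{\ln Z} \;=\; \lim_{n \to 0} \frac{\overline{Z^{\,n}} - 1}{n},
\qquad
Z(\beta;\xi) \;=\; \sum_{\{\sigma\}} e^{-\beta H(\sigma;\,\xi)},
```

where the overbar denotes the average over pattern realizations. Evaluating the moments at integer n and continuing to n → 0 yields, at the replica-symmetric level, the classical storage-capacity threshold α_c ≈ 0.138 for the standard Hopfield model.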
Statistical mechanics framework for Restricted Boltzmann Machines, investigating generative capabilities, hyper-parameter tuning, and replica-symmetry breaking phenomena.
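As a sketch of the model under study, consider a binary RBM with energy E(v, h) = −a·v − b·h − v·W·h, trained here with a single step of contrastive divergence (CD-1); sizes, learning rate, and initialization are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)
nv, nh = 6, 4                              # visible / hidden units
W = 0.01 * rng.standard_normal((nv, nh))   # weights
a, b = np.zeros(nv), np.zeros(nh)          # visible / hidden biases

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One sweep of block Gibbs sampling: visibles -> hiddens -> visibles."""
    ph = sigmoid(b + v @ W)                # P(h_j = 1 | v)
    h = (rng.random(nh) < ph).astype(float)
    pv = sigmoid(a + W @ h)                # P(v_i = 1 | h)
    return (rng.random(nv) < pv).astype(float), ph

# One CD-1 update on a single binary example
v0 = rng.integers(0, 2, nv).astype(float)
ph0 = sigmoid(b + v0 @ W)
v1, ph1 = gibbs_step(v0)
eta = 0.1
W += eta * (np.outer(v0, ph0) - np.outer(v1, ph1))   # positive phase - negative phase
```

The positive/negative-phase difference approximates the gradient of the log-likelihood, whose quenched average is exactly the object the replica analysis handles.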
Studying sleep-like mechanisms in artificial neural networks to optimize storage capacity and improve generalization from small datasets—toward Sustainable AI.
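A sketch of the kind of coupling studied in this line of work (notational conventions may differ in detail): "dreaming" for a time t interpolates between the Hebbian and the projection couplings,

```latex
J_{ij}(t) \;=\; \frac{1}{N} \sum_{\mu,\nu=1}^{K} \xi_i^{\mu}
\left( \frac{1+t}{\mathbb{1} + t\,C} \right)_{\mu\nu} \xi_j^{\nu},
\qquad
C_{\mu\nu} \;=\; \frac{1}{N} \sum_{i=1}^{N} \xi_i^{\mu} \xi_i^{\nu},
```

so that t = 0 recovers the Hebbian rule, while t → ∞ approaches the pseudo-inverse (projection) rule and the maximal storage load grows accordingly.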
Investigating dense neural networks with higher-order interactions, achieving sub-threshold pattern recognition and enhanced storage scaling with network size.
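Schematically (a p-spin sketch in one common normalization), the pairwise Hopfield Hamiltonian is replaced by higher-order monomials of the Mattis overlaps:

```latex
H_{p}(\sigma;\xi) \;=\; -\,\frac{N}{p!} \sum_{\mu=1}^{K} m_{\mu}^{\,p},
\qquad
m_{\mu} \;=\; \frac{1}{N} \sum_{i=1}^{N} \xi_i^{\mu} \sigma_i,
```

and the maximal number of storable patterns scales as K ∝ N^{p−1}, the enhanced scaling that also underlies sub-threshold (weak-signal) pattern recognition.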
Building bridges between cost functions in statistical mechanics and loss functions in machine learning, toward transparent and explainable neural architectures.
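The simplest instance of this dictionary, sketched here for a generic Boltzmann-Gibbs model: the cross-entropy loss on a dataset {σ^(m)} splits into an energy term and a free-energy term,

```latex
\mathcal{L} \;=\; -\frac{1}{M} \sum_{m=1}^{M} \ln P_{\beta}\!\left(\sigma^{(m)}\right)
\;=\; \beta \,\langle H \rangle_{\mathrm{data}} \;+\; \ln Z(\beta),
\qquad
P_{\beta}(\sigma) \;=\; \frac{e^{-\beta H(\sigma)}}{Z(\beta)},
```

so minimizing the machine-learning loss over the model parameters amounts to balancing the data energy against the partition function, a free-energy computation of precisely the kind statistical mechanics is built for.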
PhD · Co-Founder
Education: Sapienza Università di Roma
Restricted Boltzmann Machines, Hopfield networks, dreaming mechanisms, replica theory, Hebbian learning
PhD · Co-Founder
Education: Università del Salento
Spin glasses, dense associative neural networks, supervised learning, Guerra's interpolation techniques
arXiv:2405.03823
SSRN Preprint
Neural Networks, 2022
Physical Review Letters 124, 028301
We welcome collaborations with researchers in statistical physics, machine learning, and related fields. We are always open to discussing new ideas at the intersection of spin-glass theory and neural networks.