Memorization as Generalization in Physics-inspired Generative Models
Our daily experience shows that humans can acquire and manipulate the hidden structure of their environment to generate creative ideas and survive. Artificial machines can likewise learn the unknown distribution underlying a set of data points and use it to generate new examples. This capability, known as generalization, is usually contrasted with learning specific, point-wise examples from the training set, an ability called memorization. In this talk I will report recent results supporting a picture of generalization as a "thermal" version of memorization with respect to a fictitious learning temperature. Both biologically inspired systems (e.g., spin-glass-like neural networks) and artificial learning systems (e.g., diffusion models) will be analyzed through the lens of statistical mechanics.
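The "fictitious temperature" picture can be illustrated with a toy model that is not taken from the talk itself: a softmax-weighted attraction toward stored training patterns, as in dense associative (Hopfield-like) memories. At low temperature the weights collapse onto the nearest stored pattern (pure memorization); at high temperature they spread out and the attractor blends all patterns (a crude stand-in for generalization). The function name `softmax_mean` and all numerical values below are illustrative assumptions, not the speaker's model.

```python
import numpy as np

def softmax_mean(x, data, T):
    """Temperature-weighted mean of stored patterns.

    Weights each stored pattern by exp(-||x - pattern||^2 / (2T)).
    Low T: weights concentrate on the nearest pattern (memorization).
    High T: weights become uniform, pulling x toward the data mean
    (a toy analogue of generalization).
    """
    d2 = np.sum((data - x) ** 2, axis=1)     # squared distances to patterns
    w = np.exp(-(d2 - d2.min()) / (2 * T))   # numerically stable weights
    w /= w.sum()
    return w @ data

# Three stored "training" patterns in 2D and a query point near the first.
data = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x0 = np.array([0.2, 0.1])

cold = softmax_mean(x0, data, T=1e-3)  # snaps to the nearest stored pattern
hot = softmax_mean(x0, data, T=1e3)    # approaches the mean of all patterns
```

Sweeping `T` between these extremes interpolates continuously between retrieving a single training example and producing a mixture of them, which is the qualitative behavior the abstract's "thermal" picture refers to.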