Take a Ramble into Solution Spaces for Classification Problems in Neural Networks
Conference proceedings contribution
Publication date:
2019
Abstract:
Solving a classification problem with a neural network means looking for a particular configuration of its internal parameters. This is commonly achieved by minimizing non-convex objective functions. Hence, the same classification problem is likely to have several different, equally valid solutions, depending on a number of factors such as the initialization and the adopted optimizer. In this work, we propose an algorithm that looks for a zero-error path joining two solutions to the same classification problem. We observe that finding such a path is typically non-trivial; however, our heuristic succeeds in this task. This is a step toward explaining why simple training heuristics (like SGD) are able to train complex neural networks: we speculate that they converge to particular solutions belonging to a connected solution sub-space. We work in two different scenarios: a synthetic, unbiased and totally-uncorrelated (hard) training problem, and MNIST. We empirically show that the algorithmically-accessible solution space is connected, and we have hints suggesting it is a convex sub-space. © 2019, Springer Nature Switzerland AG.
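The core idea of the abstract, checking whether two independently found zero-error solutions can be joined by a path that never leaves the zero-error region, can be illustrated on a toy problem. The sketch below is purely hypothetical and does not reproduce the paper's algorithm: it uses a linear classifier instead of a neural network and simply probes the straight segment between two solutions, the simplest candidate path.

```python
import numpy as np

# Toy binary classification: the label is the sign of the first coordinate.
X = np.array([[2.0, 0.0], [-2.0, 0.0], [3.0, 1.0], [-3.0, -1.0]])
y = np.array([1, -1, 1, -1])

def error(w):
    """Fraction of points misclassified by the linear classifier sign(X @ w)."""
    return float(np.mean(np.sign(X @ w) != y))

# Two distinct zero-error solutions (hypothetical stand-ins for two
# independently trained networks solving the same problem).
w0 = np.array([1.0, 0.0])
w1 = np.array([2.0, 0.5])
assert error(w0) == 0.0 and error(w1) == 0.0

# Probe the error along the straight segment joining the two solutions;
# a max of 0.0 means the whole segment stays inside the zero-error region.
errors = [error((1 - t) * w0 + t * w1) for t in np.linspace(0, 1, 21)]
print(max(errors))
```

On this toy problem the segment stays at zero error, consistent with the convexity hint in the abstract; for real networks the paper reports that a straight path generally fails, which is why a dedicated path-finding heuristic is needed.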
CRIS type:
04A-Conference paper in volume
Authors:
Tartaglione, Enzo; Grangetto, Marco
Link to full record:
Link to full text:
Book title:
International Conference on Image Analysis and Processing, ICIAP 2019
Published in: