
Daniel Collin: A category-theoretic analysis of backpropagation in neural networks

Time: Thursday 2018-09-06, 11:00–12:00

Location: Room 32, House 5, Kräftriket, Department of Mathematics, Stockholm University

Respondent: Daniel Collin (BSc student)

Supervisor: Erik Palmgren


Abstract: In this thesis we try to establish a compositional framework for learning algorithms based on category theory, in particular the theory of monoidal categories. By showing how to construct neural networks with string diagrams in the category Para, the category of Euclidean spaces and parametrized differentiable functions, we gain insight into how learning algorithms can be constructed by gluing together smaller learning algorithms to form larger ones. We then analyze gradient descent and backpropagation, a combination commonly used to train neural networks, through the lens of category theory in order to show how our composed learning algorithms can be trained in the category Learn, the category of sets and learning algorithms. Additionally, we find that recurrent neural networks give rise to a general construction on Learn that allows us to define learning algorithms over sequences of objects.
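As a rough illustration of the compositional idea, consider a minimal sketch in Python (the maps, names, and learning rate below are hypothetical examples, not taken from the thesis): two parametrized differentiable maps are glued into one, and gradient descent trains the composite by propagating the error through both parts via the chain rule.

# A morphism of Para can be modelled as a parametrized map I : P x A -> B.
# Here the hypothetical maps I1(p, a) = p * a and I2(q, b) = q + b compose
# into (I2 . I1)((p, q), a) = q + p * a, whose parameter space is the
# product of the two parameter spaces.

def linear(p, a):
    return p * a            # inner map I1

def bias(q, b):
    return q + b            # outer map I2

def composite(params, a):
    p, q = params
    return bias(q, linear(p, a))

def grad_params(params, a):
    # Chain rule: the derivative with respect to the inner parameter p
    # passes through the outer map, d(I2)/db * d(I1)/dp = 1 * a.
    p, q = params
    return (a, 1.0)         # (d/dp, d/dq) of the composite

def gd_step(params, a, target, lr=0.05):
    # One gradient-descent step on the squared loss (composite - target)^2,
    # the update rule the abstract analyzes categorically.
    p, q = params
    err = composite(params, a) - target
    d_p, d_q = grad_params(params, a)
    return (p - lr * 2 * err * d_p, q - lr * 2 * err * d_q)

params = (0.0, 0.0)
for _ in range(50):
    params = gd_step(params, a=2.0, target=5.0)
p, q = params
print(q + 2 * p)            # approaches the target 5.0

Note that training the composite only needs each part's own derivative data, which is the sense in which larger learning algorithms are glued together from smaller ones.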