Elias Nyholm: Non-linear Operators for Equivariant Neural Networks on Homogeneous Spaces
Time: Tue 2025-11-18, 10:15-11:15
Place: KTH 3418, Lindstedtsvägen 25 and Zoom
Video link: https://kth-se.zoom.us/j/65583358144?pwd=us6mdDtBgkEdZefvgbZPBWNujl3YuJ.1
Speaker: Elias Nyholm (Chalmers)
Abstract.
The goal of this talk is to formalise equivariant neural networks that process vector features defined on homogeneous spaces, and to point out the mathematical structures that appear along the way. While the linear case is well established in the machine learning community (steerable/group convolutional neural networks), the non-linear setting is less developed. We fill this gap with a general representation of equivariant operators, both linear and non-linear, that naturally generalises the linear framework. We will go through the mathematical details of the linear framework and its extension to the non-linear regime. I will show how specific existing neural network architectures, such as convolutional and self-attention layers, fit into our framework, and outline our ongoing work on using this framework to design novel equivariant neural network models in a structured way. The talk is based on joint work (arXiv:2504.20974) with Oscar Carlsson, Maurice Weiler and Daniel Persson, as well as ongoing work with Daniel Persson.
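For orientation, the short LaTeX sketch below records the standard equivariance condition and the group convolution that realises linear equivariant operators in the steerable/group-convolutional setting mentioned above; the notation (a group G, a homogeneous space X = G/H, a feature map f, a kernel \kappa) is generic background and not taken from the paper.

% A map \Phi between feature spaces on X = G/H is G-equivariant if it
% intertwines the group action on inputs and outputs:
\[
  \Phi(g \cdot f) = g \cdot \Phi(f) \qquad \text{for all } g \in G .
\]
% Linear equivariant operators (lifted to functions on G) take the form
% of a group convolution with a kernel \kappa:
\[
  (\kappa \star f)(g) = \int_{G} \kappa\!\left(g^{-1} g'\right)\, f(g')\, \mathrm{d}g' .
\]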
