Speaker
Description
Symmetries are of fundamental importance throughout the sciences, and respecting them is therefore critical for the success of deep learning systems applied to scientific problems.
In this talk, I will give an overview of the different forms in which symmetries appear in physics and chemistry and explain the theoretical background behind equivariant neural networks.
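(For context, and not part of the original abstract: the central condition behind such networks is that a map $f$ is equivariant with respect to a group $G$, acting on inputs via a representation $\rho_{\mathrm{in}}$ and on outputs via $\rho_{\mathrm{out}}$, if
\[
f\bigl(\rho_{\mathrm{in}}(g)\,x\bigr) \;=\; \rho_{\mathrm{out}}(g)\,f(x) \qquad \text{for all } g \in G,
\]
with invariance as the special case $\rho_{\mathrm{out}}(g) = \mathrm{id}$.)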
Then, I will discuss common ways of constructing equivariant networks in different settings and contrast manifestly equivariant architectures with other techniques for obtaining equivariant models.
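(As a minimal illustration of this contrast, my own sketch rather than material from the talk: the snippet below uses permutation symmetry of a small set of feature vectors. The layer is equivariant by construction, in the style of DeepSets, while an arbitrary network is made invariant by averaging it over the group; all names and shapes are hypothetical.)

import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
W3 = rng.normal(size=(20, 3))

def equivariant_layer(x):
    # Manifestly equivariant layer: the same linear map is applied to every set
    # element, plus a permutation-invariant mean summary, so permuting the rows
    # of x simply permutes the rows of the output.
    return x @ W1 + x.mean(axis=0, keepdims=True) @ W2

def mlp(x):
    # A generic map on the flattened set, with no built-in symmetry.
    return np.tanh(x.ravel() @ W3)

def group_averaged(f, x):
    # Alternative route: average an arbitrary function over the whole symmetry
    # group (here, all row permutations), which yields an exactly invariant model.
    return np.mean([f(x[list(p)]) for p in permutations(range(len(x)))], axis=0)

x = rng.normal(size=(5, 4))      # a toy "set" of 5 feature vectors
perm = rng.permutation(5)

# Manifest equivariance: permuting the input permutes the output exactly.
assert np.allclose(equivariant_layer(x)[perm], equivariant_layer(x[perm]))

# Group averaging: the symmetrized network is invariant to the permutation.
assert np.allclose(group_averaged(mlp, x), group_averaged(mlp, x[perm]))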
Finally, I will report on recent results about the symmetry properties of deep ensembles trained with data augmentation.