
Title: Lie-Poisson Neural Networks, with C. Eldred (Sandia National Lab), F. Gay-Balmaz (CNRS and ENS, France), and S. Huraka (U Alberta)

Speaker: Vakhtang Putkaradze, University of Alberta

Date: November 3, 2023
Time: 11 a.m. (in person and hybrid)
Room: LH 3058 (Lazaridis Hall, Math Boardroom)

Abstract: Physics-Informed Neural Networks (PINNs) have attracted considerable attention in recent years due to their potential for high-performance computation of complex physical systems. The idea of PINNs is to encode the governing equations, together with boundary and initial conditions, in the loss function of a neural network. PINNs combine the efficiency of data-based prediction with the accuracy and insight provided by physical models. Predicting the long-term evolution of systems with little friction, such as many systems encountered in oceanography and weather, requires extra care in the development of numerical methods.
Many Hamiltonian systems with symmetry can be written as Lie-Poisson systems, in which the Poisson bracket is defined by the underlying symmetry. For data-based computation of such systems, we design Lie-Poisson neural networks (LPNets). We treat the Poisson bracket structure as primary, to be satisfied exactly, whereas the Hamiltonian, known only from physics, is to be found approximately. By design, the method preserves all special integrals of the bracket (Casimirs) to machine precision and yields an efficient and promising computational method for many Lie groups, such as SO(3) (rigid body or satellite), SE(3) (Kirchhoff's equations for an underwater vehicle), and others.
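To make the idea concrete, here is a minimal sketch (not the authors' code) of a structure-preserving step for the free rigid body on so(3)*, the SO(3) example mentioned in the abstract. The update rotates the angular momentum Pi about the gradient of the Hamiltonian, so the Casimir |Pi|^2 is preserved exactly no matter what the Hamiltonian is; the quadratic Hamiltonian and inertia values below are stand-ins for what would be a trained neural network in LPNets.

```python
# Sketch of a Casimir-preserving update for the rigid body on so(3)*.
# The Poisson structure is enforced exactly by the rotation update;
# grad_H here is a hand-written stand-in for a learned Hamiltonian.
import numpy as np

def hat(w):
    """Map a vector in R^3 to its skew-symmetric (cross-product) matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def rodrigues(w, dt):
    """Rotation matrix exp(dt * hat(w)) via Rodrigues' formula."""
    n = np.linalg.norm(w)
    th = n * dt  # signed rotation angle
    if abs(th) < 1e-12:
        return np.eye(3)
    k = hat(w / n)
    return np.eye(3) + np.sin(th) * k + (1.0 - np.cos(th)) * (k @ k)

# Stand-in Hamiltonian H(Pi) = 0.5 * Pi . (I^{-1} Pi); the inertia
# values are illustrative, not from the talk.
I_inv = np.diag([1.0, 1.0 / 2.0, 1.0 / 3.0])
grad_H = lambda Pi: I_inv @ Pi

def step(Pi, dt):
    # Rigid-body dynamics: dPi/dt = Pi x grad_H(Pi) = -grad_H(Pi) x Pi,
    # so rotating Pi about grad_H(Pi) by angle -|grad_H| dt follows the
    # flow while preserving the Casimir |Pi|^2 exactly.
    return rodrigues(grad_H(Pi), -dt) @ Pi

Pi = np.array([1.0, 0.2, -0.5])
c0 = float(np.dot(Pi, Pi))
for _ in range(1000):
    Pi = step(Pi, 0.01)
print(abs(float(np.dot(Pi, Pi)) - c0))  # Casimir drift stays near machine precision
```

Because every update is a rotation, the conserved quantity of the bracket survives to round-off, while the accuracy of the trajectory depends only on how well the (learned) Hamiltonian gradient is approximated.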

Bio: Vakhtang Putkaradze received his Ph.D. from the University of Copenhagen, Denmark, and has held faculty positions in New Mexico, at Colorado State University, and, most recently, at the University of Alberta, where he was a Centennial Professor from 2012 to 2019. From 2019 to 2022, he led the science and technology arm of the Transformation Team at ATCO Ltd., first as Senior Director and then as Vice-President. He is now back at the University of Alberta, where he studies applications of geometric mechanics to neural networks, in particular efficient data-based computation of Hamiltonian systems. His main interest is the use of geometric methods in mechanics and their applications. He has received numerous prizes and awards for research and teaching, including a Humboldt Fellowship, a Senior JSPS Fellowship, the CAIMS-Fields Industrial Mathematics Prize, and the G. I. Zaslavsky Prize.
