Incorporating Inductive Biases as Hard Constraints for Scientific Machine Learning

Romit Maulik, Argonne National Laboratory

In recent years, there has been great excitement about the use of surrogate modeling across various applications to bypass the traditional bottlenecks of numerical discretizations. However, naive deployments of surrogate models are limited because they do not preserve certain desirable properties, such as conservation of energy. This has implications for downstream tasks such as ensemble evaluations for uncertainty quantification or sensitivity analyses, particularly for deep-learning surrogates, which perform exceptionally poorly when extrapolating. In this talk, we shall first discuss the development of custom neural architectures that exactly reproduce Hamiltonian mechanics. This architecture generates Poincaré maps for toroidal magnetic fields faster than a field-line integration method. Second, we shall investigate the learning of stochastic processes using normalizing flows that are constrained to apply no scaling or translation along the temporal dimension. These are used to estimate solutions of non-local Fokker-Planck equations for stochastic processes driven by Lévy noise. Subsequently, we shall discuss the future prospects of these and other scientific machine learning ideas for solving DOE-specific large-scale problems. In particular, arguments shall be made for the development of novel hybrid algorithms that can leverage widespread access to heterogeneous computing in the near future.
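
To give a flavor of the first idea, below is a minimal JAX sketch of a Hamiltonian neural network: a scalar energy H(q, p) is parameterized by a small MLP, and the dynamics are obtained by differentiating H, so the exact flow of the learned vector field conserves H by construction. The network sizes and the explicit-Euler demonstration step are illustrative assumptions, not the speaker's implementation.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(2, 64, 64, 1)):
    """Random parameters for a small MLP mapping (q, p) -> scalar H.
    The layer sizes here are a hypothetical choice for illustration."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def hamiltonian(params, x):
    """Learned scalar Hamiltonian H(q, p), where x = [q, p]."""
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b).squeeze()

def vector_field(params, x):
    """Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq.
    Because the dynamics come from a single scalar H via autodiff,
    the exact flow of this field conserves H exactly."""
    dHdq, dHdp = jax.grad(hamiltonian, argnums=1)(params, x)
    return jnp.array([dHdp, -dHdq])

# Demo: one (non-symplectic) explicit-Euler step from an initial state.
params = init_mlp(jax.random.PRNGKey(0))
x = jnp.array([1.0, 0.0])  # (q, p)
x_next = x + 1e-2 * vector_field(params, x)
```

In practice, one would integrate the learned field with a symplectic scheme and record successive crossings of a fixed section to produce the Poincaré maps mentioned in the abstract.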
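The second idea can be illustrated by a time-conditioned affine flow in which the scale and translation are functions of time alone and the temporal coordinate itself is never transformed, matching the constraint described in the abstract. The parameterization below is a hypothetical single-layer sketch (standard-normal base density, one spatial dimension), not the speaker's model; training would, for example, minimize the negative log-likelihood of sampled trajectories so that log_density approximates the time-dependent Fokker-Planck solution.

```python
import jax
import jax.numpy as jnp

def init_layer(key, hidden=32):
    """Random parameters for one time-conditioned affine layer (1-D x)."""
    k1, k2, k3 = jax.random.split(key, 3)
    return dict(W1=0.1 * jax.random.normal(k1, (1, hidden)), b1=jnp.zeros(hidden),
                Ws=0.1 * jax.random.normal(k2, (hidden, 1)), bs=jnp.zeros(1),
                Wt=0.1 * jax.random.normal(k3, (hidden, 1)), bt=jnp.zeros(1))

def _scale_shift(params, t):
    """Log-scale and shift depend only on t; time is left untouched,
    i.e., the flow applies no scaling or translation to the temporal axis."""
    h = jnp.tanh(jnp.array([t]) @ params["W1"] + params["b1"])
    log_s = (h @ params["Ws"] + params["bs"])[0]
    shift = (h @ params["Wt"] + params["bt"])[0]
    return log_s, shift

def forward(params, z, t):
    """Map a base sample z to x at time t."""
    log_s, shift = _scale_shift(params, t)
    return z * jnp.exp(log_s) + shift

def log_density(params, x, t):
    """log p(x, t) by the change of variables through the inverse map."""
    log_s, shift = _scale_shift(params, t)
    z = (x - shift) * jnp.exp(-log_s)
    log_base = -0.5 * (z ** 2 + jnp.log(2.0 * jnp.pi))
    return log_base - log_s
```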

Please use this link to attend the virtual seminar:

Link: https://bluejeans.com/114722769/2329