Physics-Inspired Neural Architectures for Forecasting Fluid and Oceanic Flows
Recent advances in deep learning have produced neural architectures that model fluid dynamics effectively, with an emphasis on weather prediction and atmospheric modeling. In this work, we develop physics-inspired deep learning models for fluid and oceanic processes, integrating principles from physics and numerical modeling directly into the deep neural architecture so that the models learn multi-scale features and train effectively from limited data, two essential characteristics of ocean dynamics and ocean data. Inspired by attention-based architectures, we adapt attention mechanisms using physics and computational-stencil concepts from numerical PDE solvers. Because fluid dynamics depends on both spatial locality and temporal history, we modify attention mechanisms to capture the rich spatiotemporal dynamics of fluid flows efficiently. Our new physics-inspired attention mechanisms handle complex bathymetry and coastal land, support learning of multiscale features and multiple dynamical regimes, and model the effects of external ocean forcing. We also investigate different choices of numerical integration schemes, error norms, and loss functions to ensure stable predictions over long temporal roll-outs.
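To make the stencil analogy concrete, the following is a minimal illustrative sketch (not the architecture described above) of attention restricted to a finite-difference-style stencil neighborhood: each grid point attends only to the points within a small radius, the way a numerical PDE stencil couples a point to its neighbors. The function name, feature width, and random projections standing in for learned weights are all hypothetical choices made for this toy.

```python
import numpy as np

def stencil_attention(field, radius=1, d_k=16, seed=0):
    """Toy local attention over a 2D scalar field: each grid point attends
    only to its (2*radius+1)^2 stencil neighborhood, mimicking a
    finite-difference stencil. Illustrative sketch, not the paper's model."""
    rng = np.random.default_rng(seed)
    H, W = field.shape
    # Random projections stand in for learned query/key/value weights.
    Wq, Wk, Wv = (rng.standard_normal((1, d_k)) for _ in range(3))
    x = field[..., None]               # (H, W, 1) single scalar feature
    q, k, v = x @ Wq, x @ Wk, x @ Wv   # (H, W, d_k)
    out = np.zeros_like(v)
    pad = radius
    kp = np.pad(k, ((pad, pad), (pad, pad), (0, 0)))  # zero-pad boundaries
    vp = np.pad(v, ((pad, pad), (pad, pad), (0, 0)))
    for i in range(H):
        for j in range(W):
            # Gather the stencil neighborhood of keys and values.
            ks = kp[i:i + 2*pad + 1, j:j + 2*pad + 1].reshape(-1, d_k)
            vs = vp[i:i + 2*pad + 1, j:j + 2*pad + 1].reshape(-1, d_k)
            scores = ks @ q[i, j] / np.sqrt(d_k)
            w = np.exp(scores - scores.max())  # numerically stable softmax
            w /= w.sum()
            out[i, j] = w @ vs                 # attention-weighted values
    return out
```

A masked variant of the same neighborhood gather would be one way to exclude land or complex-bathymetry cells from the attended set.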
To evaluate and validate these models, we first showcase applications predicting idealized fluid flows, such as eddy shedding past obstacles, vorticity dynamics, and bottom gravity currents, over a range of Reynolds and Grashof numbers. We then train our deep learning architectures on realistic high-resolution data-assimilative ocean simulations and real-time sea experiments, e.g., surface velocity fields from the Loop Current System (LCS) in the Gulf of Mexico. We illustrate both ensemble and deterministic deep learning forecasts under various scenarios and in recursive and non-recursive applications. We quantify the performance of the deep learning training and forecasts using comprehensive skill metrics.
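The abstract does not name the specific skill metrics used, but two standard choices in ocean and weather forecast evaluation are root-mean-square error and the anomaly correlation coefficient (ACC). The sketch below, with a hypothetical function name and an optional climatology baseline, shows how such metrics are typically computed against a reference truth field.

```python
import numpy as np

def forecast_skill(forecast, truth, climatology=None):
    """Compute two common forecast skill metrics: RMSE and the anomaly
    correlation coefficient (ACC). Illustrative; the paper's exact metric
    suite is not specified in the abstract."""
    f = np.asarray(forecast, dtype=float).ravel()
    t = np.asarray(truth, dtype=float).ravel()
    rmse = np.sqrt(np.mean((f - t) ** 2))
    # Anomalies are deviations from a climatological mean state (zero if none given).
    c = np.zeros_like(t) if climatology is None else np.asarray(climatology, dtype=float).ravel()
    fa, ta = f - c, t - c
    acc = (fa @ ta) / (np.linalg.norm(fa) * np.linalg.norm(ta))
    return rmse, acc
```

For a perfect forecast, `forecast_skill` returns an RMSE of 0 and an ACC of 1; skill degrades along a roll-out as both drift from these ideals.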