PHYSICS ENHANCED MACHINE LEARNING METHODS FOR FLOW PREDICTIONS

Date: 31st Dec 2020

Time: 04:00 PM

Venue: Google Meet (https://meet.google.com/evj-gnex-xew)

PAST EVENT

Details

This thesis investigates how physics may be incorporated into numerical methods for flow prediction. While most of the thesis explores efficient ways of incorporating physics into numerical methods via machine learning, one chapter discusses implicit large eddy simulation (ILES), a study that is also physics-driven and that inspired our foray into machine learning methods.

As our first study, we investigate the capability of ILES to capture linear instabilities by simulating a planar temporal mixing layer, and compare its performance with that of traditional turbulence models. To pursue a physics-based approach to transition modeling, we perform ILES by purely upwinding the advection terms of the Navier-Stokes equations, without adding further numerical regularization. For the mixing layer, we find that upwinding performs nearly as well as the sophisticated turbulence models of LES. We identify the likely reason to be the better capture of dispersion relations, even by a naive upwinding approach.
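The upwinding idea can be illustrated on the simplest model problem. The sketch below (our illustration, not code from the thesis) applies first-order upwind differencing to the linear advection equation u_t + a u_x = 0 with periodic boundaries; the scheme's built-in numerical dissipation is the kind of implicit regularization that ILES relies on.

```python
import numpy as np

def upwind_advect(u0, a, dx, dt, n_steps):
    """Advance u_t + a u_x = 0 (a > 0) with first-order upwinding,
    periodic boundary conditions."""
    u = u0.copy()
    c = a * dt / dx                      # CFL number; stable for 0 <= c <= 1
    for _ in range(n_steps):
        u = u - c * (u - np.roll(u, 1))  # backward difference for a > 0
    return u

# Advect a Gaussian pulse once around half the periodic domain.
x = np.linspace(0.0, 1.0, 100, endpoint=False)
u0 = np.exp(-100.0 * (x - 0.5) ** 2)
u = upwind_advect(u0, a=1.0, dx=x[1] - x[0], dt=0.005, n_steps=100)
```

The pulse is transported at the correct speed but its peak is smeared: the upwind truncation error acts as a dissipative subgrid term, which is the ILES viewpoint.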

Since these dispersion relations ultimately derive from the differential equation, and discretization errors essentially stem from using a local polynomial basis for a nonlinear differential equation, we pursue the study of a richer basis family, neural networks, and examine how well they perform on benchmark problems. The remaining bulk of this thesis concentrates on studying numerical methods that use neural networks as the basis functions. There has been rapid recent progress on the application of deep networks to the solution of partial differential equations (PDEs); this approach is collectively labeled Physics Informed Neural Networks (PINNs). During our study, however, we discovered that PINNs, as available in the literature, have several problems, such as slow training, optimization errors, and the need for hand-tuning.
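For concreteness, here is a minimal, hypothetical PINN-style sketch (not the thesis implementation): a tiny tanh network is fitted to the ODE u' + u = 0, u(0) = 1, by descending the squared PDE residual plus the boundary mismatch. The deliberately crude finite-difference gradient descent also makes the point about slow, hand-tuned optimization concrete.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8                                           # hidden units (illustrative choice)
theta = np.concatenate([rng.normal(size=H),     # input weights w
                        rng.normal(size=H),     # biases b
                        rng.normal(size=H) * 0.1])  # output weights c

def u(theta, x):
    w, b, c = theta[:H], theta[H:2*H], theta[2*H:]
    return np.tanh(np.outer(x, w) + b) @ c

def du(theta, x):
    # d/dx tanh(w x + b) = w * (1 - tanh(w x + b)^2)
    w, b, c = theta[:H], theta[H:2*H], theta[2*H:]
    return (1.0 - np.tanh(np.outer(x, w) + b) ** 2) @ (w * c)

xc = np.linspace(0.0, 1.0, 20)                  # collocation points

def f(theta):
    residual = du(theta, xc) + u(theta, xc)     # PDE residual: u' + u = 0
    bc = u(theta, np.array([0.0]))[0] - 1.0     # boundary mismatch: u(0) = 1
    return np.mean(residual ** 2) + bc ** 2

# Crude gradient descent with forward-difference gradients.
lr, eps = 0.02, 1e-6
loss0 = f(theta)
for _ in range(200):
    g = np.array([(f(theta + eps * np.eye(3 * H)[i]) - f(theta)) / eps
                  for i in range(3 * H)])
    theta -= lr * g
```

The learning rate, network size, and collocation density all need tuning, and the loss decreases only gradually: exactly the defects discussed next.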

To counteract these defects, we develop several alternatives. First, we develop the Physics Informed Extreme Learning Machine (PIELM), a rapid variant of PINNs that can be applied to stationary and time-dependent linear partial differential equations. We demonstrate that PIELM eliminates the optimization and hand-tuning issues and matches or exceeds the accuracy of PINNs on a range of problems, while also being orders of magnitude faster.
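The contrast with PIELM-style training can be sketched as follows (again a hypothetical illustration, with our own choice of test problem): the hidden weights are drawn at random and frozen, so for a linear PDE the output weights follow from a single linear least-squares solve, with no iterative optimization at all. Here the problem is u'' = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x).

```python
import numpy as np

rng = np.random.default_rng(1)
H = 60
w = rng.uniform(-4, 4, H)                # frozen random hidden layer
b = rng.uniform(-4, 4, H)

def features(x):
    # phi_j(x) = tanh(w_j x + b_j)
    return np.tanh(np.outer(x, w) + b)

def features_xx(x):
    # phi_j''(x) = -2 w_j^2 tanh(z) (1 - tanh(z)^2),  z = w_j x + b_j
    t = np.tanh(np.outer(x, w) + b)
    return -2.0 * (w ** 2) * t * (1.0 - t ** 2)

xc = np.linspace(0.0, 1.0, 50)           # collocation points
A = np.vstack([features_xx(xc),                    # PDE rows: phi'' c = f
               features(np.array([0.0, 1.0]))])    # BC rows: u(0) = u(1) = 0
rhs = np.concatenate([-np.pi ** 2 * np.sin(np.pi * xc), [0.0, 0.0]])

# One linear least-squares solve replaces the whole gradient-descent loop.
coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
u_hat = features(xc) @ coef
err = np.max(np.abs(u_hat - np.sin(np.pi * xc)))
```

Because the system is linear in the output weights, the "training" cost is one `lstsq` call, which is where the orders-of-magnitude speedup over iterative PINN training comes from.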

Next, we show the limitations of neural network-based approaches, including our PIELM, in the solution of PDEs on large domains, and propose an extension: DPIELM, a distributed version of our algorithm. We show that DPIELM produces excellent results, comparable to conventional numerical techniques, in the solution of time-dependent problems.
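A two-subdomain sketch of the distributed idea, under the same hypothetical setup as above: each subdomain gets its own frozen random-feature model, and a single least-squares system couples them through continuity of u and u' at the interface. The test problem is again u'' = -pi^2 sin(pi x), u(0) = u(1) = 0.

```python
import numpy as np

rng = np.random.default_rng(2)
H = 40

def make_model():
    """One random-feature (ELM-style) model: tanh features and derivatives."""
    w = rng.uniform(-4, 4, H)
    b = rng.uniform(-4, 4, H)
    def phi(x):
        return np.tanh(np.outer(x, w) + b)
    def dphi(x):
        return (1.0 - np.tanh(np.outer(x, w) + b) ** 2) * w
    def d2phi(x):
        t = np.tanh(np.outer(x, w) + b)
        return -2.0 * (w ** 2) * t * (1.0 - t ** 2)
    return phi, dphi, d2phi

(phiL, dphiL, d2phiL), (phiR, dphiR, d2phiR) = make_model(), make_model()
xL = np.linspace(0.0, 0.5, 25)           # left subdomain collocation points
xR = np.linspace(0.5, 1.0, 25)           # right subdomain collocation points
xi = np.array([0.5])                     # interface
Z = np.zeros((25, H))

A = np.vstack([
    np.hstack([d2phiL(xL), Z]),                            # PDE on left subdomain
    np.hstack([Z, d2phiR(xR)]),                            # PDE on right subdomain
    np.hstack([phiL(np.array([0.0])), np.zeros((1, H))]),  # u(0) = 0
    np.hstack([np.zeros((1, H)), phiR(np.array([1.0]))]),  # u(1) = 0
    np.hstack([phiL(xi), -phiR(xi)]),                      # continuity of u at 0.5
    np.hstack([dphiL(xi), -dphiR(xi)]),                    # continuity of u' at 0.5
])
rhs = np.concatenate([-np.pi ** 2 * np.sin(np.pi * xL),
                      -np.pi ** 2 * np.sin(np.pi * xR),
                      [0.0, 0.0, 0.0, 0.0]])
coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
cL, cR = coef[:H], coef[H:]
err = max(np.max(np.abs(phiL(xL) @ cL - np.sin(np.pi * xL))),
          np.max(np.abs(phiR(xR) @ cR - np.sin(np.pi * xR))))
```

Each subdomain only needs to represent the solution locally, which is what lets the distributed version scale to larger domains than a single global model.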

Finally, we extend this idea of distributed neural networks to nonlinear PDEs and develop DPINN, a distributed version of PINN. We demonstrate that DPINN has superior representation power and can be viewed as a generalization of the original PINN.
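For a nonlinear PDE the interface coupling can no longer be folded into a linear solve. A DPINN-style composite loss (hypothetical sketch, with our own toy problem u' = u^2, u(0) = 0.5, whose exact solution is 1 / (2 - x)) instead sums subdomain residuals, the boundary term, and interface-matching terms, and would be minimized by gradient-based training:

```python
import numpy as np

rng = np.random.default_rng(3)
H = 8                                    # hidden units per subdomain network

def net(theta, x):
    w, b, c = theta[:H], theta[H:2*H], theta[2*H:]
    return np.tanh(np.outer(x, w) + b) @ c

def dnet(theta, x):
    w, b, c = theta[:H], theta[H:2*H], theta[2*H:]
    return (1.0 - np.tanh(np.outer(x, w) + b) ** 2) @ (w * c)

thetaL = rng.normal(size=3 * H)          # left-subdomain network parameters
thetaR = rng.normal(size=3 * H)          # right-subdomain network parameters
xL, xR = np.linspace(0, 0.5, 20), np.linspace(0.5, 1, 20)
xi = np.array([0.5])                     # interface

def dpinn_loss(thetaL, thetaR):
    resL = dnet(thetaL, xL) - net(thetaL, xL) ** 2        # u' - u^2 on left
    resR = dnet(thetaR, xR) - net(thetaR, xR) ** 2        # u' - u^2 on right
    bc = net(thetaL, np.array([0.0]))[0] - 0.5            # u(0) = 0.5
    match_u = net(thetaL, xi)[0] - net(thetaR, xi)[0]     # continuity of u
    match_du = dnet(thetaL, xi)[0] - dnet(thetaR, xi)[0]  # continuity of u'
    return (np.mean(resL ** 2) + np.mean(resR ** 2)
            + bc ** 2 + match_u ** 2 + match_du ** 2)

total = dpinn_loss(thetaL, thetaR)
```

With a single subdomain the interface terms vanish and the loss reduces to the ordinary PINN loss, which is the sense in which DPINN generalizes the original.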

Collectively, this work contributes towards the numerical solution of flows in complex domains by improving the efficiency of PINNs, so that they may ultimately become a physics-enhanced alternative to conventional discretization techniques.

Speakers

Mr. Vikas Dwivedi, ME15D080

Department of Mechanical Engineering