Nonlinear Manifold Learning and Model Reduction for Transonic Flows
Constructing a nonlinear reduced-order model (ROM) that predicts computational fluid dynamics (CFD) solutions both accurately and efficiently remains an open challenge. A major difficulty is that the nonlinearity of the solutions cannot be captured adequately by interpolation algorithms in a low-dimensional space. To preserve the nonlinearity of CFD solutions for transonic flows, a new ROM is presented that integrates manifold learning into a constrained optimization, whereby a neighborhood-preserving mapping is constructed by the locally linear embedding (LLE) algorithm. Reconstruction errors are minimized in LLE by solving a least-squares problem subject to weight constraints. A loss function is proposed in the constrained optimization to preserve the geometric properties between the high-dimensional space and the low-dimensional manifold. The proposed ROM is validated by predicting nonlinear transonic flows over the RAE 2822 airfoil and the undeflected NASA Common Research Model with an aspect ratio of 9, in which nonlinearities are induced by shock waves. All results confirm that the ROM replicates CFD solutions accurately at a fraction of the cost of the CFD calculation or full-order modeling.
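The LLE step described above computes, for each high-dimensional snapshot, the weights that best reconstruct it from its nearest neighbors, by solving a least-squares problem constrained so that the weights sum to one. As a minimal illustration of that weight computation (not the authors' implementation; the regularization constant and function name are assumptions for this sketch), the standard LLE local Gram-matrix formulation can be written as:

```python
import numpy as np

def lle_weights(x, neighbors, reg=1e-3):
    """Reconstruction weights for one point from its k neighbors.

    Solves min_w ||x - sum_j w_j * neighbors[j]||^2 subject to sum_j w_j = 1,
    the constrained least-squares problem at the core of LLE.
    Illustrative sketch only; `reg` is an assumed regularization constant.
    """
    # Differences between the point and each of its k neighbors (k x d).
    Z = neighbors - x
    # Local Gram matrix of the differences (k x k).
    C = Z @ Z.T
    # Regularize for stability when C is singular (e.g. k > d),
    # a standard trick in LLE implementations.
    C += np.eye(len(neighbors)) * reg * np.trace(C)
    # The sum-to-one constraint reduces, via a Lagrange multiplier,
    # to solving C w = 1 and rescaling the weights to sum to one.
    w = np.linalg.solve(C, np.ones(len(neighbors)))
    return w / w.sum()
```

If the point lies in the affine span of its neighbors, the recovered weights reconstruct it almost exactly; these weights are then reused to define the neighborhood-preserving low-dimensional embedding.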
The file attached to this record is the author's final peer-reviewed version. The publisher's final version can be found by following the DOI link.
Machine learning, Aerodynamics, Transonic flows
Zheng, B., Yao, W. and Xu, M. (2023) Nonlinear Manifold Learning and Model Reduction for Transonic Flows. AIAA Journal,
Institute of Engineering Sciences (IES)