2025-12-02
Energy-conserving neural network closure model for long-time accurate and stable 2D LES
Publication
Machine learning-based closure models for LES have shown promise in capturing complex turbulence dynamics but often suffer from instabilities and physical inconsistencies. In this work, we develop a novel skew-symmetric neural architecture as a closure model that enforces stability while preserving key physical conservation laws. Our approach leverages a discretization that ensures mass, momentum, and energy conservation, along with a face-averaging filter to maintain mass conservation in coarse-grained velocity fields. We compare our model against several conventional data-driven closures (including unconstrained convolutional neural networks) and the physics-based Smagorinsky model. Performance is evaluated on decaying turbulence and Kolmogorov flow for multiple coarse-graining factors. In these test cases we observe that unconstrained machine learning models suffer from numerical instabilities. In contrast, our skew-symmetric model remains stable across all tests, though at the cost of increased dissipation. Despite this trade-off, we demonstrate that our model still outperforms the Smagorinsky model in unseen scenarios. These findings highlight the potential of structure-preserving machine learning closures for reliable long-time LES.
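The stability guarantee rests on a standard linear-algebra fact: a skew-symmetric operator contributes nothing to the kinetic energy, since u^T K u = 0 whenever K^T = -K. The sketch below (a minimal illustration, not the authors' implementation; the matrix `A` is a stand-in for a learned operator) verifies this numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
A = rng.standard_normal((n, n))  # stand-in for a learned (unconstrained) operator
K = A - A.T                      # skew-symmetric part: K.T == -K
u = rng.standard_normal(n)       # stand-in for a coarse-grained velocity field

closure = K @ u                  # closure contribution to du/dt
energy_rate = u @ closure        # d/dt (0.5 * u.u) produced by the closure

# Skew-symmetry forces the energy production to vanish (up to rounding):
print(abs(energy_rate))
```

Because the energy contribution is identically zero by construction, the closure cannot inject energy and destabilize a long-time simulation, regardless of what the network has learned.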
| Additional Metadata | |
|---|---|
| DOI | doi.org/10.48550/arXiv.2504.05868 |
| Project | Unravelling Neural Networks with Structure-Preserving Computing |
| Organisation | Scientific Computing |
van Gastelen, T., Edeling, W., & Sanderse, B. (2025). Energy-conserving neural network closure model for long-time accurate and stable 2D LES. doi:10.48550/arXiv.2504.05868