Chinese Physics B ›› 2025, Vol. 34 ›› Issue (4): 40701-040701. DOI: 10.1088/1674-1056/adacd0
Wenshu Zha(查文舒), Dongsheng Chen(陈东升)†, Daolun Li(李道伦), Luhang Shen(沈路航), and Enyuan Chen(陈恩源)
Abstract: Physics-informed neural networks (PINNs) are a deep learning approach for solving partial differential equations (PDEs). Accurately learning the initial conditions is crucial when employing PINNs to solve PDEs. However, simply adjusting loss weights or imposing hard constraints does not always improve the learning of the initial conditions; sometimes it even hinders convergence of the network. To enhance the accuracy of PINNs in learning the initial conditions, this paper proposes a novel strategy named causally enhanced initial conditions (CEICs). The strategy embeds an additional term in the loss function, constructed from the derivative of the initial condition and the corresponding derivative of the neural network at the initial time. Furthermore, to respect causality when learning these derivatives, a novel causality coefficient is introduced into the training when multiple derivatives are selected. Because CEICs provide more accurate pseudo-labels in the first subdomain, they are also compatible with the temporal-marching strategy. Experimental results demonstrate that CEICs outperform hard constraints and improve the overall accuracy of pre-training PINNs. For the 1D Korteweg-de Vries, reaction, and convection equations, the proposed CEIC method reduces the relative error by at least 60% compared with previous methods.
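The abstract describes the core CEIC idea: penalize the mismatch not only between the network and the initial condition, but also between their derivatives at the initial time. The following is a minimal PyTorch sketch of such a derivative-augmented initial-condition loss, not the authors' implementation; the network architecture, function names (`ceic_loss`, `u0_fn`), and the equal weighting of the two terms are illustrative assumptions, and the paper's causality coefficient for selecting multiple derivatives is omitted.

```python
import torch
import torch.nn as nn

# Hypothetical fully connected PINN taking (x, t) -> u(x, t);
# the architecture in the paper may differ.
class PINN(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def ceic_loss(model, x0, u0_fn):
    """Sketch of a derivative-augmented initial-condition loss:
    match both the value and the spatial derivative of the network
    output to the initial condition u0(x) at t = 0."""
    x0 = x0.clone().requires_grad_(True)
    t0 = torch.zeros_like(x0)
    u_pred = model(x0, t0)

    # du/dx of the network prediction at t = 0, via autograd
    du_pred = torch.autograd.grad(u_pred.sum(), x0, create_graph=True)[0]

    # Initial condition and its spatial derivative (assumed differentiable)
    u0 = u0_fn(x0)
    du0 = torch.autograd.grad(u0.sum(), x0, create_graph=True)[0]

    value_loss = torch.mean((u_pred - u0) ** 2)
    deriv_loss = torch.mean((du_pred - du0) ** 2)
    return value_loss + deriv_loss

# Example usage with a sinusoidal initial condition (illustrative only)
model = PINN()
x0 = torch.linspace(-1.0, 1.0, 101).unsqueeze(-1)
loss_ic = ceic_loss(model, x0, lambda x: torch.sin(torch.pi * x))
```

In a full training loop, `loss_ic` would be added to the usual PDE residual and boundary losses; how the derivative term is weighted relative to them is a design choice the paper addresses with its causality coefficient.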
PACS: (Neural networks, fuzzy logic, artificial intelligence)