A Brief Summary of the PINNsFormer Paper
The numerical solution of partial differential equations (PDEs) has been widely studied in science and engineering. Traditional approaches, such as the finite difference method, are well established, but their computational cost grows rapidly with the dimensionality of the problem, an issue often referred to as the "Curse of Dimensionality."
Recently, research on Physics-Informed Neural Networks (PINNs) has gained prominence as a more flexible way to solve problems involving PDEs. Meanwhile, with the rise of Large Language Models (LLMs) in academia and industry, the Transformer architecture has experienced a boom in Machine Learning. Originally designed to embed words, the same embedding concepts have been adapted in this context to represent physical properties.
Based on this, Zhiyuan Zhao and B. Aditya Prakash from the Georgia Institute of Technology, along with Xueying Ding from Carnegie Mellon University, published an article titled PINNsFormer: A Transformer-Based Framework for Physics-Informed Neural Networks.
Below, I provide more details about the architecture and the idea behind the model. The link to the original article can be found at the end of this post.
The Proposed Architecture
The core idea of PINNsFormer is a pseudo-sequence generator: instead of feeding the network a single collocation point, each pointwise input $[x, t]$ is expanded into a short sequence of time-shifted points,

$$[x, t] \rightarrow \{[x, t],\; [x, t+\Delta t],\; \ldots,\; [x, t+(k-1)\Delta t]\},$$

which the Transformer encoder-decoder then processes, allowing the model to capture temporal dependencies that pointwise PINNs ignore.
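As a rough illustration of this expansion, here is a minimal NumPy sketch. The function name `make_pseudo_sequence` and the default values of `k` and `dt` are my own illustrative choices, not taken from the authors' code.

```python
import numpy as np

def make_pseudo_sequence(x, t, k=5, dt=1e-3):
    """Expand a single collocation point [x, t] into the pseudo
    time-sequence {[x, t], [x, t+dt], ..., [x, t+(k-1)*dt]}.
    Illustrative sketch of the pseudo-sequence generator idea.
    """
    # Each pseudo step keeps x fixed and shifts t forward by dt.
    ts = t + dt * np.arange(k)          # shape: (k,)
    xs = np.full(k, x)                  # shape: (k,)
    return np.stack([xs, ts], axis=-1)  # shape: (k, 2)

# Example: one point expanded into a 5-step pseudo-sequence.
seq = make_pseudo_sequence(x=0.5, t=0.0)
print(seq)
```

Each row of the resulting array is one `[x, t + i*dt]` pair, so the Transformer receives a sequence rather than an isolated point.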
For more details, check out the full article: Link to the paper.