Rudina's AI Atlas #1: Physics-Informed Neural Networks (PINNs)
I am kicking off the AI Atlas, a series of insights on technologies and concepts that are tangibly valuable, specific, and actionable for those interested in AI. This first edition covers a consequential emerging technology that allows developers to create physically consistent and scientifically sound AI models, significantly improving their flexibility and applicability: Physics-Informed Neural Networks (PINNs).
What are Physics-Informed Neural Networks (PINNs)?
PINNs are models where known physics equations are integrated into a neural network’s learning process, dramatically boosting the AI’s ability to produce accurate results.
Purely digital models do not necessarily obey the fundamental laws of physical systems and thus struggle to abstract when something new happens. Conversely, the prior-knowledge constraints of PINNs restrict models to real-world behavior, yielding more interpretable ML methods that remain robust even with incomplete data.
Why Physics-Informed AI Matters and Its Limitations
Digital-only AI solutions cannot generalize well beyond their training data, limiting their adaptability. Consider, for example, a traditional artificial neural network modeling and predicting the movement of a spring.
In contrast, a physics-informed neural network solving the same problem abstracts better, which leads to greater consistency, reduced training time, and better generalization far from the training data.
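To make the spring example concrete, here is a minimal sketch of the idea behind a PINN's training objective: a data-misfit term plus a physics term that penalizes violations of the spring equation m·x'' + k·x = 0. The function names, parameters, and the use of finite differences are illustrative assumptions for this sketch; a real PINN would compute the ODE residual with automatic differentiation through the network.

```python
import numpy as np

def pinn_loss(x_pred, t, x_obs, m=1.0, k=1.0, lam=1.0):
    """Composite PINN-style loss (illustrative sketch, not a full PINN).

    data term:    mean squared error against observed positions
    physics term: residual of the undamped spring ODE m*x'' + k*x = 0,
                  approximated here with central finite differences on
                  the predictions (a real PINN uses autodiff instead)
    """
    # Data loss: how far predictions are from observations
    data_loss = np.mean((x_pred - x_obs) ** 2)

    # Physics loss: second derivative via central finite differences,
    # then the ODE residual m*x'' + k*x evaluated at interior points
    dt = t[1] - t[0]
    x_tt = (x_pred[2:] - 2.0 * x_pred[1:-1] + x_pred[:-2]) / dt ** 2
    residual = m * x_tt + k * x_pred[1:-1]
    physics_loss = np.mean(residual ** 2)

    # lam weights how strongly physics constrains the fit
    return data_loss + lam * physics_loss

# Sanity check: x(t) = cos(t) solves x'' + x = 0 (m = k = 1), so the
# combined loss should be near zero for the exact solution.
t = np.linspace(0.0, 2.0 * np.pi, 200)
x_exact = np.cos(t)
print(f"loss for exact solution: {pinn_loss(x_exact, t, x_exact):.2e}")
```

During training, the physics term steers the network toward oscillatory, energy-consistent trajectories even in regions with no training data, which is precisely why the PINN extrapolates better than the purely data-driven network above.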
The principal obstacle facing physics-informed AI is that it is inherently complicated. Creating and solving problems based on complex physical systems requires deep mathematical domain knowledge and a good deal of computational resources.
Use Cases of Physics-Informed AI
The potential for PINNs to break open the black box of many traditional AI models is a huge opportunity in use cases that are complex and difficult to model, or that require precision and accuracy for compliance. Areas that have been slow to adopt AI for these reasons include:
My sincere thanks to Vlad Sejnoha , Tyler Fohrman , Thouheed Abdul Gaffoor , and Kyle Dolce for their valuable input!
VC | Investor | Mentor | Successful Founder | Focused on AI Innovation
1mo · Sounds like an exciting start to the AI Atlas! PINNs indeed offer a fascinating approach by incorporating real-world physics into AI learning. What other emerging technologies or concepts do you think will make big waves in the AI space this year? Looking forward to more insights!
Waiting for PINNs to solve hyper-local weather forecasting. Not yet there, as I understand :(
Scrappy fractional CTO keeps tech projects on time and on budget.
1y · Great summary and examples, Rudina Seseri. I like to think of "ZYX-Informed Neural Networks" as a second way that ML learns. Instead of just learning by following patterns and stopping there, an Informed Neural Network adds extra learning, often in the form of constraints, into the mix. PINNs are great examples of ML trained well, as physics is a proven set of rules about how things always behave. Adding in squishy rules like a formula for how stocks behave, more a set of logical guesses than rules, will result in squishy ML results. Disappointing at best. PINNs are an excellent step in putting ML into real-world use.
Fractional CxO | Adviser | Angel Investor
2y · The models discussed here are great exemplars of what researchers generically call "adding common sense" to the models. Basic ideas like "an unsupported object falls down" are, literally, child's play for humans, but difficult for most of today's AI, leading to the difficulties in training, accuracy, and generalization Rudina Seseri discusses. Expect that "embodied" AI will come to the fore to solve some of this: training with real-world physics and constraints can (but does not necessarily) imbue the model with common sense.
On the road in the U.S. and world until June 1. See you on the flip side.
2y · Dan G.