Background
Model recovery extracts the coefficients of a system's governing equations from input-output traces. Its goals are to accurately reconstruct the input-output traces and to minimize error in the estimated model coefficients. Current model recovery methods degrade significantly on data from real-world systems. Some research has improved model recovery for systems with limited and noisy data, but open challenges remain, including data sampled at or below the Nyquist rate.
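For orientation, the sketch below illustrates the basic model recovery problem on a toy first-order system: coefficients are regressed from sampled input-output traces using finite-difference derivative estimates. This is a minimal illustration of the problem setting only, not the invention's method; the system, coefficients, and sampling settings are arbitrary assumptions, and coarser sampling degrades the estimate, which motivates sub-Nyquist recovery.

```python
import numpy as np

def simulate(a, b, u, x0, dt, n_steps):
    """Forward-Euler simulation of dx/dt = a*x + b*u."""
    x = np.empty(n_steps)
    x[0] = x0
    for k in range(n_steps - 1):
        x[k + 1] = x[k] + dt * (a * x[k] + b * u[k])
    return x

a_true, b_true = -0.5, 1.2          # "unknown" coefficients to recover
dt, n_steps = 0.01, 2000
t = np.arange(n_steps) * dt
u = np.sin(2 * np.pi * 0.5 * t)     # input trace
x = simulate(a_true, b_true, u, 0.0, dt, n_steps)   # output trace

# Recover coefficients by regressing finite-difference derivative estimates
# on the candidate terms [x, u]; larger sampling intervals degrade this fit.
dxdt = np.gradient(x, dt)
library = np.column_stack([x, u])
coeffs, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
print("estimated (a, b):", coeffs)  # approximately (-0.5, 1.2)
```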
Invention Description
Researchers at Arizona State University have developed a Liquid Time Constant Neural Network (LTC-NN)-based architecture that extracts the governing equations of a system from its data. The architecture aims to accurately reconstruct input-output traces and minimize error in model coefficient estimation. It addresses the challenge of model recovery at or below the Nyquist sampling rate by incorporating LTC-NNs, which maintain model structural constraints and improve generalization performance under those constraints.
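The approach builds on liquid time-constant networks, in which each neuron's effective time constant depends on its current state and input, allowing the hidden dynamics to adapt between samples. The sketch below shows a single LTC cell with a commonly used fused semi-implicit update step; the layer sizes, weights, nonlinearity, and solver step are illustrative assumptions and do not represent ASU's specific architecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LTCCell:
    """Single LTC layer with an input-dependent effective time constant."""
    def __init__(self, n_inputs, n_hidden, tau=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.5, size=(n_hidden, n_inputs))
        self.W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
        self.bias = np.zeros(n_hidden)
        self.A = np.ones(n_hidden)   # per-neuron bias (reversal) term
        self.tau = tau               # base time constant

    def step(self, x, u, dt):
        # Fused semi-implicit Euler update of dx/dt = -(1/tau + f)*x + f*A,
        # where the gate f depends on the current state and input.
        f = sigmoid(self.W_in @ u + self.W_rec @ x + self.bias)
        return (x + dt * f * self.A) / (1.0 + dt * (1.0 / self.tau + f))

cell = LTCCell(n_inputs=1, n_hidden=8)
x = np.zeros(8)
for tk in np.arange(0.0, 1.0, 0.05):
    x = cell.step(x, np.array([np.sin(tk)]), dt=0.05)
print("hidden state after rollout:", x)
```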
Potential Applications:
- Development of digital twins (e.g., predictive maintenance, process optimization, resource management)
- Safety analysis & anomaly detection in complex systems
- Advancements in explainable AI and prediction models
- Enhanced simulation accuracy for research and development
Benefits and Advantages:
- Enhanced performance – improved model recovery at sub-Nyquist sampling rates
- Increased accuracy – incorporates external knowledge, such as sparsity (see the sketch after this list)
- Model structural integrity – automatic differentiation in LTC-NNs maintains integrity between samples
- Robust – resilient to human reporting errors and implicit dynamics
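One common way to encode sparsity as external knowledge during coefficient recovery is to prune small coefficients and refit, as in sequential thresholded least squares. The sketch below applies that heuristic to the same toy system as the earlier sketch, with extra spurious candidate terms; it illustrates the general idea only and is not the method of the related publication. The threshold and candidate library are arbitrary choices.

```python
import numpy as np

def sparse_recover(library, dxdt, threshold=0.05, n_iter=10):
    """Least squares with iterative hard-thresholding of small coefficients."""
    coeffs, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
    for _ in range(n_iter):
        small = np.abs(coeffs) < threshold
        coeffs[small] = 0.0
        if small.all():
            break
        active = ~small
        coeffs[active], *_ = np.linalg.lstsq(library[:, active], dxdt, rcond=None)
    return coeffs

# Toy system dx/dt = -0.5*x + 1.2*u with spurious candidate terms x**2 and x*u;
# only the coefficients of x and u should survive thresholding.
dt, n = 0.01, 2000
t = np.arange(n) * dt
u = np.sin(2 * np.pi * 0.5 * t)
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = x[k] + dt * (-0.5 * x[k] + 1.2 * u[k])
library = np.column_stack([x, u, x**2, x * u])
print(sparse_recover(library, np.gradient(x, dt)))  # spurious terms -> 0
```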
Related Publication: Recovering Implicit Physics Model under Real-World Constraints