Abstract

Shape sensing is an emerging technique for reconstructing deformed shapes from the data of a discrete network of strain sensors. Its prominence stems from its suitability for promising applications such as structural health monitoring in multiple engineering fields and shape capturing in the medical field. In this work, a physics-informed deep learning model, named SenseNet, was developed for shape sensing applications. Unlike existing neural network approaches for shape sensing, SenseNet incorporates knowledge of the physics of the problem, so its performance does not depend on the choice of training data. Compared with numerical physics-based approaches, SenseNet is a mesh-free method and is therefore convenient for problems with complex geometries. SenseNet consists of two parts: a neural network that predicts displacements at given input coordinates, and a physics part that computes the loss using a function that incorporates physics information. The prior knowledge encoded in the loss function includes the boundary conditions and physics relations such as the strain–displacement relation, the material constitutive equation, and the governing equation obtained from the law of balance of linear momentum. SenseNet was validated against finite-element solutions for cases with nonlinear displacement fields and stress fields using bending and fixed tension tests, respectively, in both two and three dimensions. A study of sensor density effects showed that the accuracy of the model improves with a larger amount of strain data. Because general three-dimensional governing equations are incorporated in the model, SenseNet is capable of reconstructing deformations in volumes with reasonable accuracy using only surface strain data. Hence, unlike most existing models, SenseNet is not specialized for particular element types and can be extended universally, even to thick-body applications.
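The composite loss described above — a data term matching measured sensor strains plus physics terms enforcing the strain–displacement relation, the constitutive equation, the momentum balance, and the boundary conditions — can be sketched as follows for 2D linear elasticity. This is an illustrative sketch, not the authors' implementation: the polynomial `displacement` stand-in for the neural network, the finite-difference derivatives, the plane-stress constitutive matrix, and the unit material constants are all assumptions made for the example.

```python
import numpy as np

# Assumed unit material constants for illustration (not from the paper).
E, NU = 1.0, 0.3
H = 1e-5  # finite-difference step

def displacement(xy, w):
    """Stand-in for the neural network: polynomial in the coordinates
    with trainable coefficients w of shape (2, 6), returning (u, v)."""
    x, y = xy
    feats = np.array([1.0, x, y, x * y, x * x, y * y])
    return w @ feats

def strain(xy, w):
    """Strain-displacement relation: (eps_xx, eps_yy, gamma_xy) via
    central finite differences of the displacement field."""
    x, y = xy
    du_dx = (displacement((x + H, y), w) - displacement((x - H, y), w)) / (2 * H)
    du_dy = (displacement((x, y + H), w) - displacement((x, y - H), w)) / (2 * H)
    return np.array([du_dx[0], du_dy[1], du_dx[1] + du_dy[0]])

def stress(xy, w):
    """Constitutive equation: plane-stress Hooke's law."""
    c = E / (1 - NU**2)
    C = c * np.array([[1, NU, 0], [NU, 1, 0], [0, 0, (1 - NU) / 2]])
    return C @ strain(xy, w)

def momentum_residual(xy, w):
    """Residual of balance of linear momentum, div(sigma) = 0 (no body force)."""
    x, y = xy
    dsx = (stress((x + H, y), w) - stress((x - H, y), w)) / (2 * H)
    dsy = (stress((x, y + H), w) - stress((x, y - H), w)) / (2 * H)
    rx = dsx[0] + dsy[2]  # d(sxx)/dx + d(sxy)/dy
    ry = dsx[2] + dsy[1]  # d(sxy)/dx + d(syy)/dy
    return np.array([rx, ry])

def sensenet_loss(w, sensor_pts, sensor_strains, collocation_pts, bc_pts):
    """Composite loss: sensor-strain mismatch + governing-equation
    residual at collocation points + clamped-boundary displacement."""
    data = np.mean([np.sum((strain(p, w) - s) ** 2)
                    for p, s in zip(sensor_pts, sensor_strains)])
    pde = np.mean([np.sum(momentum_residual(p, w) ** 2)
                   for p in collocation_pts])
    bc = np.mean([np.sum(displacement(p, w) ** 2) for p in bc_pts])
    return data + pde + bc
```

In an actual physics-informed model the polynomial stand-in would be a trainable network whose derivatives are obtained by automatic differentiation, and the loss would be minimized over the network weights by gradient descent; the mesh-free character comes from evaluating the loss at arbitrary sensor, collocation, and boundary points rather than over a discretized domain.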
