Abstract
Numerical methods such as the finite element method have flourished in the past decades for modeling solid mechanics problems by solving the governing partial differential equations (PDEs). A salient aspect that distinguishes these numerical methods is how they approximate the physical fields of interest. Physics-informed deep learning (PIDL) is a novel approach developed in recent years for modeling PDE solutions, and it shows promise for solving computational mechanics problems without any labeled data (i.e., when measurement data are unavailable). The philosophy behind it is to approximate the quantity of interest (e.g., the PDE solution variables) by a deep neural network (DNN) and embed the physical law to regularize the network. To this end, training the network is equivalent to minimizing a well-designed loss function that contains the residuals of the governing PDEs as well as the initial/boundary conditions (I/BCs). In this paper, we present a physics-informed neural network (PINN) with mixed-variable output to model elastodynamics problems without resorting to labeled data, in which the I/BCs are imposed in a hard manner. In particular, both the displacement and stress components are taken as the DNN output, inspired by hybrid finite element analysis, which largely improves the accuracy and trainability of the network. Since the conventional PINN framework imposes all the residual loss components in a soft manner via penalty (Lagrange multiplier) weights, the weakly imposed I/BCs may not be well satisfied, especially when complex I/BCs are present. To overcome this issue, a composite scheme of DNNs is established based on multiple single DNNs such that the I/BCs can be satisfied exactly by construction. The proposed PINN framework is demonstrated on several numerical elasticity examples with different I/BCs, including both static and dynamic problems as well as wave propagation in truncated domains.
Results show the promise of PINN in the context of computational mechanics applications.
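To make the two key ingredients of the abstract concrete, the following minimal sketch illustrates (i) hard imposition of boundary conditions via a composite ansatz and (ii) a PDE-residual loss, on a 1-D static bar toy problem u''(x) + f(x) = 0 with u(0) = u(1) = 0. All names, the tiny network architecture, and the use of finite differences (instead of automatic differentiation) are illustrative assumptions for brevity, not the paper's actual implementation.

```python
import numpy as np

# Illustrative toy problem (assumption: not the paper's setup):
# 1-D static bar equation u''(x) + f(x) = 0 on (0, 1), u(0) = u(1) = 0.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), np.zeros((8, 1))  # hypothetical tiny MLP weights
W2, b2 = rng.normal(size=(1, 8)), np.zeros((1, 1))

def mlp(x):
    """Tiny fully connected network N(x); stands in for the DNN approximator."""
    h = np.tanh(W1 @ x.reshape(1, -1) + b1)
    return (W2 @ h + b2).ravel()

def u(x):
    """Composite ansatz u(x) = x*(1 - x)*N(x): the distance-like factor
    x*(1 - x) vanishes at x = 0 and x = 1, so the boundary conditions hold
    exactly for any network weights -- a "hard" imposition of BCs."""
    return x * (1.0 - x) * mlp(x)

def pde_residual_loss(x, f, h=1e-4):
    """Mean squared PDE residual u'' + f. Central finite differences keep the
    sketch short; a real PINN would use automatic differentiation instead."""
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    return np.mean((u_xx + f(x)) ** 2)

# Collocation points in the interior; training would minimize this loss
# over the network weights (omitted here).
x = np.linspace(0.01, 0.99, 50)
loss = pde_residual_loss(x, f=lambda x: np.sin(np.pi * x))
```

Because the boundary conditions are built into the ansatz rather than penalized, the loss contains only the PDE residual term; this is the essence of the hard-imposition scheme that the composite DNNs generalize to complex I/BCs.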