Abstract

Load rating is gaining popularity as a method for assessing the structural performance of aging bridges and determining maintenance actions. Cost-effective condition-based strategies have been developed in previous studies to balance additional costs against structural safety. However, because these strategies do not employ the constant replacement thresholds usually stipulated in governmental guidelines, they may not be suitable in practice. Furthermore, those studies neglected the preferences of decision makers, which influence the choice of optimal plans. This paper proposes a decision-making framework incorporating risk attitudes and time preference for a cost-effective load rating strategy. This strategy adopts replacement thresholds from current guidelines and determines the time of the next load rating adaptively based on the observation results. It is formulated as a Markov decision process (MDP) compatible with discounted utility theory. Deep reinforcement learning (DRL) is employed to solve the MDP efficiently for a bridge system with a large state space. Special focus is given to hyperbolic discounting, a popular model of time preference. Its incompatibility with the standard MDP formulation is addressed by a DRL agent implemented with auxiliary tasks that simultaneously learns multiple Q functions. An existing multigirder bridge is used as an illustrative example. Results show that DRL can obtain cost-efficient load rating plans tailored to the preferences of decision makers.
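The idea of representing a hyperbolic discount with a family of exponential (Q-function-style) discounts can be illustrated numerically. The sketch below is not taken from this paper; it assumes the known identity that the hyperbolic discount 1/(1 + kt) equals a weighted mixture of exponential discounts γ^t with density w(γ) = (1/k)·γ^{1/k−1} over γ in (0, 1), and approximates that mixture with a midpoint sum over a grid of discount factors, analogous to learning one Q function per γ:

```python
import numpy as np

def hyperbolic_exact(t: float, k: float) -> float:
    """Hyperbolic discount factor 1 / (1 + k t)."""
    return 1.0 / (1.0 + k * t)

def hyperbolic_from_exponentials(t: float, k: float, n: int = 10000) -> float:
    """Approximate the hyperbolic discount as a weighted mixture of
    exponential discounts gamma**t, using a midpoint Riemann sum over
    gamma in (0, 1) with weighting density w(gamma) = (1/k) * gamma**(1/k - 1).
    """
    gammas = (np.arange(n) + 0.5) / n          # midpoints of n subintervals
    weights = (1.0 / k) * gammas ** (1.0 / k - 1.0)
    return float(np.sum(weights * gammas ** t) / n)

# The mixture closely matches the exact hyperbolic discount (for k <= 1,
# where the weighting density is bounded on (0, 1)):
k = 0.1
for t in [0.0, 1.0, 5.0, 20.0]:
    approx = hyperbolic_from_exponentials(t, k)
    exact = hyperbolic_exact(t, k)
    print(f"t={t:5.1f}  mixture={approx:.4f}  exact={exact:.4f}")
```

In a DRL agent, each γ on the grid would correspond to one auxiliary Q function trained with a standard exponential discount; the hyperbolically discounted value is then recovered as the weighted combination above.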