Abstract

In extreme weather conditions, a vehicle that can promptly adjust its speed and following distance according to road-surface information can effectively reduce traffic accidents. Road type and the road adhesion coefficient are the main factors characterizing road condition. Traditional methods of identifying the road adhesion coefficient suffer from high cost and low reliability, while road-surface recognition based on machine vision, though a current research hot spot, still struggles with low accuracy and poor robustness. Building on a semantic segmentation model that is widely used in machine vision, a new pavement-type recognition technique is proposed by improving the model's output network. Following the existing literature, road surfaces are divided into nine categories: wet asphalt, dry asphalt, wet concrete, dry concrete, wet soil, dry soil, gravel, compacted snow, and ice. A standard dataset is built by collecting road images through photography, downloading, and other means, and the preprocessed dataset is used to train the improved semantic segmentation network. Through repeated tests, the training parameters that achieve the desired effect are selected and the model parameters are frozen; the frozen model is then used to predict road images acquired by a camera, and the category of the current driving surface is obtained from the prediction results. Tests on a large number of road images show that the average classification accuracy over the nine road types is about 94%, which effectively improves the accuracy and robustness of road-type identification.
At the same time, the prediction time for a single frame on the specific test platform is about 0.0286 s, which meets real-time requirements.
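The abstract does not include code, but the final step it describes, deriving a single road category from a semantic segmentation model's per-pixel predictions, can be sketched as a simple majority vote over the predicted class map. The following is a minimal illustration, not the paper's actual method: the class list, the function name `dominant_road_type`, and the toy segmentation map are all assumptions for the example.

```python
from collections import Counter

# Hypothetical label order for the nine road types named in the abstract.
ROAD_CLASSES = [
    "wet asphalt", "dry asphalt", "wet concrete", "dry concrete",
    "wet soil", "dry soil", "gravel", "compacted snow", "ice",
]

def dominant_road_type(seg_map):
    """Return the road type covering the most pixels.

    seg_map: 2-D list (H rows of W columns) of per-pixel class
    indices in the range [0, len(ROAD_CLASSES)), as a segmentation
    network might output after an argmax over its class scores.
    """
    counts = Counter(idx for row in seg_map for idx in row)
    return ROAD_CLASSES[counts.most_common(1)[0][0]]

# Toy 4x4 "prediction" dominated by class 1 (dry asphalt),
# with a few pixels of class 6 (gravel) and class 0 (wet asphalt).
seg = [
    [1, 1, 1, 0],
    [1, 1, 1, 0],
    [1, 1, 6, 6],
    [1, 1, 1, 1],
]
print(dominant_road_type(seg))  # → dry asphalt
```

In practice the vote would run over the road region of a full-resolution camera frame; a majority vote is one straightforward way to collapse pixel-wise predictions into the per-image road category the paper reports accuracy on.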
