Journal of Chongqing University of Technology (Natural Science) ›› 2023, Vol. 37 ›› Issue (7): 44-50.

• Vehicle engineering •

Research on lane depth perception of autonomous vehicles with dual attention mechanism

  

  • Online: 2023-08-15  Published: 2023-08-15

Abstract: To address the problems of heavy computation, weak feature fusion, occlusion, missing lane markings, and misrecognition in lane segmentation, this paper designs a lightweight convolutional neural network for semantic segmentation that introduces a channel attention mechanism and a row-and-column attention mechanism. First, a lightweight backbone, ResNet-18, rapidly downsamples the input images to generate multi-stage feature maps. The channel attention mechanism is then applied to the high-level feature maps to extract high-level semantic information, while the row-and-column attention mechanism is applied to the low-level feature maps to extract the spatial information of the lane lines. A feature fusion module (FFM) upsamples the high-level feature maps and fuses them with the low-level feature maps to improve lane-line segmentation accuracy. Finally, a three-layer fully connected network predicts the category of each segmented pixel, replacing the traditional clustering step: it separates background from lane lines and allows the whole network to be trained and to produce output end to end. The lightweight encoder-decoder model is trained and tested on the TuSimple lane-detection dataset and compared with previous models. The results show that the proposed deep convolutional network still identifies lane lines accurately and quickly under occlusion, blurring, shadow interference, and overexposure. Compared with existing lane-detection models, it improves both segmentation accuracy and detection speed, meeting the real-time detection requirements of autonomous driving.
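The two attention mechanisms in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the SE-style squeeze-and-excitation form of the channel attention, the reduction ratio `r`, the weight shapes, and the mean-activation pooling used for the row-and-column reweighting are all assumptions made for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis):
    # numerically stable softmax along one axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(feat, w1, w2):
    """SE-style channel attention on a (C, H, W) feature map.
    w1: (C//r, C) and w2: (C, C//r) form a bottleneck MLP (shapes assumed)."""
    squeeze = feat.mean(axis=(1, 2))                      # global average pool -> (C,)
    excite = sigmoid(w2 @ np.maximum(w1 @ squeeze, 0.0))  # ReLU bottleneck, sigmoid gate
    return feat * excite[:, None, None]                   # reweight each channel

def row_col_attention(feat):
    """Reweight the rows and columns of a (C, H, W) map by their mean
    activation, emphasizing rows/columns where lane evidence concentrates."""
    spatial = feat.mean(axis=0)               # (H, W) channel-averaged map
    row_w = softmax(spatial.mean(axis=1), 0)  # (H,) one weight per image row
    col_w = softmax(spatial.mean(axis=0), 0)  # (W,) one weight per image column
    return feat * row_w[None, :, None] * col_w[None, None, :]

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 6, 2
feat = rng.standard_normal((C, H, W))
w1 = 0.1 * rng.standard_normal((C // r, C))
w2 = 0.1 * rng.standard_normal((C, C // r))

high = channel_attention(feat, w1, w2)  # would act on high-level feature maps
low = row_col_attention(feat)           # would act on low-level feature maps
print(high.shape, low.shape)            # both operations preserve (C, H, W)
```

In the abstract's pipeline, the channel-attention output would be upsampled and fused (FFM) with the row-and-column output before the per-pixel classifier; both operations preserve the feature-map shape, which is what makes that fusion straightforward.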

CLC Number: 

  • U471.1+5