A Fast and Accurate Lane Detection Method Based on Row Anchor and Transformer Structure.

Sensors (Basel)

School of Artificial Intelligence, Shenyang University of Technology, Shenyang 110870, China.

Published: March 2024

Lane detection plays a pivotal role in the successful implementation of Advanced Driver Assistance Systems (ADASs): it detects the road's lane markings and determines the vehicle's position, thereby informing subsequent decision making. However, current deep-learning-based lane detection methods face two challenges. First, on-board hardware limitations demand an exceptionally fast prediction speed. Second, detection in complex scenarios still requires improvement. This paper addresses these issues by enhancing the row-anchor-based lane detection method. A Transformer encoder-decoder structure is leveraged for row classification, strengthening the model's capability to extract global features and detect lane lines in intricate environments. A Feature-aligned Pyramid Network (FaPN) structure serves as an auxiliary branch, complemented by a novel structural loss combined with an expectation loss, which further refines the method's accuracy. The experimental results demonstrate our method's commendable accuracy and real-time performance, achieving a rapid prediction speed of 129 FPS (the single prediction time of the model on an RTX 3080 is 15.72 ms) and a 96.16% accuracy on the TuSimple dataset, a 3.32% improvement over the baseline method.
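The abstract does not spell out how the expectation loss operates in the row-anchor formulation. The sketch below is a minimal illustration of one common way such a loss is written: per-row classification logits over grid cells are converted into a differentiable expected cell index, which is then regressed toward the ground-truth cell. All tensor names, shapes, and the L1 formulation are assumptions for illustration, not the paper's actual implementation (the paper's full loss also includes a structural term and a background "no lane" class, which are omitted here for brevity).

```python
import torch
import torch.nn.functional as F

def expected_lane_location(logits):
    """
    Decode continuous lane positions from row-anchor classification logits.

    logits: (batch, num_grid_cells, num_row_anchors, num_lanes)
    Returns sum_k k * softmax(logits)_k per row anchor and lane, i.e. the
    expected grid-cell index, which is differentiable and sub-cell accurate
    compared with a hard argmax.
    """
    num_cells = logits.shape[1]
    probs = F.softmax(logits, dim=1)  # per-row distribution over grid cells
    idx = torch.arange(num_cells, dtype=probs.dtype, device=probs.device)
    return torch.einsum('bkrl,k->brl', probs, idx)  # expectation over cell indices

def expectation_loss(logits, target_cells, valid_mask):
    """
    L1 distance between the expected cell index and the ground-truth cell,
    applied only at row anchors where a lane point exists (valid_mask == 1).
    """
    expected = expected_lane_location(logits)
    diff = (expected - target_cells.float()).abs() * valid_mask
    return diff.sum() / valid_mask.sum().clamp(min=1)

# Example (hypothetical sizes): 2 images, 100 grid cells, 56 row anchors, 4 lanes
logits = torch.randn(2, 100, 56, 4)
targets = torch.randint(0, 100, (2, 56, 4))
mask = torch.ones(2, 56, 4)
loss = expectation_loss(logits, targets, mask)
```

Decoding via the expectation rather than an argmax keeps the location head differentiable and yields sub-cell precision, which is one reason row-anchor methods can stay fast while remaining accurate.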

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11014103 (PMC)
http://dx.doi.org/10.3390/s24072116 (DOI)
