Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
Multi-Stage Hough Space Calculation for Lane Mark Detection via IMU and Vision Data Fusion
Version 1: Received: 13 April 2019 / Approved: 15 April 2019 / Online: 15 April 2019 (13:13:19 CEST)
A peer-reviewed article of this Preprint also exists.
Sun, Y.; Li, J.; Sun, Z. Multi-Stage Hough Space Calculation for Lane Markings Detection via IMU and Vision Fusion. Sensors 2019, 19, 2305.
Abstract
It is challenging to achieve robust lane detection from a single frame in complicated scenarios. To detect lane markings more reliably using sequential frames, this paper proposes a novel approach that fuses vision and Inertial Measurement Unit (IMU) data. The Hough space serves as the representation in which lane markings are stored, and it is computed in three stages. First, a basic Hough space is obtained via the Hough Transform, and primary line segments are extracted from it. Second, to estimate the probability that each line segment belongs to a lane marking, a CNN-based classifier transforms the basic Hough space into a probabilistic Hough space using the network outputs. However, a probabilistic Hough space based on a single frame is easily disturbed. Third, a filtering process smooths the probabilistic Hough space using sequential information; pose information provided by the IMU is applied to align Hough spaces extracted at different times. The final Hough space is used to eliminate line segments with low probability and to output those with high confidence as the result. Experiments demonstrate that the proposed approach achieves good performance.
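The first and third stages of the pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function names (`hough_space`, `smooth_hough`) are hypothetical, the CNN-based probability mapping of stage two is omitted, and the IMU-based alignment is simplified to a shift along the theta axis driven by an assumed yaw change between frames.

```python
import numpy as np

def hough_space(edges, n_theta=180, n_rho=200):
    """Stage 1 (sketch): accumulate a basic Hough space over a binary edge map.
    Each edge pixel votes for all (rho, theta) lines passing through it."""
    h, w = edges.shape
    diag = np.hypot(h, w)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-diag, diag, n_rho)
    acc = np.zeros((n_rho, n_theta))
    ys, xs = np.nonzero(edges)
    for theta_idx, theta in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta) for every edge pixel at once
        r = xs * np.cos(theta) + ys * np.sin(theta)
        idx = np.clip(np.searchsorted(rhos, r), 0, n_rho - 1)
        np.add.at(acc[:, theta_idx], idx, 1.0)
    return acc

def smooth_hough(prev_prob, curr_prob, yaw_delta_bins, alpha=0.7):
    """Stage 3 (sketch): align the previous probabilistic Hough space to the
    current frame and blend. The IMU yaw change is approximated here as a
    circular shift of `yaw_delta_bins` along the theta axis -- an assumption,
    not the paper's alignment model."""
    aligned = np.roll(prev_prob, yaw_delta_bins, axis=1)
    return alpha * aligned + (1.0 - alpha) * curr_prob
```

For a synthetic vertical edge at x = 5, the accumulator peaks at theta = 0 with one vote per edge pixel, and blending two identical probabilistic spaces with zero yaw change leaves the space unchanged, which is a quick sanity check on both stages.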
Keywords
IMU; vision; classification networks; Hough transform; lane markings detection
Subject
Engineering, Automotive Engineering
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.