Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/85064

Title: Fast vanishing-point detection in unstructured environments
Authors: Moghadam, Peyman; Starzyk, Janusz A.; Wijerupage Sardha Wijesoma
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2011
Source: Moghadam, P., Starzyk, J. A., & Wijesoma, W. S. (2011). Fast vanishing-point detection in unstructured environments. IEEE Transactions on Image Processing, 21(1), 425-430.
Series/Report no.: IEEE Transactions on Image Processing
Abstract: Vision-based road detection in unstructured environments is a challenging problem, as there are hardly any discernible and invariant features that can characterize the road or its boundaries in such environments. However, a salient and consistent feature of most roads or tracks, regardless of the type of environment, is that their edges, boundaries, and even the ruts and tire tracks left by previous vehicles appear to converge into a single point known as the vanishing point. Estimating this vanishing point therefore plays a pivotal role in determining the direction of the road. In this paper, we propose a novel methodology based on image texture analysis for the fast estimation of the vanishing point on challenging and unstructured roads. The key attributes of the methodology are an optimal local dominant orientation method that uses the joint activities of only four Gabor filters to precisely estimate the local dominant orientation at each pixel location in the image plane, a weighting of each pixel based on its dominant orientation, and an adaptive distance-based voting scheme for the estimation of the vanishing point. A series of quantitative and qualitative analyses are presented using natural data sets from the Defense Advanced Research Projects Agency (DARPA) Grand Challenge projects to demonstrate the effectiveness and accuracy of the proposed methodology.
URI: https://hdl.handle.net/10356/85064
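To make the abstract's first step concrete, the sketch below estimates a per-pixel dominant orientation from the responses of four oriented Gabor filters. This is only a rough, self-contained illustration of the general texture-orientation idea, not the paper's method: the kernel size, sigma, and wavelength are placeholder values, and a plain argmax over the four response magnitudes stands in for the paper's "optimal local dominant orientation" combination of the joint filter activities.

```python
import numpy as np

def gabor_kernel(theta, ksize=15, sigma=3.0, lam=6.0):
    """Real (even) Gabor kernel oriented at angle theta (radians).
    Parameter values are illustrative, not taken from the paper."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * xr / lam)
    return g - g.mean()  # zero-mean so uniform regions give no response

def dominant_orientation(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Per-pixel dominant orientation as the angle of the strongest of four
    Gabor responses, plus the response strength (usable as a vote weight)."""
    H, W = img.shape
    responses = []
    for t in thetas:
        k = gabor_kernel(t)
        kh, kw = k.shape
        # FFT-based linear convolution with 'same'-size crop, pure NumPy.
        pad = np.zeros((H + kh - 1, W + kw - 1))
        pad[:H, :W] = img
        kpad = np.zeros_like(pad)
        kpad[:kh, :kw] = k
        full = np.real(np.fft.ifft2(np.fft.fft2(pad) * np.fft.fft2(kpad)))
        same = full[kh // 2:kh // 2 + H, kw // 2:kw // 2 + W]
        responses.append(np.abs(same))
    stack = np.stack(responses)            # shape (4, H, W)
    idx = np.argmax(stack, axis=0)         # strongest filter per pixel
    strength = np.max(stack, axis=0)       # confidence of that orientation
    return np.take(np.asarray(thetas), idx), strength
```

On a synthetic image of vertical stripes, interior pixels come out with dominant orientation 0 (the filter whose carrier varies along x), which matches the intuition that strongly oriented texture picks the aligned filter; in a full pipeline, each pixel would then cast a distance-weighted vote along its orientation toward candidate vanishing points.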
ISSN: 1057-7149
DOI: http://dx.doi.org/10.1109/TIP.2011.2162422
Rights: © 2011 IEEE
Fulltext: No fulltext available
Appears in Collections: EEE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.