Abstract

In future Advanced Driver Assistance Systems (ADAS), smart monitoring of the vehicle environment is a key issue. Fisheye cameras have become popular because they provide a panoramic view with only a few low-cost sensors. However, their use in current ADAS remains limited, as most of the underlying image processing has been designed for perspective views only. In this article, we illustrate how the theoretical work done in omnidirectional vision over the past ten years can help tackle this issue. To do so, we have evaluated, in real conditions, a simple algorithm for road line detection based on the unified sphere model. We first highlight the interest of using fisheye cameras in a vehicle, then outline our method, present our experimental results on the detection of lines on a set of 180 images, and finally show how the 3D position of the lines can be recovered by triangulation.
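
Since the abstract only names the unified sphere model, the following minimal sketch illustrates the standard form of that model (a 3D point is lifted onto a unit sphere and then projected from a centre shifted along the optical axis). It is not the authors' implementation: the parameters XI and K are illustrative placeholders, not calibration values from the paper.

```python
import numpy as np

# Sketch of the unified sphere model for a fisheye/omnidirectional camera.
# XI and K are illustrative placeholders, not values from the paper.
XI = 0.8                                   # sphere-to-projection-centre offset
K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])      # generalised intrinsic matrix (assumed)

def project(X):
    """Project a 3D point X (camera frame) to pixel coordinates."""
    Xs = X / np.linalg.norm(X)             # 1. lift the point onto the unit sphere
    m = Xs / (Xs[2] + XI)                  # 2. project from the centre shifted by XI
    m[2] = 1.0
    return (K @ m)[:2]                     # 3. apply the intrinsics

def back_project(p):
    """Lift a pixel back onto the unit sphere (inverse of project)."""
    x, y, _ = np.linalg.inv(K) @ np.array([p[0], p[1], 1.0])
    r2 = x * x + y * y
    t = (XI + np.sqrt(1.0 + (1.0 - XI * XI) * r2)) / (r2 + 1.0)
    return np.array([t * x, t * y, t - XI])   # unit vector by construction

if __name__ == "__main__":
    X = np.array([2.0, -1.0, 5.0])
    p = project(X)
    print("pixel:", p)
    print("ray  :", back_project(p))       # same direction as X / ||X||
```

Once pixels are lifted back onto the sphere in this way, a 3D line projects to an arc of a great circle, which is what makes line detection tractable in fisheye images and allows the 3D position of a detected line to be recovered by triangulating the back-projected rays from two camera poses.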


Original document

The different versions of the original document can be found at:

http://dx.doi.org/10.1109/itsc.2013.6728376
https://trid.trb.org/view/1352548
https://ieeexplore.ieee.org/document/6728376
https://academic.microsoft.com/#/detail/2003370376
https://hal.archives-ouvertes.fr/hal-01710406/document
https://hal.archives-ouvertes.fr/hal-01710406/file/Boutteau13a.pdf

Document information

Published on 01/01/2013

Volume 2013, 2013
DOI: 10.1109/itsc.2013.6728376
Licence: CC BY-NC-SA
