Abstract

Several industrial, home, and automotive applications need 3D or at least range data of the observed environment to operate. Such applications are, e.g., driver assistance systems, home care systems, and 3D sensing and measurement for industrial production. State-of-the-art range sensors are laser range finders or laser scanners (LIDAR, light detection and ranging), time-of-flight (TOF) cameras, and ultrasonic sound sensors. All of them are embedded, which means that the sensors operate independently and have an integrated processing unit. This is advantageous because the processing power available in the mentioned applications is limited and range sensing is computationally intensive. Further benefits of embedded systems are low power consumption and a small form factor. Furthermore, embedded systems are fully customizable by the developer and can be adapted optimally to the specific application. A promising alternative to the mentioned sensors is stereo vision. Classic stereo vision uses a stereo camera setup built of two cameras (the stereo camera head), mounted in parallel and separated by the baseline. It captures a synchronized stereo pair consisting of the left camera's image and the right camera's image. The main challenge of stereo vision is the reconstruction of 3D information of a scene captured from two different points of view. Each visible scene point is projected onto the image planes of both cameras. Pixels that represent the same scene point on the different image planes correspond to each other. These correspondences can then be used to determine the three-dimensional position of the projected scene point in a defined coordinate system. In more detail, the horizontal displacement, called the disparity, is inversely proportional to the scene point's depth. With this information and the cameras' intrinsic parameters (principal point and focal length), the 3D position can be reconstructed. Fig. 1 shows a typical stereo camera setup. The projections of scene point P are p_l and p_r. Once the correspondences are found, the disparity is calculated with

d = x_l - x_r,

where x_l and x_r are the horizontal image coordinates of p_l and p_r.
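To make the relationship between disparity, depth, and the intrinsic parameters concrete, the short sketch below (not part of the original paper) reconstructs the 3D position of a single scene point, assuming a rectified, parallel stereo setup. The function name, the pixel coordinates, and the values of the focal length f, baseline b, and principal point (cx, cy) are hypothetical examples chosen for illustration.

```python
# Minimal sketch (not from the paper): 3D reconstruction from one stereo
# correspondence in a rectified, parallel stereo setup.
# Assumptions: focal length f in pixels, baseline b in metres, principal
# point (cx, cy) in pixels; all numeric values are made-up examples.

def reconstruct_point(x_left, x_right, y, f, b, cx, cy):
    """Return the 3D position (X, Y, Z) of a scene point in the left
    camera's coordinate system from a single pixel correspondence."""
    d = x_left - x_right          # disparity: horizontal displacement in pixels
    if d <= 0:
        raise ValueError("disparity must be positive for a valid correspondence")
    z = f * b / d                 # depth is inversely proportional to disparity
    x = (x_left - cx) * z / f     # back-projection with the pinhole camera model
    y3d = (y - cy) * z / f
    return x, y3d, z

if __name__ == "__main__":
    # Hypothetical camera parameters and one correspondence (p_l, p_r).
    f, b = 800.0, 0.12            # focal length [px], baseline [m]
    cx, cy = 320.0, 240.0         # principal point [px]
    X, Y, Z = reconstruct_point(x_left=350.0, x_right=330.0, y=200.0,
                                f=f, b=b, cx=cx, cy=cy)
    print(f"disparity = {350.0 - 330.0:.1f} px, depth Z = {Z:.2f} m, "
          f"3D point = ({X:.2f}, {Y:.2f}, {Z:.2f}) m")
```

In a real system the stereo pair is first calibrated and rectified so that corresponding pixels lie on the same image row; finding the correspondences themselves is the computationally intensive part addressed by stereo matching.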
Original document

The different versions of the original document can be found in:

http://dx.doi.org/10.5772/12941
https://www.researchgate.net/profile/Florian_Eibensteiner/publication/221909976_Address-Event_Based_Stereo_Vision_with_Bio-Inspired_Silicon_Retina_Imagers/links/0c960521d0487719a1000000.pdf?inViewer=true&disableCoverPage=true&origin=publication_detail
https://academic.microsoft.com/#/detail/1570365206