|Title:||Target detection and estimation using a stereo vision camera for autonomous navigation|
|Authors:||Chow, Song Qian|
|Keywords:||DRNTU::Engineering|
|Issue Date:||2009|
|Abstract:||Stereo vision cameras are increasingly popular in commercial markets. They are relatively inexpensive and can perform an array of functions, and many different types are currently available. Such cameras have been used to aid the navigation of autonomous vehicles such as cars and kayaks. The function this project focuses on is detecting possible targets/objects in a controlled environment and providing rough distance and direction estimates from the autonomous vehicle to the intended object. This function is exceptionally useful in the real-time control of autonomous vehicles. The project's experiments capture photographic images using the Point Grey Bumblebee® 2 stereo vision camera. After the images are captured, software provided by Point Grey Research, namely the FlyCapture® SDK and the Triclops® SDK, is used to process the raw images. Edge detection techniques are then applied to these images to detect objects, and disparity maps and point clouds are obtained. From this processed information, the depth, or distance, of the objects from the camera can be estimated. In the next stage, the Algorithm Processing Unit (APU) of the autonomous vehicle can work out a collision-avoidance solution so that it navigates around these objects smoothly. These processes form only the initial stages of vehicle navigation; in later stages, other high-level functions such as target engagement will be required to fulfil different roles. The project concludes with a summary of existing techniques for edge detection of objects and generation of disparity maps, together with recommendations for future development in this area of study.|
|URI:||http://hdl.handle.net/10356/17910|
|Rights:||Nanyang Technological University|
|Fulltext Permission:||restricted|
|Fulltext Availability:||With Fulltext|
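The abstract's depth-estimation step rests on the standard stereo relation Z = f·B/d: depth is focal length times baseline divided by disparity. As a minimal sketch of that principle, assuming illustrative calibration values (not the Bumblebee® 2's actual focal length or baseline):

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.

    disparity_px: pixel offset of a feature between left and right images
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centres in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: 800 px focal length, 0.12 m baseline.
# A 16 px disparity then corresponds to a depth of 6.0 m.
z = depth_from_disparity(16.0, 800.0, 0.12)
```

In a real pipeline the disparity values would come from the Triclops® SDK's stereo matching rather than a single hand-picked offset; the formula above is what turns each disparity-map pixel into a depth estimate.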
|Appears in Collections:||EEE Student Reports (FYP/IA/PA/PI)|
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.