Title: Narrowband radar system for indoor doorway detection
Authors: Liu, Yunxiang
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Antennas, wave guides, microwaves, radar, radio; DRNTU::Engineering::Electrical and electronic engineering::Wireless communication systems
Issue Date: 2015
Source: Liu, Y. (2015). Narrowband radar system for indoor doorway detection. Master’s thesis, Nanyang Technological University, Singapore.
Abstract: Radar is an object-detection system that uses radio waves to determine the range and direction of objects. It can be employed on an Unmanned Ground Vehicle (UGV) for autonomous navigation. Although a UGV can sense its surroundings using optical sensors such as cameras or infrared sensors, these may not work well in some environments, such as an environment with low visibility or one containing obstacles made of transparent or polished materials. In such cases, radar sensors can be employed alongside the optical sensors to enhance the navigation capability of a UGV. To investigate this, a low-cost narrowband radar system based on USRP (Universal Software Radio Peripheral) hardware and C++/MATLAB programming is implemented to enable autonomous indoor UGV navigation. The main purpose is to investigate open-doorway detection in an indoor environment using a narrowband radar, which the UGV can use to detect an open doorway and navigate itself autonomously through it. We first present the design and calibration of a power-based narrowband radar system. It was found that, when the radar faces a wall, the relation between the received power and the distance to the wall follows the free-space path loss equation rather than the traditional radar equation. This is because the wall is too large to be treated as a point scatterer and is better approximated by an infinite reflecting surface. We also present the design and validation of radar ranging. The ranging algorithm was verified under three conditions: 1) over cables (to avoid multipath and self-interference problems), 2) in an anechoic chamber (to avoid multipath), and 3) in a real indoor environment.
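The contrast between the two power laws noted in the abstract can be illustrated with a short numerical sketch. All parameter values below (gains, wavelength, cross-section) are illustrative placeholders, not figures from the thesis: a point scatterer follows the radar equation’s 1/R⁴ falloff, while a large flat wall acts like a mirror, so the echo obeys free-space (Friis) loss over the round-trip distance 2R and falls off as 1/R².

```python
import numpy as np

def point_target_power(pt, g, wavelength, sigma, r):
    """Classical radar equation for a point scatterer:
    Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)."""
    return pt * g**2 * wavelength**2 * sigma / ((4 * np.pi) ** 3 * r**4)

def wall_mirror_power(pt, g, wavelength, r):
    """Large flat wall approximated as an infinite reflecting surface:
    the echo is free-space path loss over the round trip 2R,
    Pr = Pt * G^2 * (lambda / (4*pi * 2R))^2, i.e. ~1/R^2."""
    return pt * g**2 * (wavelength / (4 * np.pi * 2 * r)) ** 2

# Doubling the range costs 12 dB for a point target but only 6 dB for a wall,
# which is one way to check which model fits measured power-vs-range data.
for r in (1.0, 2.0, 4.0):
    pw = wall_mirror_power(1.0, 1.0, 0.125, r)          # placeholder wavelength
    pp = point_target_power(1.0, 1.0, 0.125, 1.0, r)
    print(f"R = {r:.0f} m: wall {10*np.log10(pw):7.1f} dBW, "
          f"point {10*np.log10(pp):7.1f} dBW")
```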
The test results show that the ranging error is approximately 50 cm and the maximum detection range is up to 6 meters. To detect an open doorway with a UGV, a “2-step” doorway detection procedure is used in this thesis. In the first step (“Step 1”), the nearest wall is detected using a method called Shortest Range Determination (SRD). Then, in the second step (“Step 2”), an open doorway is detected by using the radar and a power-based detection algorithm to scan the space parallel to the nearest wall found in the first step. This detection algorithm is based on total variation de-noising: it detects an open doorway by finding the cross-range positions of the leftmost and rightmost sides of the opening. The experimental results give a detection error of around 30 cm. Furthermore, when the robot scans the wall in “Step 2”, the range between the wall and the radar can be measured using the radar ranging process. By averaging all the reliable ranging values, the range between the robot and the wall is estimated to provide extra information about the doorway position. The doorway position coordinate is then built up by combining this information, namely the cross-range and range of the doorway positions. Our approach to radar-assisted indoor positioning of a UGV is thus validated. Future research on this topic may include the exploration of more complex scenarios (e.g. detection of signs of life amid rubble for UGV-based search-and-rescue missions), as well as enhancing the capabilities of the USRP-based system by fusing sensing and communications.
URI: http://hdl.handle.net/10356/65352
Fulltext access: restricted
Fulltext: With Fulltext
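The “Step 2” doorway detection described in the abstract rests on total variation de-noising of the received-power profile scanned along the wall. The thesis’s exact implementation is not given here, so the following is only a minimal sketch: projected gradient iteration on the dual of the 1-D TV (ROF) problem, followed by a simple threshold that picks the leftmost and rightmost low-power samples as the doorway edges. The signal model, regularization weight `lam`, and threshold are all illustrative assumptions.

```python
import numpy as np

def tv_denoise_1d(y, lam=1.0, iters=500):
    """1-D total variation de-noising: min_x 0.5*||x - y||^2 + lam*TV(x),
    solved by projected gradient on the dual variable p (Chambolle-style)."""
    p = np.zeros(y.size - 1)            # one dual variable per sample gap
    for _ in range(iters):
        x = y.copy()                    # primal estimate from current dual
        x[:-1] += p                     # each p[i] shrinks the jump between
        x[1:] -= p                      # samples i and i+1 by up to 2*lam
        p = np.clip(p + 0.25 * np.diff(x), -lam, lam)   # step 0.25 <= 1/||DD^T||
    x = y.copy()
    x[:-1] += p
    x[1:] -= p
    return x

def find_doorway(power, threshold):
    """Indices of the leftmost and rightmost samples below the power threshold."""
    low = np.flatnonzero(power < threshold)
    return (low[0], low[-1]) if low.size else None

# Illustrative cross-range scan: strong echo from the wall, weak echo through
# the opening (assumed doorway at samples 40-59; values are made up).
rng = np.random.default_rng(0)
profile = np.full(100, 1.0)
profile[40:60] = 0.2
noisy = profile + 0.1 * rng.standard_normal(100)
denoised = tv_denoise_1d(noisy, lam=1.0)
edges = find_doorway(denoised, threshold=0.6)
print(edges)
```

The de-noising step matters because a single noisy dip in the raw power profile would otherwise be mistaken for a doorway edge; TV de-noising flattens isolated noise spikes while preserving the sharp power drop at the sides of the opening.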
Appears in Collections: EEE Theses
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.