Title: Vision-based multi-agent cooperative target search
Authors: Hu, Jinwen
Xie, Lihua
Xu, Jun
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2012
Source: Hu, J., Xie, L., & Xu, J. (2012). Vision-based multi-agent cooperative target search. 2012 12th International Conference on Control Automation Robotics & Vision (ICARCV).
Abstract: This paper addresses vision-based cooperative search for multiple mobile ground targets by a group of unmanned aerial vehicles (UAVs) with limited sensing and communication capabilities. The airborne camera on each UAV has a limited field of view, and its target discriminability varies as a function of altitude. First, a general target detection probability model is built based on the physical imaging process of a camera. By dividing the whole surveillance region into cells, a probability map can be formed for each UAV indicating the probability of target existence within each cell. Then, we propose a distributed probability map updating model which includes the fusion of measurement information, information sharing among neighboring agents, and information decay and transfer due to environmental changes such as target movement. Furthermore, we formulate the target search problem by multiple agents as a cooperative coverage control problem by optimizing the collective coverage area and the detection performance. The proposed map updating model and the cooperative control scheme are distributed, i.e., each agent communicates only with its neighbors within its communication range. Finally, the effectiveness of the proposed algorithms is illustrated by simulation.
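The three map-updating steps named in the abstract (measurement fusion, neighbor sharing, information decay) can be sketched as a single update cycle. This is a hypothetical illustration, not the paper's exact equations: the function name, the weighting scheme, and parameters `p_d`, `p_f`, `decay`, and `w_self` are assumptions for illustration.

```python
import numpy as np

def update_probability_map(p, z, p_d, p_f, neighbor_maps, decay=0.02, w_self=0.6):
    """One hypothetical update cycle for a UAV's per-cell target-existence map.

    p             -- current probability of target existence per cell (2-D array)
    z             -- boolean detection outcome per cell for this step
    p_d           -- detection probability given a target (altitude-dependent
                     in the paper's camera model; a scalar here)
    p_f           -- false-alarm probability given no target
    neighbor_maps -- list of maps received from neighbors in communication range
    """
    # 1. Bayesian fusion of the camera measurement, applied cell-wise.
    num = np.where(z, p_d * p, (1.0 - p_d) * p)
    den = np.where(z, p_d * p + p_f * (1.0 - p),
                      (1.0 - p_d) * p + (1.0 - p_f) * (1.0 - p))
    p = num / den

    # 2. Consensus-style information sharing: weighted average with
    #    the maps received from neighboring agents (distributed step).
    if neighbor_maps:
        w_n = (1.0 - w_self) / len(neighbor_maps)
        p = w_self * p + w_n * sum(neighbor_maps)

    # 3. Information decay toward the uninformative prior 0.5, modeling
    #    environmental change such as target movement.
    p = (1.0 - decay) * p + decay * 0.5

    return np.clip(p, 0.0, 1.0)
```

Under this sketch, a detection in a cell raises that cell's probability while a non-detection lowers it, and unvisited beliefs drift back toward 0.5 over time.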
DOI: 10.1109/ICARCV.2012.6485276
Rights: © 2012 IEEE
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: EEE Conference Papers
EEE Journal Articles
SIMTech Journal Articles