Title: Configuration-adaptive wireless visual sensing system with deep reinforcement learning
Authors: Zhou, Siyuan
Le, Duc Van
Tan, Rui
Yang, Joy Qiping
Ho, Daren
Keywords: Engineering::Computer science and engineering
Issue Date: 2023
Source: Zhou, S., Le, D. V., Tan, R., Yang, J. Q. & Ho, D. (2023). Configuration-adaptive wireless visual sensing system with deep reinforcement learning. IEEE Transactions On Mobile Computing, 22(9), 5078-5091.
Project: IAF-ICP 
Journal: IEEE Transactions on Mobile Computing 
Abstract: Visual sensing has been increasingly employed in various industrial applications, including manufacturing process monitoring and worker safety monitoring. This paper presents the design and implementation of a wireless camera system, namely EFCam, which uses low-power wireless communications and edge-fog computing to achieve cordless and energy-efficient visual sensing. The camera performs image pre-processing and offloads the data to a resourceful fog node for advanced processing using deep models. EFCam admits dynamic configurations of several parameters that form a configuration space. It aims to adapt the configuration to maintain the desired visual sensing performance of the deep model at the fog node with minimum energy consumption of the camera in image capture, pre-processing, and data communications, under dynamic variations of the monitored process, the application requirements, and wireless channel conditions. However, the adaptation is challenging due to the complex relationships among the involved factors. To address this complexity, we apply deep reinforcement learning to learn the optimal adaptation policy when a fog node supports one or more wireless cameras. Extensive evaluation based on trace-driven simulations and experiments shows that, for a real industrial product object tracking application, EFCam complies with the accuracy and latency requirements at lower energy consumption than five baseline approaches incorporating hysteresis-based and event-triggered adaptation.
ISSN: 1536-1233
DOI: 10.1109/TMC.2022.3175182
Schools: School of Computer Science and Engineering 
Research Centres: HP-NTU Digital Manufacturing Corporate Lab
Rights: © 2022 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.