Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/142885
Title: Deep-reinforcement-learning-based energy-efficient resource management for social and cognitive Internet of Things
Authors: Yang, Helin; Zhong, Wen-De; Chen, Chen; Alphones, Arokiaswami; Xie, Xianzhong
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Source: Yang, H., Zhong, W.-D., Chen, C., Alphones, A., & Xie, X. (2020). Deep-reinforcement-learning-based energy-efficient resource management for social and cognitive Internet of Things. IEEE Internet of Things Journal, 7(6), 5677-5689. doi:10.1109/JIOT.2020.2980586
Project: SMA-RP6
Journal: IEEE Internet of Things Journal
Abstract: The Internet of Things (IoT) has attracted much interest owing to its wide range of applications, such as smart cities, manufacturing, transportation, and healthcare. A social and cognitive IoT can exploit social-networking characteristics to optimize network performance. Because IoT devices have different quality-of-service (QoS) requirements, ranging from ultra-reliable and low-latency communications (URLLC) to a minimum data rate, this paper presents a QoS-driven, social-aware, enhanced device-to-device (D2D) communication network model for the social and cognitive IoT that utilizes social-orientation information. The optimization problem is modeled as a multi-agent reinforcement learning formulation, and a novel coordinated multi-agent deep-reinforcement-learning-based resource management approach is proposed to optimize the joint radio-block assignment and transmission power control strategy. Prioritized experience replay (PER) and coordinated learning mechanisms enable communication links to cooperate in a distributed manner, which improves network performance and access success probability. Simulation results corroborate the superiority of the presented resource management approach: it outperforms existing approaches in meeting both energy-efficiency and QoS requirements.
URI: https://hdl.handle.net/10356/142885
ISSN: 2327-4662
DOI: 10.1109/JIOT.2020.2980586
Schools: School of Electrical and Electronic Engineering
Rights: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/JIOT.2020.2980586
Fulltext Permission: open
Fulltext Availability: With Fulltext
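The abstract above mentions prioritized experience replay (PER) as one of the learning mechanisms. As a rough illustration of the general PER idea (proportional prioritization of transitions by TD error), the sketch below shows a minimal replay buffer; this is a generic, hypothetical example and not the authors' implementation, and all class and method names here are made up for illustration.

```python
import numpy as np

class PrioritizedReplayBuffer:
    """Minimal sketch of a proportional prioritized experience replay buffer."""

    def __init__(self, capacity=10000, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha        # how strongly priorities skew sampling
        self.buffer = []          # stored transitions
        self.priorities = []      # one priority per transition
        self.pos = 0              # ring-buffer write index

    def add(self, transition):
        # New transitions get the current maximum priority so they are
        # sampled at least once before their TD error is known.
        max_p = max(self.priorities, default=1.0)
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(max_p)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = max_p
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        # Sample indices with probability proportional to priority^alpha.
        probs = np.array(self.priorities) ** self.alpha
        probs /= probs.sum()
        idx = np.random.choice(len(self.buffer), batch_size, p=probs)
        # Importance-sampling weights correct the non-uniform sampling bias.
        weights = (len(self.buffer) * probs[idx]) ** (-beta)
        weights /= weights.max()
        return [self.buffer[i] for i in idx], idx, weights

    def update_priorities(self, idx, td_errors, eps=1e-6):
        # After a learning step, refresh priorities from the new TD errors.
        for i, err in zip(idx, td_errors):
            self.priorities[i] = abs(err) + eps
```

In a multi-agent setting such as the one described in the abstract, each agent (communication link) would typically maintain or share such a buffer while its deep Q-network is trained on the prioritized samples.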
Appears in Collections: EEE Journal Articles
Files in This Item:
File | Description | Size | Format
---|---|---|---
Deep reinforcement learning based energy-efficient resource management for social and cognitive internet of things.pdf | | 3.39 MB | Adobe PDF
Citation and usage metrics recorded by the repository:

- Scopus citations: 58 (updated Mar 22, 2025)
- Web of Science citations: 26 (updated Oct 28, 2023)
- Page views: 290 (updated Mar 27, 2025)
- Downloads: 363 (updated Mar 27, 2025)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.