Reinforcement learning based QoS-provisioning over energy-harvesting 5G wireless Ad-Hoc networks

Xi Zhang, Jingqing Wang, H. Vincent Poor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

To support delay-bounded multimedia services in 5G mobile wireless networks, the statistical quality-of-service (QoS) technique has been developed to guarantee statistically delay-bounded video transmissions over time-varying wireless channels. Meanwhile, energy harvesting (EH), one of the promising candidate techniques for 5G, is designed to solve the energy-supply problem, but it also introduces new challenges: the harvested energy is stochastic, which complicates supporting heterogeneous statistical delay-bounded QoS provisioning. Moreover, because the distributions of the energy and data arrival processes are unknown, it is challenging to design the optimal EH and resource allocation policies under heterogeneous statistical delay-bounded QoS constraints. Toward this end, reinforcement learning algorithms have been designed to find the optimal EH and resource allocation policies by allowing mobile users to learn from different network states and historical behaviors until the optimal response set is reached. To overcome the aforementioned problems, in this paper we propose a learning-based algorithm for designing the optimal EH and resource allocation policies that satisfy heterogeneous statistical delay-bounded QoS constraints over EH-based 5G mobile wireless networks. In particular, we establish the EH-based system model. Under heterogeneous statistical delay-bounded QoS requirements, we formulate the effective-capacity optimization problem over EH-based 5G mobile wireless networks. We then apply the learning-based EH algorithm to derive the optimal resource allocation policy. We also conduct a set of simulations that validate and evaluate the system performance, showing that our proposed learning-based EH scheme outperforms existing schemes under heterogeneous statistical delay-bounded QoS constraints over 5G mobile wireless networks.
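To illustrate the flavor of learning-based EH resource allocation described above, the following is a minimal, self-contained sketch, not the paper's algorithm: tabular Q-learning over a hypothetical toy model with a discretized battery, a two-state channel, stochastic energy arrivals, and a Shannon-style throughput reward. All model parameters (B_MAX, GAINS, the arrival distributions) are illustrative assumptions, not values from the paper.

```python
import math
import random

# Hypothetical toy model (illustrative only, not from the paper):
B_MAX = 4                      # battery capacity in discrete energy units
ACTIONS = [0, 1, 2]            # energy units spent on transmission per slot
GAINS = {0: 0.5, 1: 2.0}       # channel power gain for bad (0) / good (1) state
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

Q = {}                         # Q[(battery, channel)] -> {action: value}

def q(s, a):
    """Return Q(s, a), initializing unseen states to zero."""
    return Q.setdefault(s, {x: 0.0 for x in ACTIONS})[a]

def step(s, a):
    """Simulate one slot: spend energy, collect reward, harvest, fade."""
    b, h = s
    spend = min(a, b)                          # cannot spend more than stored
    reward = math.log(1.0 + spend * GAINS[h])  # Shannon-style throughput
    harvest = random.randint(0, 2)             # stochastic energy arrivals
    b_next = min(B_MAX, b - spend + harvest)   # battery evolution
    h_next = random.randint(0, 1)              # i.i.d. two-state channel
    return (b_next, h_next), reward

def train(steps=20000):
    """Epsilon-greedy Q-learning over the toy EH system."""
    s = (B_MAX // 2, 0)
    for _ in range(steps):
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q(s, x))
        s_next, r = step(s, a)
        best_next = max(q(s_next, x) for x in ACTIONS)
        Q[s][a] += ALPHA * (r + GAMMA * best_next - q(s, a))
        s = s_next
    return Q
```

After training, the greedy policy is `argmax_a Q[s][a]` per state; with a full battery and a good channel it learns to spend energy rather than idle. The paper's actual scheme additionally enforces the heterogeneous statistical delay-bounded QoS constraints through an effective-capacity objective, which this sketch omits.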

Original language: English (US)
Title of host publication: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728109626
DOI: 10.1109/GLOBECOM38437.2019.9014116
State: Published - Dec 2019
Event: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Waikoloa, United States
Duration: Dec 9, 2019 - Dec 13, 2019

Publication series

Name: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings

Conference

Conference: 2019 IEEE Global Communications Conference, GLOBECOM 2019
Country: United States
City: Waikoloa
Period: 12/9/19 - 12/13/19

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Hardware and Architecture
  • Information Systems
  • Signal Processing
  • Information Systems and Management
  • Safety, Risk, Reliability and Quality
  • Media Technology
  • Health Informatics

Keywords

  • 5G wireless ad-hoc networks
  • Effective capacity
  • Energy harvesting (EH)
  • QoS
  • Reinforcement Learning


Cite this

Zhang, X., Wang, J., & Poor, H. V. (2019). Reinforcement learning based QoS-provisioning over energy-harvesting 5G wireless Ad-Hoc networks. In 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings [9014116] (2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/GLOBECOM38437.2019.9014116