Enhancing Information Freshness and Energy Efficiency in D2D Networks Through DRL-Based Scheduling and Resource Management


Bibliographic Details
Main Authors: Parisa Parhizgar, Mehdi Mahdavi, Mohammad Reza Ahmadzadeh, Melike Erol-Kantarci
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Open Journal of Vehicular Technology
Online Access: https://ieeexplore.ieee.org/document/10758763/
Description
Summary: This paper investigates resource management in device-to-device (D2D) networks coexisting with cellular user equipment (CUEs). We introduce a novel model for joint scheduling and resource management in D2D networks, taking into account environmental constraints. To preserve information freshness, measured by minimizing the average age of information (AoI), and to effectively utilize energy harvesting (EH) technology to satisfy the network's energy needs, we formulate an online optimization problem. This formulation considers factors such as the quality of service (QoS) for both CUEs and D2Ds, available power, information freshness, and environmental sensing requirements. Due to the mixed-integer nonlinear nature and online characteristics of the problem, we propose a deep reinforcement learning (DRL) approach to solve it effectively. Numerical results show that the proposed joint scheduling and resource management strategy, utilizing the soft actor-critic (SAC) algorithm, reduces the average AoI by 20% compared to other baseline methods.
ISSN: 2644-1330
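The summary above centers on minimizing the average age of information (AoI) under scheduling decisions. As a minimal sketch of how average AoI is commonly tracked in a slotted model (the round-robin scheduler, link count, horizon, and success probability here are illustrative assumptions, not the paper's SAC-based policy):

```python
# Minimal AoI bookkeeping sketch: each link's age grows by one per slot
# and resets to 1 when a scheduled transmission succeeds. All parameter
# values are illustrative, not taken from the paper.
import random

def average_aoi(num_links=4, horizon=1000, p_success=0.8, seed=0):
    rng = random.Random(seed)
    age = [1] * num_links          # current AoI of each D2D link
    total = 0                      # running sum of ages over all slots
    for t in range(horizon):
        scheduled = t % num_links  # toy round-robin scheduler
        for i in range(num_links):
            if i == scheduled and rng.random() < p_success:
                age[i] = 1         # fresh update delivered
            else:
                age[i] += 1        # information keeps aging
        total += sum(age)
    return total / (horizon * num_links)

print(average_aoi())
```

A learned scheduler (such as the SAC policy described in the summary) would replace the round-robin choice with an action drawn from the agent, with the resulting average AoI entering the reward.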