Joint Optimization of Resource Allocation and Task Offloading Strategies in Multi-Cell Dynamic MEC Systems Using Multi-Agent DRL

Bibliographic Details
Main Authors: Yuntao Hu, Ming Chen, Haowen Sun, Yinlu Wang, Yihan Cang
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11059882/
Description
Summary: This paper focuses on minimizing the total energy consumption of a long-term, delay-sensitive multi-cell mobile edge computing (MEC) system that serves continuously arriving mobile devices (MDs). Energy consumption is minimized by jointly optimizing the task offloading proportions, transmit power allocations, and computational resource distributions, while satisfying the overall deadline constraints and the minimum processing-size requirements in each scheduling cycle. The optimization problem is formulated as a multi-agent Markov decision process (MAMDP) to enable sequential optimization across multiple scheduling cycles. To solve the formulated problem efficiently, the authors develop a multi-agent deep reinforcement learning (MADRL) algorithm that integrates the actor-critic (AC) framework, embedding techniques, and the centralized training and decentralized execution (CTDE) paradigm. Simulation results show that the proposed algorithm converges 14%-23% faster than benchmark methods and reduces total energy consumption by up to 10% under the given constraints.
ISSN: 2169-3536