Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework
In this paper, we present Machine Learning (ML) solutions to address the reliability challenges likely to be encountered in advanced wireless systems (5G, 6G, and indeed beyond). Specifically, we introduce a novel loss function to minimize the outage probability of an ML-based resource allocation system.
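The allocation policy the abstract describes (scan the resources in order, allocate the first one the binary predictor deems satisfactory, and fall into outage if every resource is rejected) can be sketched as follows. Everything concrete here is an illustrative assumption, not the paper's actual model: the logistic score stands in for the trained predictor, the exponential channel gains stand in for a Rayleigh-faded channel, and the log-gain feature and 0.5 threshold are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_ok(feature, threshold=0.5):
    """Hypothetical stand-in for the trained binary classifier: returns True
    when the resource's future outage status is predicted satisfactory.
    Here a toy logistic score on a single scalar feature."""
    score = 1.0 / (1.0 + np.exp(-feature))
    return score >= threshold

def greedy_allocate(resource_features):
    """Single-user greedy allocation: take the first resource the predictor
    accepts. Returns its index, or None when every resource is rejected --
    the degenerate predictor behavior the paper's exact analysis warns about."""
    for k, feat in enumerate(resource_features):
        if predict_ok(feat):
            return k
    return None

def empirical_outage(n_trials=10_000, n_resources=8, snr_threshold=1.0):
    """Monte Carlo estimate of the system outage probability under the
    assumed (illustrative) exponential-gain channel model: an outage occurs
    when no resource is allocated or the allocated gain is below threshold."""
    outages = 0
    for _ in range(n_trials):
        gains = rng.exponential(scale=1.0, size=n_resources)  # |h|^2 per resource
        feats = np.log(gains)  # hypothetical feature fed to the predictor
        k = greedy_allocate(feats)
        if k is None or gains[k] < snr_threshold:
            outages += 1
    return outages / n_trials
```

The `None` branch makes the abstract's point concrete: a predictor tuned only to minimize the outage probability *conditioned on allocation* can drive its acceptance rate to zero, rejecting every resource and guaranteeing a system outage.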
| Main Authors: | Nidhi Simmons, David E. Simmons, Michel Daoud Yacoub |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Transactions on Machine Learning in Communications and Networking |
| Subjects: | Blockage prediction; custom loss function; greedy resource allocation; machine learning; novel loss function; optimization |
| Online Access: | https://ieeexplore.ieee.org/document/10443669/ |
| _version_ | 1850051570149883904 |
|---|---|
| author | Nidhi Simmons; David E. Simmons; Michel Daoud Yacoub |
| author_facet | Nidhi Simmons; David E. Simmons; Michel Daoud Yacoub |
| author_sort | Nidhi Simmons |
| collection | DOAJ |
| description | In this paper, we present Machine Learning (ML) solutions to address the reliability challenges likely to be encountered in advanced wireless systems (5G, 6G, and indeed beyond). Specifically, we introduce a novel loss function to minimize the outage probability of an ML-based resource allocation system. A single-user multi-resource greedy allocation strategy constitutes our application scenario, for which an ML binary classification predictor assists in selecting a resource satisfying the established outage criterion. While other resource allocation policies may be suitable, they are not the focus of our study. Instead, our primary emphasis is on theoretically developing this loss function and leveraging it to train an ML model to address the outage probability challenge. With no access to future channel state information, this predictor foresees each resource’s likely future outage status. When the predictor encounters a resource it believes will be satisfactory, it allocates it to the user. The predictor aims to ensure that a user avoids resources likely to undergo an outage. Our main result establishes exact and asymptotic expressions for this system’s outage probability. These expressions reveal that focusing solely on the optimization of the per-resource outage probability conditioned on the ML predictor recommending resource allocation (a strategy that, at face value, looks the most appropriate) may produce inadequate predictors that reject every resource. They also reveal that focusing on standard metrics, like precision, false-positive rate, or recall, may not produce optimal predictors. With our result, we formulate a theoretically optimal, differentiable loss function to train our predictor. We then compare predictors trained using this and traditional loss functions, namely binary cross-entropy (BCE), mean squared error (MSE), and mean absolute error (MAE). In all scenarios, predictors trained using our novel loss function provide superior outage probability performance. Moreover, in some cases, predictors trained with our loss function outperform those trained with BCE, MAE, and MSE by multiple orders of magnitude. Additionally, when applied to another ML-based resource allocation scheme (a modified greedy algorithm), our proposed loss function maintains its efficacy. |
| format | Article |
| id | doaj-art-d0fd3ccbe8d64aa194ec74570fa11cec |
| institution | DOAJ |
| issn | 2831-316X |
| language | English |
| publishDate | 2024-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Transactions on Machine Learning in Communications and Networking |
| spelling | doaj-art-d0fd3ccbe8d64aa194ec74570fa11cec; 2025-08-20T02:53:06Z; eng; IEEE; IEEE Transactions on Machine Learning in Communications and Networking; ISSN 2831-316X; 2024-01-01; vol. 2, pp. 335-350; doi:10.1109/TMLCN.2024.3369007; document 10443669; Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework; Nidhi Simmons (https://orcid.org/0000-0002-8076-9607), Centre for Wireless Innovation, Institute of Electronics, Communications and Information Technology, Queen’s University Belfast, Belfast, U.K.; David E. Simmons (https://orcid.org/0009-0009-4949-6979), Dhali Holdings Ltd., Belfast, U.K.; Michel Daoud Yacoub (https://orcid.org/0000-0002-5866-5879), Wireless Technology Laboratory, School of Electrical and Computer Engineering, University of Campinas, Campinas, Brazil; https://ieeexplore.ieee.org/document/10443669/; Blockage prediction; custom loss function; greedy resource allocation; machine learning; novel loss function; optimization |
| spellingShingle | Nidhi Simmons; David E. Simmons; Michel Daoud Yacoub; Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework; IEEE Transactions on Machine Learning in Communications and Networking; Blockage prediction; custom loss function; greedy resource allocation; machine learning; novel loss function; optimization |
| title | Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework |
| title_full | Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework |
| title_fullStr | Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework |
| title_full_unstemmed | Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework |
| title_short | Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework |
| title_sort | outage performance and novel loss function for an ml assisted resource allocation an exact analytical framework |
| topic | Blockage prediction; custom loss function; greedy resource allocation; machine learning; novel loss function; optimization |
| url | https://ieeexplore.ieee.org/document/10443669/ |
| work_keys_str_mv | AT nidhisimmons outageperformanceandnovellossfunctionforanmlassistedresourceallocationanexactanalyticalframework AT davidesimmons outageperformanceandnovellossfunctionforanmlassistedresourceallocationanexactanalyticalframework AT micheldaoudyacoub outageperformanceandnovellossfunctionforanmlassistedresourceallocationanexactanalyticalframework |