TY - JOUR
T1 - GDSG: Graph Diffusion-based Solution Generator for Optimization Problems in MEC Networks
AU - Liang, Ruihuai
AU - Yang, Bo
AU - Chen, Pengyu
AU - Cao, Xuelin
AU - Yu, Zhiwen
AU - Debbah, Merouane
AU - Niyato, Dusit
AU - Poor, H. Vincent
AU - Yuen, Chau
N1 - Publisher Copyright:
© 2002-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Optimization is crucial for the efficiency and reliability of multi-access edge computing (MEC) networks. Many optimization problems in this field are NP-hard and do not have effective approximation algorithms. Consequently, there is often a lack of optimal (ground-truth) data, which limits the effectiveness of traditional deep learning approaches. Most existing learning-based methods require a large amount of optimal data and do not leverage the potential advantages of using suboptimal data, which can be obtained more efficiently. To illustrate this point, we focus on the multi-server multi-user computation offloading (MSCO) problem, a common issue in MEC networks that lacks efficient optimal solution methods. In this paper, we introduce the graph diffusion-based solution generator (GDSG), designed to work with suboptimal datasets while still achieving convergence to the optimal solution with high probability. We reformulate the network optimization challenge as a distribution-learning problem and provide a clear explanation of how to learn from suboptimal training datasets. We develop GDSG, a multi-task diffusion generative model that employs a graph neural network (GNN) to capture the distribution of high-quality solutions. Our approach includes a straightforward and efficient heuristic method to generate a sufficient amount of training data composed entirely of suboptimal solutions. In our implementation, we enhance the GNN architecture to achieve better generalization. Moreover, the proposed GDSG can achieve nearly 100% task orthogonality, which helps prevent negative interference between the discrete and continuous solution generation training objectives. We demonstrate that this orthogonality arises from the diffusion-related training loss in GDSG, rather than from the GNN architecture itself. Finally, our experiments show that the proposed GDSG outperforms other benchmark methods on both optimal and suboptimal training datasets. Regarding the minimization of computation offloading costs, GDSG achieves savings of up to 56.62% on the ground-truth training set and 41.06% on the suboptimal training set compared to existing discriminative methods.
KW - computation offloading
KW - generative AI
KW - graph diffusion
KW - multi-access edge computing
KW - network optimization
UR - http://www.scopus.com/inward/record.url?scp=105005424986&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=105005424986&partnerID=8YFLogxK
U2 - 10.1109/TMC.2025.3568248
DO - 10.1109/TMC.2025.3568248
M3 - Article
AN - SCOPUS:105005424986
SN - 1536-1233
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
ER -