DOI: 10.3390/electronics12173548

A Survey of Energy Optimization Approaches for Computational Task Offloading and Resource Allocation in MEC Networks

Jinming Yang, Awais Aziz Shah, Dimitrios Pezaros
  • Electrical and Electronic Engineering
  • Computer Networks and Communications
  • Hardware and Architecture
  • Signal Processing
  • Control and Systems Engineering

With the increased penetration of cloud computing and virtualization, a plethora of Internet of Things (IoT) devices have been deployed globally. As a result, computationally intensive tasks are transmitted from the edge towards the centralized cloud for processing, which increases energy utilization in cloud data centers while at the same time introducing significant latency for critical applications. Recent years have witnessed a paradigm shift from centralized cloud computing towards mobile edge computing (MEC), where computational tasks are offloaded to edge servers near the user equipment (UE). This paradigm lowers energy utilization in cloud data centers, reduces latency for UE, and enables efficient resource utilization at the edge. In this context, the scale and complexity of MEC networks are increasing drastically and, consequently, finding effective energy-efficient solutions for computational task offloading and resource allocation in MEC networks has become a challenging task. To address the aforementioned challenges, this work surveys the state of the art across different categories of algorithm-based computational task offloading and resource allocation strategies, focusing on energy utilization. It also provides a detailed cross-comparison of existing strategies in terms of their implementation specifications. Additionally, this paper highlights open challenges and potential future research directions to facilitate efficient task offloading and resource allocation at the edge with reduced energy consumption at the centralized data centers. Our work also paves the way for the deployment of critical applications at the edge that require low latency and strong service quality guarantees.