Machine Learning Driven Latency Optimization for Internet of Things Applications in Edge Computing

Release Date: 2023-06-25 Authors: Uchechukwu AWADA, ZHANG Jiankang, CHEN Sheng, LI Shuangzhi, YANG Shouyi

Abstract: Emerging Internet of Things (IoT) applications require fast execution and response times to achieve optimal performance. However, most IoT devices have limited or no computing capability to meet such stringent application requirements. To this end, computation offloading in edge computing has been used in IoT systems to achieve the desired performance. Nevertheless, randomly offloading applications to any available edge node without considering their resource demands, inter-application dependencies, and edge resource availability may result in execution delays and performance degradation. In this paper, we introduce Edge-IoT, a machine learning-enabled orchestration framework that utilizes the states of edge resources and application resource requirements to facilitate a resource-aware offloading scheme that minimizes average latency. We further propose a variant of the bin-packing optimization model that tightly co-locates applications on edge resources to fully utilize available capacity. Extensive experiments show the effectiveness and resource efficiency of the proposed approach.
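As a rough illustration of the co-location idea described in the abstract (not the authors' actual variant model, which is defined in the full paper), the sketch below packs applications onto edge nodes with a first-fit-decreasing heuristic, a standard approximation for bin packing. The names `App`, `EdgeNode`, and `place_apps`, and the CPU/memory resource dimensions, are hypothetical choices for this example; the paper's ML-based state prediction and dependency handling are not reproduced here.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class App:
    name: str
    cpu: float  # requested CPU cores
    mem: float  # requested memory (GB)


@dataclass
class EdgeNode:
    name: str
    cpu_cap: float
    mem_cap: float
    apps: List[App] = field(default_factory=list)

    def fits(self, app: App) -> bool:
        # Check remaining capacity against the app's resource demand.
        used_cpu = sum(a.cpu for a in self.apps)
        used_mem = sum(a.mem for a in self.apps)
        return (used_cpu + app.cpu <= self.cpu_cap
                and used_mem + app.mem <= self.mem_cap)


def place_apps(apps: List[App], nodes: List[EdgeNode]) -> Optional[Dict[str, str]]:
    """First-fit-decreasing bin packing: sort apps by their dominant
    resource demand, then co-locate each on the first node that still
    has spare capacity, keeping placements tight."""
    order = sorted(apps, key=lambda a: max(a.cpu, a.mem), reverse=True)
    placement: Dict[str, str] = {}
    for app in order:
        node = next((n for n in nodes if n.fits(app)), None)
        if node is None:
            return None  # no feasible node: request must wait or be rejected
        node.apps.append(app)
        placement[app.name] = node.name
    return placement


if __name__ == "__main__":
    nodes = [EdgeNode("edge-1", cpu_cap=4, mem_cap=8),
             EdgeNode("edge-2", cpu_cap=2, mem_cap=4)]
    apps = [App("cam-analytics", 2.0, 3.0),
            App("sensor-agg", 1.0, 1.0),
            App("alerting", 0.5, 0.5)]
    print(place_apps(apps, nodes))
```

In this toy run all three applications fit on `edge-1`, illustrating how tight co-location can leave a second node free; the paper's framework additionally weighs predicted latency and inter-application dependencies when choosing placements.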


Keywords: edge computing; execution time; IoT; machine learning; resource efficiency
