Communication-Efficient Edge AI Inference over Wireless Networks

Release Date: 2020-07-22    Authors: YANG Kai, ZHOU Yong, YANG Zhanpeng, SHI Yuanming

YANG Kai, ZHOU Yong, YANG Zhanpeng, SHI Yuanming
(School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China)

Abstract: Given the rapid growth of intelligent devices, a large number of high-stakes artificial intelligence (AI) applications, e.g., drones, autonomous cars, and tactile robots, are expected to be deployed at the edge of wireless networks in the near future. Intelligent communication networks will therefore be designed to leverage advanced wireless techniques and edge computing technologies to support AI-enabled applications on end devices with limited communication, computation, hardware, and energy resources. In this article, we present the principles of efficiently deploying model inference at the network edge to provide low-latency and energy-efficient AI services. These include a wireless distributed computing framework for low-latency device-distributed model inference, as well as a wireless cooperative transmission strategy for energy-efficient edge cooperative model inference. The communication efficiency of edge inference systems is further improved by building a smart radio propagation environment via intelligent reflecting surfaces.
Keywords: Communication efficiency; cooperative transmission; distributed computing; edge AI; edge inference
