Journal Article

Communication-Efficient Edge AI Inference over Wireless Networks (Cited by: 2)

Abstract: Given the rapid growth of intelligent devices, a large number of high-stakes artificial intelligence (AI) applications, e.g., drones, autonomous cars, and tactile robots, are expected to be deployed at the edge of wireless networks in the near future. Intelligent communication networks will therefore be designed to leverage advanced wireless techniques and edge computing technologies to support AI-enabled applications at various end devices with limited communication, computation, hardware, and energy resources. In this article, we present the principles of efficiently deploying model inference at the network edge to provide low-latency and energy-efficient AI services. This includes a wireless distributed computing framework for low-latency distributed model inference across devices, as well as a wireless cooperative transmission strategy for energy-efficient edge cooperative model inference. The communication efficiency of edge inference systems is further improved by building a smart radio propagation environment via intelligent reflecting surfaces.
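The abstract's "wireless distributed computing framework" refers to splitting one model's computation over several resource-limited edge devices. The following is a minimal sketch, not taken from the paper, of that general idea: a single fully connected layer's weight matrix is partitioned column-wise across K devices, each device computes a partial product on its shard, and an aggregation point sums the partial results. All names, dimensions, and the single-layer setup are illustrative assumptions.

```python
# Illustrative sketch (assumed, not the paper's method): device-distributed
# inference for one fully connected layer, with the input dimension split
# across K cooperating edge devices.
import numpy as np

rng = np.random.default_rng(0)

K = 4                    # number of cooperating edge devices (assumed)
d_in, d_out = 512, 128   # layer dimensions (assumed)

x = rng.standard_normal(d_in)           # input feature vector
W = rng.standard_normal((d_out, d_in))  # full layer weight matrix

# Split the input dimension into K shards, one shard of columns per device.
shards = np.array_split(np.arange(d_in), K)

def device_partial(k: int) -> np.ndarray:
    """Partial output computed locally on device k from its weight shard."""
    cols = shards[k]
    return W[:, cols] @ x[cols]

# Each device transmits only its d_out-dimensional partial result;
# the aggregator sums them to recover the full layer output.
y_distributed = sum(device_partial(k) for k in range(K))
y_centralized = W @ x

assert np.allclose(y_distributed, y_centralized)
print("max abs error:", np.max(np.abs(y_distributed - y_centralized)))
```

In this toy setting each device uploads only a short partial-result vector rather than raw features or the full model, which is the communication-saving intuition behind distributing inference across devices; the paper's actual framework additionally accounts for the wireless transmission of these partial results.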
Source: ZTE Communications (中兴通讯技术(英文版)), 2020, Issue 2, pp. 31-39 (9 pages).

Co-cited references: 15

Citing articles: 2

Second-level citing articles: 6
