Journal Articles
2 articles found
1. A Collaborative Machine Learning Scheme for Traffic Allocation and Load Balancing for URLLC Service in 5G and Beyond
Authors: Andreas G. Papidas, George C. Polyzos. Journal of Computer and Communications, 2023, Issue 11, pp. 197-207 (11 pages)
Key challenges for 5G and Beyond networks relate to the requirements for exceptionally low latency, high reliability, and extremely high data rates. The Ultra-Reliable Low Latency Communication (URLLC) use case is the trickiest to support, and current research focuses on physical or MAC layer solutions, while proposals targeting the network layer, using Machine Learning (ML) and Artificial Intelligence (AI) algorithms running on base stations and User Equipment (UE) or Internet of Things (IoT) devices, are at an early stage. In this paper, we describe the operating rationale of the most recent relevant ML algorithms and techniques, and we propose and validate ML algorithms running on both cells (base stations/gNBs) and UEs or IoT devices to handle URLLC service control. One ML algorithm runs on base stations to evaluate latency demands and offload traffic when needed, while another lightweight algorithm runs on UEs and IoT devices to rank cells by URLLC service quality in real time and indicate the best cell for a UE or IoT device to camp on. We show that the interplay of these algorithms leads to good service control and, eventually, optimal load allocation under slow load mobility.
Keywords: 5G and B5G Networks; Ultra-Reliable Low Latency Communications (URLLC); Machine Learning (ML) for 5G; Temporal Difference Methods (TDM); Monte Carlo Methods; Policy Gradient Methods
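The abstract above describes a lightweight UE-side algorithm that ranks candidate cells by URLLC service quality using temporal-difference ideas. The following is a minimal, hypothetical Python sketch of that general idea, not the authors' implementation: the cell names, reward shaping, latency budget, learning rate, and discount factor are all assumptions, and each cell is treated as a single-state TD(0) estimator for simplicity.

```python
# Hypothetical sketch (not the paper's algorithm): a UE-side TD(0)-style
# ranker that keeps a running value estimate of the URLLC "quality" of each
# candidate cell from observed latency samples and picks the best cell to
# camp on. All constants below are illustrative assumptions.

import random

ALPHA = 0.1              # learning rate for the TD update (assumed)
GAMMA = 0.9              # discount factor (assumed)
LATENCY_BUDGET_MS = 1.0  # assumed URLLC latency target

def reward(latency_ms: float) -> float:
    """Higher reward the further the observed latency is below the budget."""
    return LATENCY_BUDGET_MS - latency_ms

class CellRanker:
    def __init__(self, cell_ids):
        # Value estimate of long-run URLLC quality per candidate cell.
        self.values = {cid: 0.0 for cid in cell_ids}

    def update(self, cell_id: str, latency_ms: float) -> None:
        # Single-state TD(0) update: V <- V + alpha * (r + gamma*V - V).
        v = self.values[cell_id]
        td_target = reward(latency_ms) + GAMMA * v
        self.values[cell_id] = v + ALPHA * (td_target - v)

    def best_cell(self) -> str:
        # Rank cells by estimated value and return the best one to camp on.
        return max(self.values, key=self.values.get)

if __name__ == "__main__":
    ranker = CellRanker(["gNB-A", "gNB-B", "gNB-C"])
    for _ in range(200):
        for cell, mean_latency in [("gNB-A", 0.8), ("gNB-B", 1.4), ("gNB-C", 0.6)]:
            ranker.update(cell, random.gauss(mean_latency, 0.1))
    print("Best URLLC cell:", ranker.best_cell())  # expected: gNB-C
```

The incremental update is what makes such a scheme cheap enough to run on UEs and IoT devices: each latency observation adjusts one scalar per cell, with no stored history.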
2. Modeling and Simulation of Time Series Prediction Based on Dynamic Neural Network
Authors: Wang Xuesong, Cheng Yuhu, Peng Guangzheng. Journal of Beijing Institute of Technology (EI, CAS), 2004, Issue 2, pp. 148-151 (4 pages)
Modeling and simulation of time series prediction based on a dynamic neural network (NN) are studied. A prediction model for non-linear, time-varying systems is proposed based on a dynamic Jordan NN. To address the intrinsic defect of the back-propagation (BP) algorithm, namely that it cannot update network weights incrementally, a hybrid algorithm combining the temporal difference (TD) method with the BP algorithm is put forward to train the Jordan NN. The proposed method is applied to real-time, multi-step prediction of the ash content of clean coal in jigging production. A practical example is also given, and its application results indicate that the method performs better than alternatives and offers a useful reference for the prediction of nonlinear time series.
Keywords: time series; Jordan neural network (NN); back-propagation (BP) algorithm; temporal difference (TD) method
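The abstract above pairs a Jordan recurrent network with incremental (online) weight updates. Below is a minimal, hypothetical Python/NumPy sketch of a Jordan network trained with per-sample gradient updates on a toy sine series; it is not the paper's TD+BP hybrid, and the layer sizes, learning rate, and data are assumptions made purely for illustration.

```python
# Hypothetical sketch (not the paper's model): a Jordan recurrent network for
# one-step time-series prediction. The context unit is fed back from the
# previous output, and weights are adjusted incrementally, sample by sample.

import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 1, 8, 1   # assumed layer sizes
LR = 0.05                      # assumed learning rate

# Weights: (input + context) -> hidden, hidden -> output.
W_h = rng.normal(scale=0.5, size=(N_HID, N_IN + N_OUT))
b_h = np.zeros(N_HID)
W_o = rng.normal(scale=0.5, size=(N_OUT, N_HID))
b_o = np.zeros(N_OUT)

def forward(x, context):
    """One step of the Jordan net: the context unit is the previous output."""
    z = np.concatenate([x, context])
    h = np.tanh(W_h @ z + b_h)
    y = W_o @ h + b_o
    return y, h, z

# Toy data: noisy sine wave; predict the next value from the current one.
t = np.arange(400)
series = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)

for epoch in range(20):
    context = np.zeros(N_OUT)
    for k in range(len(series) - 1):
        x, target = series[k:k + 1], series[k + 1:k + 2]
        y, h, z = forward(x, context)
        err = y - target                        # one-step prediction error
        # Incremental (online) backprop through the two layers.
        grad_Wo = np.outer(err, h)
        grad_h = (W_o.T @ err) * (1.0 - h ** 2)
        grad_Wh = np.outer(grad_h, z)
        W_o -= LR * grad_Wo
        b_o -= LR * err
        W_h -= LR * grad_Wh
        b_h -= LR * grad_h
        context = y                             # Jordan feedback: output -> context

print("final one-step squared error:", float(err[0] ** 2))
```

The per-sample update loop is the point of contrast with batch BP training noted in the abstract: each new observation immediately adjusts the weights, which is what real-time, multi-step prediction in a production setting requires.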