Journal articles
2 articles found
1. Data-Driven Discovery of Stochastic Differential Equations (Cited by: 1)
Authors: Yasen Wang, Huazhen Fang, Junyang Jin, Guijun Ma, Xin He, Xing Dai, Zuogong Yue, Cheng Cheng, Hai-Tao Zhang, Donglin Pu, Dongrui Wu, Ye Yuan, Jorge Gonçalves, Jürgen Kurths, Han Ding. Engineering (SCIE, EI, CAS), 2022, Issue 10, pp. 244-252 (9 pages).
Abstract: Stochastic differential equations (SDEs) are mathematical models that are widely used to describe complex processes or phenomena perturbed by random noise from different sources. The identification of SDEs governing a system is often a challenge because of the inherent strong stochasticity of data and the complexity of the system's dynamics. The practical utility of existing parametric approaches for identifying SDEs is usually limited by insufficient data resources. This study presents a novel framework for identifying SDEs by leveraging the sparse Bayesian learning (SBL) technique to search for a parsimonious, yet physically necessary representation from the space of candidate basis functions. More importantly, we use the analytical tractability of SBL to develop an efficient way to formulate the linear regression problem for the discovery of SDEs that requires considerably less time-series data. The effectiveness of the proposed framework is demonstrated using real data on stock and oil prices, bearing variation, and wind speed, as well as simulated data on well-known stochastic dynamical systems, including the generalized Wiener process and Langevin equation. This framework aims to assist specialists in extracting stochastic mathematical models from random phenomena in the natural sciences, economics, and engineering fields for analysis, prediction, and decision making.
Keywords: Data-driven method; System identification; Sparse Bayesian learning; Stochastic differential equations; Random phenomena
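To make the regression-based discovery procedure described in the abstract above more concrete, the following is a minimal sketch, not the authors' code: it simulates a Langevin-type SDE, estimates the drift from increments, and recovers a sparse model over a library of candidate basis functions. Sequential thresholded least squares is used here as a simple stand-in for the paper's sparse Bayesian learning step, and the system, library, and threshold are illustrative assumptions.

```python
# Sketch: recovering the drift of dx = f(x) dt + sigma dW from time-series data
# via sparse regression over candidate basis functions (SBL replaced by a
# simpler thresholded least-squares step for brevity).
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate dx = (a*x + b*x^3) dt + sigma dW with Euler-Maruyama ---
dt, n_steps, sigma = 1e-3, 200_000, 0.5
a_true, b_true = -1.0, -0.5
x = np.empty(n_steps)
x[0] = 1.0
for k in range(n_steps - 1):
    drift = a_true * x[k] + b_true * x[k] ** 3
    x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# --- Regression problem: increments/dt approximate the drift f(x_k) ---
y = (x[1:] - x[:-1]) / dt
library = np.column_stack([x[:-1] ** p for p in range(4)])  # 1, x, x^2, x^3
names = ["1", "x", "x^2", "x^3"]

# --- Sequential thresholded least squares (sparsity-promoting) ---
coef, *_ = np.linalg.lstsq(library, y, rcond=None)
for _ in range(10):
    small = np.abs(coef) < 0.2          # hypothetical sparsity threshold
    coef[small] = 0.0
    active = ~small
    if active.any():
        coef[active], *_ = np.linalg.lstsq(library[:, active], y, rcond=None)

print("Estimated drift terms:")
for name, c in zip(names, coef):
    if c != 0.0:
        print(f"  {c:+.3f} * {name}")
```

With enough samples the estimated coefficients concentrate near the true values (-1.0 for x and -0.5 for x^3), which is the kind of parsimonious representation the framework searches for; the paper's SBL formulation additionally handles the data-efficiency and noise aspects analytically.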
2. DeceFL: a principled fully decentralized federated learning framework (Cited by: 2)
Authors: Ye Yuan, Jun Liu, Dou Jin, Zuogong Yue, Tao Yang, Ruijuan Chen, Maolin Wang, Chuan Sun, Lei Xu, Feng Hua, Yuqi Guo, Xiuchuan Tang, Xin He, Xinlei Yi, Dong Li, Guanghui Wen, Wenwu Yu, Hai-Tao Zhang, Tianyou Chai, Shaochun Sui, Han Ding. National Science Open, 2023, Issue 1, pp. 35-51 (17 pages).
Abstract: Traditional machine learning relies on a centralized data pipeline for model training in various applications; however, data are inherently fragmented. Such a decentralized nature of databases presents a serious challenge for collaboration: sending all decentralized datasets to a central server raises serious privacy concerns. Although there has been a joint effort to tackle this critical issue by proposing privacy-preserving machine learning frameworks, such as federated learning, most state-of-the-art frameworks are still built in a centralized way, in which a central client is needed to collect and distribute model information (instead of the data itself) from every other client, leading to a high communication burden and high vulnerability when there is a failure at, or an attack on, the central client. Here we propose a principled decentralized federated learning algorithm (DeceFL), which does not require a central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. It is further proven that every client reaches the global minimum with zero performance gap and achieves the same convergence rate O(1/T) (where T is the number of iterations in gradient descent) as centralized federated learning when the loss function is smooth and strongly convex. Finally, the proposed algorithm has been applied to a number of applications to illustrate its effectiveness for both convex and nonconvex loss functions, time-invariant and time-varying topologies, and IID and non-IID datasets, demonstrating its applicability to a wide range of real-world medical and industrial applications.
Keywords: decentralized federated learning; smart manufacturing; control systems; privacy
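The abstract above describes learning without a central client, where each participant exchanges model parameters only with its graph neighbors. The following is a minimal sketch, not the authors' DeceFL implementation: decentralized gradient descent on a smooth, strongly convex least-squares problem over a ring topology. The topology, mixing weights, step size, and data generation are illustrative assumptions.

```python
# Sketch: decentralized gradient descent with neighbor-only communication.
# Each client holds private data; parameters are mixed over a ring graph,
# so no central server ever sees data or aggregates the global model.
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim, n_local = 6, 5, 200

# Private local datasets for a shared least-squares ground truth (IID here).
w_star = rng.standard_normal(dim)
data = []
for _ in range(n_clients):
    A = rng.standard_normal((n_local, dim))
    b = A @ w_star + 0.1 * rng.standard_normal(n_local)
    data.append((A, b))

def local_grad(w, A, b):
    """Gradient of the local loss (1/2m)||A w - b||^2."""
    return A.T @ (A @ w - b) / len(b)

# Doubly stochastic mixing matrix for a ring: each client communicates
# only with its two neighbors.
W = np.zeros((n_clients, n_clients))
for i in range(n_clients):
    W[i, i] = 0.5
    W[i, (i - 1) % n_clients] = 0.25
    W[i, (i + 1) % n_clients] = 0.25

w = np.zeros((n_clients, dim))      # one parameter vector per client
step = 0.1
for t in range(500):
    w = W @ w                        # 1) mix with neighbors
    for i, (A, b) in enumerate(data):
        w[i] -= step * local_grad(w[i], A, b)   # 2) local gradient step

# All clients agree (consensus) and approach the global minimizer.
print("max disagreement between clients:", np.abs(w - w.mean(axis=0)).max())
print("distance to ground truth:", np.linalg.norm(w.mean(axis=0) - w_star))
```

In this strongly convex setting the clients' iterates reach consensus and converge toward the global least-squares solution, which is the behavior the paper's O(1/T) guarantee formalizes for the DeceFL algorithm.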