Funding: Supported by the National Key Research and Development Program of China (2018YFB1701202), the National Natural Science Foundation of China (92167201 and 51975237), and the Fundamental Research Funds for the Central Universities, Huazhong University of Science and Technology (2021JYCXJJ028).
Abstract: Stochastic differential equations (SDEs) are mathematical models that are widely used to describe complex processes or phenomena perturbed by random noise from different sources. Identifying the SDEs governing a system is often challenging because of the inherent strong stochasticity of the data and the complexity of the system's dynamics. The practical utility of existing parametric approaches for identifying SDEs is usually limited by insufficient data resources. This study presents a novel framework for identifying SDEs that leverages the sparse Bayesian learning (SBL) technique to search for a parsimonious, yet physically necessary, representation from the space of candidate basis functions. More importantly, we use the analytical tractability of SBL to develop an efficient formulation of the linear regression problem for the discovery of SDEs that requires considerably less time-series data. The effectiveness of the proposed framework is demonstrated using real data on stock and oil prices, bearing variation, and wind speed, as well as simulated data from well-known stochastic dynamical systems, including the generalized Wiener process and the Langevin equation. This framework aims to assist specialists in extracting stochastic mathematical models from random phenomena in the natural sciences, economics, and engineering for analysis, prediction, and decision making.
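To make the basis-library idea concrete, the following is a minimal, hypothetical sketch of sparse-regression SDE discovery on a simulated Ornstein-Uhlenbeck (Langevin-type) process. It uses scikit-learn's ARDRegression, an automatic-relevance-determination form of sparse Bayesian learning, as a stand-in for the paper's SBL formulation; the process parameters, polynomial library, and Kramers-Moyal drift estimate are illustrative assumptions rather than the authors' exact pipeline.

```python
# Sketch (not the paper's exact SBL algorithm): identify the drift of a
# simulated Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW by
# sparse Bayesian (ARD) regression over a polynomial basis library.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 100_000

# Euler-Maruyama simulation of the "true" SDE
x = np.empty(n)
x[0] = 1.0
for k in range(n - 1):
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Kramers-Moyal-style drift target: E[dX | X = x] / dt ~ f(x)
y = np.diff(x) / dt
X = x[:-1]

# Candidate basis library {1, x, x^2, x^3}; ARD should prune everything
# except the linear term, recovering f(x) ~ -theta * x.
library = np.column_stack([np.ones_like(X), X, X**2, X**3])
model = ARDRegression(fit_intercept=False).fit(library, y)
print("coefficients for [1, x, x^2, x^3]:", np.round(model.coef_, 3))
```

The diffusion term can be estimated analogously from the second conditional moment E[(dX)^2 | X]/dt, with the same sparse regression applied to a library of candidate diffusion functions.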
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 92167201, 52188102, 62133003, 61991403, 61991404, and 61991400) and the Jiangsu Industrial Technology Research Institute (JITRI).
Abstract: Traditional machine learning relies on a centralized data pipeline for model training in various applications; however, data are inherently fragmented. The decentralized nature of databases presents a serious challenge for collaboration: sending all decentralized datasets to a central server raises serious privacy concerns. Although there has been a joint effort to tackle this critical issue by proposing privacy-preserving machine learning frameworks, such as federated learning, most state-of-the-art frameworks are still built in a centralized way, in which a central client is needed for collecting and distributing model information (instead of the data itself) from every other client, leading to a high communication burden and high vulnerability to a failure at, or an attack on, the central client. Here we propose a principled decentralized federated learning algorithm (DeceFL) that does not require a central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. It has further been proven that every client reaches the global minimum with zero performance gap and achieves the same convergence rate O(1/T) (where T is the number of iterations in gradient descent) as centralized federated learning when the loss function is smooth and strongly convex. Finally, the proposed algorithm has been applied to a number of applications to illustrate its effectiveness for both convex and nonconvex loss functions, time-invariant and time-varying topologies, and IID and non-IID datasets, demonstrating its applicability to a wide range of real-world medical and industrial applications.
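As an illustration of the server-free communication pattern described above, here is a small, hypothetical sketch of decentralized gradient descent on a ring of four clients: each client averages parameters with its neighbors through a doubly stochastic mixing matrix W and then takes a gradient step on its private data. This is a generic stand-in, not DeceFL's exact update; the topology, Metropolis weights, and least-squares objective are illustrative assumptions.

```python
# Sketch (not DeceFL's exact algorithm): decentralized gradient descent
# in which each client communicates only with its ring neighbors.
import numpy as np

rng = np.random.default_rng(1)
n_clients, d, m = 4, 5, 50

# Each client i privately holds a least-squares dataset (A_i, b_i).
w_true = rng.standard_normal(d)
A = [rng.standard_normal((m, d)) for _ in range(n_clients)]
b = [Ai @ w_true + 0.01 * rng.standard_normal(m) for Ai in A]

# Ring topology with Metropolis weights: W is doubly stochastic, and
# W[i, j] > 0 only if clients i and j are neighbors (or i == j).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

def local_grad(i, wi):
    """Gradient of client i's private loss (1/2m)*||A_i w - b_i||^2."""
    return A[i].T @ (A[i] @ wi - b[i]) / m

w = np.zeros((n_clients, d))   # one parameter vector per client
lr = 0.1
for t in range(3000):
    w = W @ w                  # mix with neighbors only -- no central server
    w -= lr * np.stack([local_grad(i, w[i]) for i in range(n_clients)])

print("client disagreement:", np.abs(w - w.mean(axis=0)).max())
print("error vs. ground truth:", np.abs(w.mean(axis=0) - w_true).max())
```

Note that plain decentralized gradient descent with a constant step size only converges to a neighborhood of the global minimum when clients' data are heterogeneous; the zero-performance-gap and O(1/T) guarantees quoted above belong to DeceFL's specific update, which this sketch does not reproduce.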