Abstract
With the growing complexity of the composition process and the rapid growth of candidate services, realizing optimal or near-optimal service composition is an urgent problem. Currently, static service composition chains are rigid and cannot be easily adapted to the dynamic Web environment. To address these challenges, the geographic information service composition (GISC) problem is modeled as a sequential decision-making task. In addition, the Markov decision process (MDP), as a universal model for agent planning problems, is used to describe the GISC problem. Then, to achieve self-adaptivity and optimization in a dynamic environment, a novel approach that integrates Monte Carlo tree search (MCTS) with a temporal-difference (TD) learning algorithm is proposed. The concrete services realizing each abstract service are determined at runtime with optimal policies and adaptive capability, based on the environment and the status of component services. A simulation experiment demonstrates the effectiveness and efficiency of the approach in terms of learning quality and performance.
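The abstract describes the method only at a high level. As an illustration of the general idea, and not the authors' implementation, the following is a minimal Python sketch of choosing concrete services for a fixed chain of abstract services with a UCB-guided Monte Carlo tree search whose value backups use a TD(0)-style update instead of plain Monte Carlo averaging. The workflow, candidate pools, QoS rewards, and hyperparameters are all hypothetical placeholders.

```python
import math
import random

# Hypothetical abstract-service chain and candidate concrete services,
# each with a nominal QoS score observed noisily at runtime (assumed).
WORKFLOW = ["buffer", "overlay", "render"]
CANDIDATES = {
    "buffer":  {"buffer_a": 0.9, "buffer_b": 0.6},
    "overlay": {"overlay_a": 0.5, "overlay_b": 0.8},
    "render":  {"render_a": 0.7, "render_b": 0.75},
}

def observe_reward(abstract, concrete):
    """Noisy QoS reward for invoking a concrete service (placeholder model)."""
    return CANDIDATES[abstract][concrete] + random.gauss(0, 0.05)

class Node:
    def __init__(self, depth, parent=None, action=None):
        self.depth, self.parent, self.action = depth, parent, action
        self.children, self.visits, self.value = [], 0, 0.0

def ucb(node, c=1.4):
    """UCB1 score used during tree selection."""
    if node.visits == 0:
        return float("inf")
    return node.value + c * math.sqrt(math.log(node.parent.visits) / node.visits)

def mcts_td(iterations=2000, alpha=0.1, gamma=1.0):
    root = Node(depth=0)
    for _ in range(iterations):
        # Selection/expansion down the abstract-service chain to a leaf.
        node = root
        while node.depth < len(WORKFLOW):
            if not node.children:
                node.children = [Node(node.depth + 1, node, a)
                                 for a in CANDIDATES[WORKFLOW[node.depth]]]
            node = max(node.children, key=ucb)
        # TD(0)-style backup from the leaf to the root: each node's value
        # moves toward its step reward plus the best child's estimate.
        while node.parent is not None:
            node.visits += 1
            r = observe_reward(WORKFLOW[node.depth - 1], node.action)
            boot = max((c.value for c in node.children), default=0.0)
            node.value += alpha * (r + gamma * boot - node.value)
            node = node.parent
        root.visits += 1
    # Greedy policy: pick the highest-value concrete service for each step.
    plan, node = [], root
    while node.children:
        node = max(node.children, key=lambda c: c.value)
        plan.append((WORKFLOW[node.depth - 1], node.action))
    return plan

if __name__ == "__main__":
    print(mcts_td())  # e.g. [('buffer', 'buffer_a'), ('overlay', 'overlay_b'), ...]
```

In this toy setting the TD-style backup lets interior nodes bootstrap from their best child's current estimate, which is one way the paper's combination of MCTS and TD learning can adapt value estimates as QoS observations change at runtime.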
Authors
Zhuang Can
Guo Mingqiang
Xie Zhong
Zhuang Can; Guo Mingqiang; Xie Zhong (School of Geography and Information Engineering, China University of Geosciences, Wuhan 430074, P.R. China; National Engineering Research Center for GIS, Wuhan 430074, P.R. China)
Funding
Supported by the National Natural Science Foundation of China (Nos. 41971356, 41671400, 41701446),
the National Key Research and Development Program of China (Nos. 2017YFB0503600, 2018YFB0505500),
and the Hubei Province Natural Science Foundation of China (No. 2017CFB277).