Funding: Supported by the Open Research Program of the International Research Center of Big Data for Sustainable Development Goals (Grant No. CBAS2022ORP02), the National Natural Science Foundation of China (Grant Nos. 41930647 and 72221002), and the Key Project of Innovation LREIS (Grant No. KPI005).
Abstract: The miniaturization of transistors drove decades of advances in computers, mainly by speeding up computation, but that miniaturization is now approaching its fundamental limits. Many practical problems, however, demand more computational resources than existing computers can provide. Fortunately, the development of quantum computing offers a promising way to address this problem. We briefly review the history of quantum computing and highlight some of its notable achievements. Based on current studies, the quantum computing advantage (QCA) seems indisputable; the challenge is how to realize a practical quantum advantage (PQA), and machine learning can clearly help with this task. The method used for high accuracy surface modelling (HASM) incorporates reinforced machine learning. It can be transformed into a large sparse linear system and combined with the Harrow-Hassidim-Lloyd (HHL) quantum algorithm to support quantum machine learning. HASM has been used successfully with classical computers to conduct spatial interpolation, upscaling, downscaling, data fusion and model-data assimilation of eco-environmental surfaces. Furthermore, a training experiment on a supercomputer indicates that our HASM-HHL quantum computing approach achieves accuracy similar to that of classical HASM and can realize exponential acceleration over the classical algorithms. Given the many known limitations of the HHL algorithm, an obvious next step is a universal platform for hybrid classical-quantum computing, along with further work to improve the approach. In addition, HASM quantum machine learning might be improved by: (1) considerably reducing the number of gates required to run HASM-HHL; (2) evaluating the cost and benchmark problems of quantum machine learning; (3) comparing the performance of the quantum and classical algorithms to clarify their advantages and disadvantages in terms of accuracy and computational speed; and (4) adding the algorithms to a cloud platform to support applications and gather active feedback from users of the algorithms.
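The linear-algebra core of the abstract above is that HASM reduces to a large sparse symmetric system A x = b, which is exactly the kind of system HHL targets. The sketch below is a classical stand-in only: the tridiagonal matrix is an illustrative stencil, not the actual HASM coefficient matrix, and the conjugate gradient iteration stands in for the quantum solver.

```python
# A minimal classical sketch of the linear-algebra core of HASM.
# HASM discretizes surface fitting into a sparse symmetric
# positive-definite system A x = b; HHL promises exponential
# speed-up in the system size for such systems, subject to its
# known caveats (sparsity, conditioning, and reading out only
# functionals of x rather than x itself).

def solve_cg(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient for a symmetric positive-definite system."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x, with x = 0
    p = r[:]
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Small tridiagonal SPD system standing in for a HASM stencil.
n = 5
A = [[2.0 if i == j else (-1.0 if abs(i - j) == 1 else 0.0)
      for j in range(n)] for i in range(n)]
b = [1.0] * n
x = solve_cg(A, b)
residual = max(abs(sum(A[i][j] * x[j] for j in range(n)) - b[i])
               for i in range(n))
print(residual)
```

Conjugate gradient, like HHL, exploits the sparsity and conditioning of A; the practical difference is that CG's cost grows with the system size, while HHL's qubit register grows only logarithmically in it.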
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41930647, 41590844, 41421001 and 41971358), the Strategic Priority Research Program (A) of the Chinese Academy of Sciences (Grant No. XDA20030203), the Innovation Project of LREIS (Grant No. O88RA600YA), and the Biodiversity Investigation, Observation and Assessment Program (2019–2023) of the Ministry of Ecology and Environment of China.
Abstract: We propose a fundamental theorem for eco-environmental surface modelling (FTEEM), following the fundamental theorem for Earth's surface system modelling (FTESM), to make it easier to apply in ecology and environmental science. The Beijing-Tianjin-Hebei (BTH) region is taken as a case area for empirical studies of algorithms for spatial upscaling, spatial downscaling, spatial interpolation, data fusion and model-data assimilation, which are based on high accuracy surface modelling (HASM) and correspond to corollaries of FTEEM. The case studies demonstrate how eco-environmental surface modelling improves substantially when both extrinsic and intrinsic information are used together with an appropriate HASM method. Compared with classic algorithms, the HASM-based algorithm for spatial upscaling reduced the root-mean-square error of the BTH elevation surface by 9 m. The HASM-based algorithm for spatial downscaling reduced the relative error of future scenarios of annual mean temperature by 16%. The HASM-based algorithm for spatial interpolation reduced the relative error of the change trend of annual mean precipitation by 0.2%. The HASM-based algorithm for data fusion reduced the relative error of the change trend of annual mean temperature by 70%. The HASM-based algorithm for model-data assimilation reduced the relative error of carbon stocks by 40%. We propose five theoretical challenges and three application problems of HASM that need to be addressed to improve FTEEM.
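The abstract's central point is that combining extrinsic information (e.g., a coarse, biased model field) with intrinsic information (e.g., ground observations) beats either source alone. The toy below illustrates that point with simple inverse-variance fusion; it is a schematic stand-in, not the HASM data-fusion algorithm, and every number in it is synthetic.

```python
# Schematic illustration of data fusion: an "extrinsic" background
# field (biased, high-variance, like a coarse model output) is fused
# with "intrinsic" observations (unbiased but noisy) by
# inverse-variance weighting. The fused estimate has lower RMSE
# against the truth than either input alone.

import random

random.seed(0)
n = 2000
truth = [10.0 + 0.01 * i for i in range(n)]          # true surface values

sigma_bg, sigma_obs = 2.0, 1.0                        # assumed error std-devs
background = [t + random.gauss(0.5, sigma_bg) for t in truth]   # biased field
observed   = [t + random.gauss(0.0, sigma_obs) for t in truth]  # noisy obs

# Inverse-variance (least-squares) weight given to the observations.
w = sigma_bg**2 / (sigma_bg**2 + sigma_obs**2)
fused = [(1 - w) * bg + w * ob for bg, ob in zip(background, observed)]

def rmse(est):
    return (sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)) ** 0.5

print(rmse(background), rmse(observed), rmse(fused))
```

The same least-squares principle underlies the HASM-based fusion reported above, where the intrinsic information is the HASM surface constraint rather than a scalar variance.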
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41930647 and 62001260) and the Strategic Priority Research Program (A) of the Chinese Academy of Sciences (Grant No. XDA20030203).
Abstract: The computer advances of the past century can be traced to the increasing number of transistors on chips that has accompanied their miniaturization. However, computers are nearing the fundamental limits of such miniaturization [1]. Many practical problems require amounts of computational resources that exceed the capabilities of today's computers. A 54-qubit quantum computer, on the other hand, can solve in minutes a problem that would take a classical machine 10,000 years [2].