Various investigations have been conducted to analyze the water-coverage area of the Aral Sea and the Aral Sea Basin (ASB). However, these investigations incorporated considerable uncertainty, and the water indices they used suffered from misclassification problems, which led different research groups to report different results. We therefore first delineate the boundaries of the ASB and of the Syr and Amu river basins, together with their upper, middle, and lower reaches. We then propose a four-band index for both liquid and solid water (ILSW) to address the misclassification problems of the classic water indices. ILSW is calculated from the reflectance values of the green, red, near-infrared, and thermal-infrared bands, combining the normalized difference water index (NDWI) with land surface temperature (LST). Validation shows that ILSW achieves the highest accuracy reported so far in the Aral Sea Basin. Our results indicate that the annual average decline of the water-coverage area was 963 km² in the southern Aral Sea, whereas the northern Aral Sea experienced little change. Meanwhile, permanent ice and snow in the upper reaches of the ASB have retreated considerably, at annual rates of 6233 and 3841 km² in the upper reaches of the Amu river basin (UARB) and the Syr river basin (USRB), respectively. One major reason is that the climate of the ASB has become warmer. This climate change has caused a serious water-deficit problem: the deficit has shown an increasing trend since the 1990s, growing by 3.778 billion m³ per year on average, and the total water deficit averaged 76.967 billion m³ over the whole ASB in the 2010s. However, the upper reaches of the Syr river basin (USRB), a component area of the ASB, had a water surplus of 25.461 billion m³. These conclusions are useful for setting out a sustainable development strategy in the ASB.
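The abstract states that ILSW combines NDWI with LST over four bands but does not give the formula; as an illustration only, the standard NDWI component can be computed from the green and near-infrared reflectances, with a hypothetical LST screen for solid water. The combination rule and all thresholds below are our assumptions, not the authors' published index.

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters normalized difference water index from band reflectances."""
    return (green - nir) / (green + nir)

def ilsw_mask(green, red, nir, lst_kelvin,
              ndwi_thresh=0.0, freezing_k=273.15, bright_thresh=0.4):
    """Hypothetical liquid-and-solid-water mask combining NDWI and LST.

    Liquid water: positive NDWI.
    Solid water (ice/snow): land surface temperature at or below freezing
    together with bright visible reflectance. Thresholds are illustrative.
    """
    liquid = ndwi(green, nir) > ndwi_thresh
    solid = (lst_kelvin <= freezing_k) & ((green + red) / 2 > bright_thresh)
    return liquid | solid

# Toy pixels: open water, snow, bare soil
green = np.array([0.30, 0.70, 0.20])
red   = np.array([0.20, 0.65, 0.25])
nir   = np.array([0.05, 0.60, 0.40])
lst   = np.array([290.0, 265.0, 300.0])
print(ilsw_mask(green, red, nir, lst))
```

The point of the thermal band is visible in the second pixel: snow is bright in the visible bands (which defeats a pure NDWI test) but is caught by the sub-freezing LST condition.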
The reproduction number, R, is the average number of secondary infectious cases produced by one infectious case during a disease outbreak [1]. When a population is totally susceptible, R becomes the basic reproduction number, R₀. It is a key parameter regulating the transmission dynamics of a pandemic [2]. R₀ indicates whether the introduction of a disease will result in a localized burnout or signal the beginning of a pandemic that could move through all geographic scales [3].
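The relation between R and R₀ stated above can be made concrete: when only a fraction S/N of the population remains susceptible, the effective reproduction number scales down accordingly. A minimal sketch (the function name is ours):

```python
def effective_r(r0: float, susceptible: int, population: int) -> float:
    """Effective reproduction number R = R0 * S / N.

    When the whole population is susceptible (S == N), R equals R0.
    An outbreak grows while R > 1 and dies out once R < 1.
    """
    return r0 * susceptible / population

# A disease with R0 = 3 in a population where 40% are already immune:
r = effective_r(3.0, susceptible=600, population=1000)
print(r)  # 1.8: still above 1, so the outbreak can grow
```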
The miniaturization of transistors led to advances in computers, mainly by speeding up their computation, but this miniaturization has approached its fundamental limits. Many practical applications, however, require more computational resources than existing computers can provide. Fortunately, the development of quantum computing offers a path to solving this problem. We briefly review the history of quantum computing and highlight some of its advanced achievements. Based on current studies, the Quantum Computing Advantage (QCA) seems indisputable; the challenge is how to actualize the practical quantum advantage (PQA), and machine learning can clearly help with this task. The method used for high accuracy surface modelling (HASM) incorporates reinforced machine learning. It can be transformed into a large sparse linear system and combined with the Harrow-Hassidim-Lloyd (HHL) quantum algorithm to support quantum machine learning. HASM has been successfully used on classical computers to conduct spatial interpolation, upscaling, downscaling, data fusion, and model-data assimilation of eco-environmental surfaces. Furthermore, a training experiment on a supercomputer indicates that our HASM-HHL quantum computing approach has accuracy similar to that of classical HASM and can realize exponential acceleration over the classical algorithms. A universal platform for hybrid classical-quantum computing would be an obvious next step, along with further work to improve the approach given the many known limitations of the HHL algorithm. In addition, HASM quantum machine learning might be improved by: (1) considerably reducing the number of gates required to operate HASM-HHL; (2) evaluating cost and benchmark problems of quantum machine learning; (3) comparing the performance of the quantum and classical algorithms to clarify their advantages and disadvantages in terms of accuracy and computational speed; and (4) adding the algorithms to a cloud platform to support applications and gather active feedback from users.
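The classical core of the pipeline described above, reducing HASM to a large sparse linear system Ax = b, can be sketched with a small stand-in system; HHL targets exactly this kind of sparse, well-conditioned solve. The matrix below is a toy 1-D Laplacian, not the actual HASM operator.

```python
import numpy as np

# Toy stand-in for the sparse system A x = b that HASM produces.
# A is tridiagonal (sparse-structured); the real HASM matrix is far
# larger but similarly sparse, which is what makes it a candidate
# for the HHL quantum linear-system algorithm.
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)

x = np.linalg.solve(A, b)            # classical solve; HHL replaces this step
residual = np.linalg.norm(A @ x - b)
print(residual)
```

HHL returns a quantum state proportional to the solution vector rather than x itself, which is one of the known limitations the abstract alludes to: extracting all entries of x would forfeit the exponential speedup.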
We propose a fundamental theorem for eco-environmental surface modelling (FTEEM), following the fundamental theorem for Earth's surface system modelling (FTESM), so that it can be applied more easily in ecology and environmental science. The Beijing-Tianjin-Hebei (BTH) region is taken as a case area for empirical studies of algorithms for spatial upscaling, spatial downscaling, spatial interpolation, data fusion, and model-data assimilation, all based on high accuracy surface modelling (HASM) and corresponding to corollaries of FTEEM. The case studies demonstrate how eco-environmental surface modelling is substantially improved when both extrinsic and intrinsic information are used along with an appropriate HASM method. Compared with classic algorithms, the HASM-based algorithm for spatial upscaling reduced the root-mean-square error of the BTH elevation surface by 9 m; the HASM-based algorithm for spatial downscaling reduced the relative error of future scenarios of annual mean temperature by 16%; the HASM-based algorithm for spatial interpolation reduced the relative error of the change trend of annual mean precipitation by 0.2%; the HASM-based algorithm for data fusion reduced the relative error of the change trend of annual mean temperature by 70%; and the HASM-based algorithm for model-data assimilation reduced the relative error of carbon stocks by 40%. We propose five theoretical challenges and three application problems of HASM that need to be addressed to improve FTEEM.
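The accuracy gains quoted above are expressed as root-mean-square error and relative error; for reference, these metrics can be computed as follows (the function names and the exact relative-error norm are our choices, not the paper's):

```python
import numpy as np

def rmse(predicted, observed):
    """Root-mean-square error between two surfaces (same units as the data)."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

def relative_error(predicted, observed):
    """Relative error as a fraction of the observed magnitude."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.linalg.norm(predicted - observed) / np.linalg.norm(observed))

# Example: a modelled elevation profile versus observations (metres)
obs = [100.0, 105.0, 110.0]
mod = [101.0, 104.0, 112.0]
print(rmse(mod, obs))
```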
Surface modeling with very large data sets is challenging. An efficient method for modeling massive data sets using the high accuracy surface modeling method (HASM) is proposed, and HASM_Big is developed to handle very large data sets. A large data set is defined here as a large spatial domain at high resolution, leading to a linear system with matrix dimensions in the hundreds of thousands. An augmented system approach is employed to solve the equality-constrained least squares problem (LSE) produced in HASM_Big, and a block row-action method is applied to solve the corresponding very large matrix equations. A matrix partitioning method is used to avoid information redundancy among the blocks and thereby accelerate the model. Experiments, including numerical tests and real-world applications, are used to compare the performance of HASM_Big with its previous version, HASM. The results show that both the memory usage and the computing speed of HASM_Big are better than those of HASM, and that the computational cost of HASM_Big scales linearly, even with massive data sets. In conclusion, HASM_Big provides a powerful tool for surface modeling, especially when there are millions or more computing grid cells.
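The augmented-system approach mentioned above solves the equality-constrained least squares problem min ‖Ax − b‖₂ subject to Cx = d by assembling the KKT system. A dense toy version, without the block row-action iteration and matrix partitioning that HASM_Big adds for scale, looks like:

```python
import numpy as np

def lse_augmented(A, b, C, d):
    """Solve min ||A x - b||_2 subject to C x = d via the KKT (augmented) system.

        [ A^T A  C^T ] [x]   [A^T b]
        [ C      0   ] [l] = [d    ]

    Dense toy version; HASM_Big solves the same kind of system with block
    row-action iterations over a partitioned sparse matrix.
    """
    n, p = A.shape[1], C.shape[0]
    K = np.block([[A.T @ A, C.T],
                  [C, np.zeros((p, p))]])
    rhs = np.concatenate([A.T @ b, d])
    return np.linalg.solve(K, rhs)[:n]

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])
C = np.array([[1.0, 1.0]])
d = np.array([1.0])
x = lse_augmented(A, b, C, d)
print(x, C @ x)  # the constraint x0 + x1 = 1 holds exactly
```

Forming A^T A squares the condition number, which is one reason large-scale solvers like HASM_Big prefer iterative row-action schemes over a direct dense factorization.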
The computer advances of the past century can be traced to the increase in the number of transistors on chips that has accompanied their miniaturization. However, computers are nearing the fundamental limits of such miniaturization [1]. Many practical problems require huge amounts of computational resources that exceed the capabilities of today's computers. A 54-qubit quantum computer, on the other hand, can solve in minutes a problem that would take a classical machine 10,000 years [2].
Funding: supported by the Key Program of the National Natural Science Foundation of China (Grant No. 42230708), the Strategic Priority Research Program of the Chinese Academy of Sciences, Pan-Third Pole Environment Study for a Green Silk Road (Grant No. XDA20060303), and the K.C. Wong Education Foundation (Grant No. GJTD-2020-14).
Funding: the National Natural Science Foundation of China (41421001, 41930647, and 41590844), the Strategic Priority Research Program (A) of the Chinese Academy of Sciences (XDA20030203), the Innovation Project of the State Key Laboratory of Resources and Environment Information System (O88RA600YA), and the Biodiversity Investigation, Observation and Assessment Program (2019–2023) of the Ministry of Ecology and Environment of China.
Funding: supported by the Open Research Program of the International Research Center of Big Data for Sustainable Development Goals (Grant No. CBAS2022ORP02), the National Natural Science Foundation of China (Grant Nos. 41930647 and 72221002), and the Key Project of Innovation LREIS (Grant No. KPI005).
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 41930647, 41590844, 41421001 & 41971358), the Strategic Priority Research Program (A) of the Chinese Academy of Sciences (Grant No. XDA20030203), the Innovation Project of LREIS (Grant No. O88RA600YA), and the Biodiversity Investigation, Observation and Assessment Program (2019–2023) of the Ministry of Ecology and Environment of China.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 41541010, 41701456, 41421001, 41590840 & 91425304), the Key Programs of the Chinese Academy of Sciences (Grant No. QYZDY-SSW-DQC007), and the Cultivate Project of the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences (Grant No. TSYJS03).
Funding: supported by the National Natural Science Foundation of China (41930647 and 62001260) and the Strategic Priority Research Program (A) of the Chinese Academy of Sciences (XDA20030203).