Funding: Supported by the National Key Basic Research Development Plan (2010CB428602).
Abstract: Using data from 169 sounding stations around the world, NCEP/NCAR reanalysis data were evaluated, and the distribution characteristics of the standard errors of the geopotential height, temperature, and wind speed fields from the upper troposphere to the lower stratosphere (mostly over land) were analyzed. The results showed that the standard error distribution of the reanalysis wind speed field was mainly affected by the jet stream zone, where there was an obvious difference from the actual wind field. The distribution of the standard error of the wind speed field showed a clear seasonal difference between winter and summer, and the average deviation was larger near coastlines. The high-value zones of the standard errors of the reanalysis geopotential height and temperature fields were mainly concentrated in the low-latitude region of the Eastern Hemisphere (along the Indian Ocean coast). The distribution of the standard error was basically consistent with that of the average error, so the standard error could be largely explained by the average error. The standard errors of the reanalysis temperature and geopotential height data were lower in inland zones; the high-value zones were mainly distributed along coastlines, and the average error of the wind speed field was also larger near coastlines. This is closely related to the quality of the sounding-station data, to regional differences, and to the fact that land observation stations are dense while ocean observation stations are sparse.
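To make the kind of comparison described above concrete, here is a minimal sketch, assuming paired reanalysis values and sounding observations at a single station and pressure level; the numbers and variable names are purely illustrative. It computes the average error (bias) and a standard error of the reanalysis against the observations:

```python
import numpy as np

def error_stats(reanalysis, observed):
    """Mean error (bias) and standard error of reanalysis minus observation."""
    diff = np.asarray(reanalysis, float) - np.asarray(observed, float)
    mean_error = diff.mean()
    # "standard error" here taken as the sample standard deviation of the differences
    standard_error = diff.std(ddof=1)
    return mean_error, standard_error

# Hypothetical 500 hPa geopotential heights (gpm) at one sounding station
obs = np.array([5632.0, 5641.0, 5655.0, 5620.0, 5648.0])
rea = np.array([5636.0, 5639.0, 5660.0, 5627.0, 5650.0])
bias, se = error_stats(rea, obs)
print(f"mean error = {bias:.2f} gpm, standard error = {se:.2f} gpm")
```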
Funding: We would like to express our gratitude to the Ministry of Human Resource Development, Govt. of India, for providing financial support during this study period.
Abstract: Exclusion from the mainstream financial world is a burden on the poor of many countries. The proliferation of new mobile and online financial services, such as e-banking, money transfers, and payment processing, has the potential to provide access to basic financial products and services to financially excluded people. The purpose of this study was to investigate the effects of the growth of mobile phone and Internet use on financial inclusion in the South Asian Association for Regional Cooperation (SAARC) countries from 2004 to 2014. We applied principal component analysis to construct a financial inclusion index that served as a proxy variable for the accessibility of financial services in the SAARC countries. Using three different models (the fixed effect, random effect, and panel-corrected standard errors models), this study found a positive and significant relationship between the growth of financial inclusion and the expansion of both mobile phone and Internet services. Moreover, an empirical study of the control variables showed that levels of income and education were positively associated with financial inclusion, whereas the size of the rural population and unemployment were negatively related to financial inclusion. In addition, the empirical estimates posit a unidirectional causal flow from the growth of mobile and Internet services to expanded financial inclusion in the SAARC countries.
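As an illustration of the index-construction step, the sketch below builds a composite index from the first principal component of a few standardized access indicators. The indicator set and values are hypothetical (the paper's exact variables are not listed here), and the sign of the component is arbitrary:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical country-year indicators of financial access
# (e.g., bank branches, ATMs, and deposit accounts per 1,000 adults)
X = np.array([
    [ 8.2, 10.1, 310.0],
    [ 9.0, 12.4, 355.0],
    [ 4.1,  5.3, 120.0],
    [ 5.0,  6.8, 150.0],
    [12.3, 20.5, 610.0],
])

Xs = StandardScaler().fit_transform(X)      # standardize each indicator
pca = PCA(n_components=1).fit(Xs)
index = pca.transform(Xs).ravel()           # first principal component as the index
# The component's sign is arbitrary; flip it if needed so higher means more inclusion.
print("explained variance ratio:", pca.explained_variance_ratio_[0])
print("financial inclusion index:", np.round(index, 3))
```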
Funding: National Natural Science Foundation projects (31860211), China Postdoctoral Science Foundation Project (2019M653807XB), National Key Research and Development Project of China (2017YFC0504003), Inner Mongolia Agricultural University High-Level Talent Introduction Project (206039), and Inner Mongolia Agricultural University Postdoctoral Fund (108950).
Abstract: The spatial heterogeneity of fuel moisture content determines the spread rate and direction of a forest fire. Research on the spatial heterogeneity of the moisture content of dead fuel of Larix gmelinii Rupr. showed that: (1) fuel moisture content increased in the order litter layer < semi-humus layer < humus layer, and the coefficient of variation decreased with sampling depth; (2) the sill value of the semi-humus layer was the highest, that of the humus layer intermediate, and that of the litter layer the smallest, so overall the spatial heterogeneity of the semi-humus layer was the highest; the humus layer in the slant direction and all three layers in the vertical direction showed strong spatial correlation, with the lowest nugget coefficient being 0.0968; (3) the fuel moisture content of the humus layer showed strong spatial anisotropy; and (4) when estimating the total moisture content of the sampling site by simulated sampling, reasonable control of the sampling interval and increased sampling intensity can reduce the error; when the sampling intensity is increased to more than 16 and the sampling interval to 3 m, the standard error is below 15%. The spatial heterogeneity of fuel moisture content is best revealed by increasing sampling density and by sampling in different fire seasons and in different slope directions and positions. The results can provide a scientific basis for forest fire prediction and prevention.
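As a hedged sketch of the geostatistical quantity behind the sill and nugget figures quoted above, the code below computes the classical (Matheron) empirical semivariogram along a one-dimensional transect. The transect spacing and moisture values are invented for illustration; a nugget coefficient would then come from fitting a variogram model and taking the nugget-to-sill ratio:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) semivariance for each lag bin along a 1-D transect."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.abs(coords[:, None] - coords[None, :])          # pairwise distances
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2    # half squared differences
    upper = np.triu(np.ones_like(d, bool), 1)              # each pair counted once
    gamma = []
    for h in lags:
        mask = (d > h - tol) & (d <= h + tol) & upper
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical moisture contents (%) sampled every 3 m along a transect
x = np.arange(0, 30, 3.0)
m = np.array([21.0, 23.5, 22.8, 26.0, 25.1, 28.3, 27.0, 30.2, 29.5, 31.0])
lags = np.array([3.0, 6.0, 9.0, 12.0])
print(np.round(empirical_semivariogram(x, m, lags, tol=1.5), 2))
```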
Abstract: In recent years there has been increasing interest in developing spatial statistical models for data sets that are seemingly spatially independent. This lack of spatial structure makes it difficult, if not impossible, to use optimal predictors such as ordinary kriging to model the spatial variability in the data. In many instances, the data still contain a wealth of information that could be used to gain flexibility and precision in estimation. In this paper we propose using a combination of regression analysis, to describe the large-scale spatial variability in a set of survey data, and a tree-based stratification design, to enhance the estimation of the small-scale spatial variability. With this approach, sample units (i.e., pixels of a satellite image) are classified with respect to predictions of error attributes into homogeneous classes, and the classes are then used as strata in the stratified analysis. Independent variables used as a basis for stratification included terrain data and satellite imagery. A decision rule was used to identify a tree size that minimized the error in estimating the variance of the mean response and the prediction uncertainties at new spatial locations. This approach was applied to a set of n = 937 forested plots from a state-wide inventory conducted in 2006 in the Mexican State of Jalisco. The final models accounted for 62% to 82% of the variability observed in canopy closure (%), basal area (m²·ha⁻¹), cubic volume (m³·ha⁻¹), and biomass (t·ha⁻¹) on the sample plots. The spatial models provided unbiased estimates, and when averaged over all sample units in the population, estimates of forest structure were very close to those obtained using classical estimators based on the sampling strategy used in the state-wide inventory. The spatial models also provided unbiased estimates of model variances, leading to confidence and prediction coverage rates close to the 0.95 nominal rate.
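The following toy sketch, with simulated data standing in for the terrain and imagery covariates, illustrates the general idea of a regression for the large-scale trend plus tree leaves as strata for a stratified estimate of the mean and its standard error. It is not the paper's model, only a minimal rendering of the tree-based stratification concept:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 400
X = rng.uniform(size=(n, 3))                                  # stand-ins for terrain / imagery covariates
y = 50 + 30 * X[:, 0] - 20 * X[:, 1] + rng.normal(0, 5, n)    # stand-in forest attribute

reg = LinearRegression().fit(X, y)            # large-scale trend
resid = y - reg.predict(X)                    # "error attribute" used to form strata

tree = DecisionTreeRegressor(max_leaf_nodes=4, random_state=0).fit(X, resid)
stratum = tree.apply(X)                       # leaf id = stratum label

# Stratified estimate of the mean and its standard error
# (strata weights taken as sample proportions, i.e., proportional allocation)
means, variances, weights = [], [], []
for s in np.unique(stratum):
    ys = y[stratum == s]
    means.append(ys.mean())
    variances.append(ys.var(ddof=1) / len(ys))
    weights.append(len(ys) / n)
w = np.array(weights)
strat_mean = np.sum(w * np.array(means))
strat_se = np.sqrt(np.sum(w**2 * np.array(variances)))
print(f"stratified mean = {strat_mean:.2f} +/- {strat_se:.2f}")
```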
Abstract: We contrast a new continuous approach (CA) for estimating plot-level above-ground biomass (AGB) in forest inventories with the current approach of estimating AGB exclusively from the tree-level AGB predicted for each tree in a plot, henceforth called the discrete approach (DA). With the CA, the AGB in a forest is modelled as a continuous surface, and the AGB estimate for a fixed-area plot is computed as the integral of the AGB surface taken over the plot area. Hence, with the CA, the portion of the biomass of in-plot trees that extends across the plot perimeter is ignored, while the biomass of trees outside the plot that reaches inside the plot is added. We use a sampling simulation with data from a fully mapped two-hectare area to illustrate that important differences in plot-level AGB estimates can emerge. Ideally, CA-based estimates of mean AGB should be less variable than those derived from the DA. If realized, this difference translates into higher precision from field sampling, or a lower required sample size. In our case study, with a target precision of 5% (i.e., relative standard error of the estimated mean AGB), the CA required a 27.1% lower sample size for small plots of 100 m² and a 10.4% lower sample size for larger plots of 1700 m². We examined sampling-induced errors only and did not yet consider model errors. We discuss practical issues in implementing the CA in field inventories and its potential in applications that model biomass with remote sensing data. The CA is a variation on a plot design for above-ground forest biomass; as such, it can be applied in combination with any forest inventory sampling design.
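To show how a reduction in between-plot variability translates into the quoted kind of sample-size saving, here is a back-of-the-envelope calculation under simple random sampling, ignoring the finite-population correction. The coefficients of variation are hypothetical and chosen only to illustrate the mechanism, not taken from the paper:

```python
import math

def required_n(cv_percent, target_rse_percent):
    """Simple-random-sampling sample size for a target relative standard error."""
    return math.ceil((cv_percent / target_rse_percent) ** 2)

# Hypothetical between-plot coefficients of variation of plot AGB (%)
cv_da, cv_ca = 85.0, 72.6   # discrete vs continuous approach (illustrative values only)
n_da = required_n(cv_da, 5.0)
n_ca = required_n(cv_ca, 5.0)
print(n_da, n_ca, f"{100 * (1 - n_ca / n_da):.1f}% fewer plots")
```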
Abstract: It is now recognized that many geomaterials have nonlinear failure envelopes. This non-linearity is most marked at lower stress levels, the failure envelope being of quasi-parabolic shape. It is not easy to calibrate these nonlinear failure envelopes from triaxial test data. Currently, only the power-type failure envelope has been studied with an established formal procedure for its determination from triaxial test data. In this paper, a simplified procedure is developed for four different types of nonlinear envelope. These are of invaluable assistance in the evaluation of true factors of safety in problems of slope stability and in the correct computation of lateral earth pressure and bearing capacity. The use of the Mohr-Coulomb failure envelope leads to an overestimation of the factors of safety and other geotechnical quantities.
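As an illustration of calibrating a power-type envelope from strength data, the sketch below fits tau = A * sigma^b by least squares in log space and contrasts it with a linear Mohr-Coulomb fit. The stress-strength pairs are invented, and the paper's own procedure for the other envelope types is not reproduced here:

```python
import numpy as np

# Hypothetical normal stress / shear strength pairs at failure (kPa)
sigma_n = np.array([25.0, 50.0, 100.0, 200.0, 400.0])
tau_f   = np.array([38.0, 60.0, 95.0, 150.0, 238.0])

# Fit a power-type envelope tau = A * sigma^b by linear regression in log space
b, logA = np.polyfit(np.log(sigma_n), np.log(tau_f), 1)
A = np.exp(logA)
print(f"power-type envelope: tau ~ {A:.2f} * sigma^{b:.3f}")

# Compare with a linear Mohr-Coulomb fit tau = c + sigma * tan(phi)
slope, c = np.polyfit(sigma_n, tau_f, 1)
print(f"Mohr-Coulomb fit:    tau ~ {c:.1f} + {slope:.3f} * sigma")
```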
Abstract: The Heckman sample selection model (PSSM) has been widely adopted in labour studies. The model contains exogenous, endogenous, and standard error variables. However, it is constantly exposed to high inaccuracy in its estimation results. Therefore, to obtain accurate and precise estimates, a bootstrap approach is introduced. The bootstrap approach is hybridized with the PSSM model, giving a model known as the BPSSM, to achieve more precise estimation results. The BPSSM is then applied to the Malaysian Population and Family Survey 1994 (MPFS-1994) data. The results showed that the BPSSM provides smaller standard errors and shorter confidence intervals.
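A minimal sketch of the generic nonparametric bootstrap step that underlies such a hybrid: resample the data with replacement, re-estimate the parameter, and read off the standard error and a percentile interval. A simple regression slope stands in for one PSSM coefficient; this is not the full BPSSM estimator:

```python
import numpy as np

def bootstrap_se(estimator, data, n_boot=1000, seed=0):
    """Nonparametric bootstrap standard error and 95% percentile interval."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([estimator(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    return stats.std(ddof=1), np.percentile(stats, [2.5, 97.5])

# Toy example: slope of y on x (stand-in for one model coefficient)
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1.5 * x + rng.standard_t(df=3, size=200)      # heavy-tailed noise
data = np.column_stack([x, y])
slope = lambda d: np.polyfit(d[:, 0], d[:, 1], 1)[0]
se, ci = bootstrap_se(slope, data)
print(f"bootstrap SE = {se:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```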
Funding: A part of the projects titled "Development of Korea Operational Oceanographic System (KOOS), Phase 2", "Construction of Ocean Research Stations and their Application Studies", and "Development of Environmental Information System for NSR Navigation", funded by the Ministry of Oceans and Fisheries, Korea, and "Development of fundamental technology for coastal erosion control" of KIOST.
Abstract: Sea surface temperature (SST) data obtained from the initial version of the Korea Operational Oceanographic System (KOOS) SST satellite product have low accuracy during summer and daytime. This is attributed to the diurnal warming effect. Error estimation of the SST data must be carried out before they are used in the real-time forecasting numerical model of the KOOS. This study suggests two quality control methods for the KOOS SST system. To minimize the diurnal warming effect, only SSTs from areas where the wind speed is higher than 5 m/s were used. With this wind threshold applied, KOOS SSTs for August 2014 were reduced by 0.15°C. Errors in SST data are considered to be a combination of random, sampling, and bias errors. To estimate the bias error, the standard deviation of the bias between KOOS SSTs and climatological SSTs was used. The KOOS SST data yielded an analysis error standard deviation similar to that of the OSTIA and NOAA NCDC (OISST) data. The KOOS SST shows lower random and sampling errors as the number of observations increases when using six satellite datasets. In further studies, the proposed quality control methods for the KOOS SST system will be applied through longer-term case studies and comparisons with other SST systems.
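A hedged sketch of the two quality-control ideas mentioned above: discard matchups where the wind speed is at or below 5 m/s (to suppress diurnal warming), and summarize the bias against climatology by its mean and standard deviation. All numbers are invented for illustration:

```python
import numpy as np

def qc_sst(sst, wind_speed, climatology, wind_threshold=5.0):
    """Keep SSTs where wind exceeds the threshold; return bias statistics vs climatology."""
    keep = wind_speed > wind_threshold               # suppress diurnal-warming-prone samples
    bias = sst[keep] - climatology[keep]
    return sst[keep], bias.mean(), bias.std(ddof=1)  # mean bias and its standard deviation

# Hypothetical matchups (°C and m/s)
sst  = np.array([26.1, 27.8, 25.4, 28.9, 26.5, 27.2])
wind = np.array([ 6.2,  3.1,  7.5,  2.0,  5.8,  6.9])
clim = np.array([25.8, 26.9, 25.2, 27.5, 26.4, 26.8])
kept, mean_bias, bias_sd = qc_sst(sst, wind, clim)
print(len(kept), f"mean bias = {mean_bias:.2f} °C, bias SD = {bias_sd:.2f} °C")
```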
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 12271294, 12171225, and 12071248.
Abstract: When there are outliers or heavy-tailed distributions in the data, the traditional least squares method with a penalty function is no longer applicable. In addition, with the rapid development of science and technology, a great deal of data with high dimension, strong correlation, and redundancy has been generated in real life. It is therefore necessary to find an effective variable selection method that handles collinearity and is built on a robust estimator. This paper proposes a penalized M-estimation method based on a standard-error-adjusted adaptive elastic net, which uses M-estimators and their corresponding standard errors as weights. The consistency and asymptotic normality of this method are proved theoretically. For regularization in high-dimensional spaces, the authors use the multi-step adaptive elastic net to reduce the dimension to a relatively large scale that is less than the sample size, and then use the proposed method to select variables and estimate parameters. Finally, the authors carry out simulation studies and two real-data analyses to examine the finite-sample performance of the proposed method. The results show that the proposed method has advantages over other commonly used methods.
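The sketch below is one plausible reading of the weighting idea, not the authors' exact estimator: fit a Huber M-estimator, form standard-error-adjusted adaptive weights from its coefficients and a rough standard-error proxy, and apply them to an elastic net via the usual column-rescaling trick (which only approximates the weighted L2 part):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, ElasticNet

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 0.0, -2.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)           # heavy-tailed errors

# Step 1: robust initial fit (Huber M-estimator) and a crude coefficient-SE proxy
init = HuberRegressor().fit(X, y)
resid_scale = np.median(np.abs(y - init.predict(X))) / 0.6745
se = resid_scale / (np.sqrt(n) * X.std(axis=0))            # assumes roughly orthogonal predictors

# Step 2: standard-error-adjusted adaptive weights (one plausible choice, not the paper's)
w = 1.0 / (np.abs(init.coef_) / se + 1e-8)

# Step 3: adaptive elastic net via the usual rescaling trick, then transform back
enet = ElasticNet(alpha=0.1, l1_ratio=0.7, max_iter=10000).fit(X / w, y)
beta_hat = enet.coef_ / w
print(np.round(beta_hat, 2))
```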
Funding: Researchers Supporting Project number (RSP2024R87), King Saud University, Riyadh, Saudi Arabia.
Abstract: The impact of hydro energy production, economic complexity, urbanization, technological innovation, and financial development on environmental sustainability between 1995 and 2017 is examined for a panel of thirteen Asian economies using two environmental proxies: their ecological footprint and CO₂ emissions. The non-parametric Driscoll-Kraay standard error method and the Dumitrescu-Hurlin panel causality test are applied to the data. Our findings show that hydro energy production and technological innovation have a significant negative impact on the environmental proxies, thus promoting environmental sustainability. Economic complexity significantly lowers environmental sustainability, while the non-linear effect of economic complexity favors environmental sustainability; this confirms the existence of an economic-complexity-based inverted-U-shaped environmental Kuznets curve hypothesis. Moreover, urbanization and financial development significantly decrease environmental sustainability. The results of our study confirm a feedback causality between hydro energy production and carbon dioxide emissions. We recommend expansionary policies regarding hydro energy production that are beneficial for substituting fossil fuel energy. This paves a path towards environmental sustainability in this era of global boiling.
Funding: BZ's research was supported, in part, by the National Institutes of Health grant U24 AA026968 and by the University of Massachusetts Center for Clinical and Translational Science grants UL1TR001453, TL1TR01454, and KL2TR01455.
Abstract: Standard deviation (SD) and standard error of the mean (SEM) have been widely used as error bars in scientific plots. Unfortunately, there is no universally accepted principle addressing which of these two measures should be used. Here we seek to fill this gap by outlining the reasoning for choosing SEM over SD, and we hope to shed light on this unsettled disagreement within the biomedical community. The utility of SEM and SD as error bars is further discussed by examining the figures and plots published in two research articles on pancreatic disease.
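For concreteness, a minimal example of the two quantities using hypothetical replicate measurements; the SEM is simply the SD divided by the square root of the sample size:

```python
import numpy as np

measurements = np.array([5.1, 4.8, 5.4, 5.0, 4.7, 5.3, 5.2, 4.9])  # hypothetical replicates
sd = measurements.std(ddof=1)              # spread of the individual observations
sem = sd / np.sqrt(measurements.size)      # uncertainty of the sample mean
print(f"mean = {measurements.mean():.2f}, SD = {sd:.2f}, SEM = {sem:.2f}")
```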
Abstract: For applying the perfect code to transmit quantum information over a noisy channel, the standard protocol contains four steps: encoding, the noisy channel, the error-correction operation, and decoding. In the present work, we show that this protocol can be simplified: the error-correction operation is not necessary if the decoding is realized by the so-called complete unitary transformation. We also offer a quantum circuit that can correct arbitrary single-qubit errors.
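The paper's five-qubit perfect code is not reproduced here; instead, the sketch below walks through the standard four-step protocol (encode, noise, correct, decode) with the much simpler three-qubit bit-flip code and plain statevector bookkeeping, purely to make the pipeline concrete:

```python
import numpy as np

def x_on(state, qubit, n=3):
    """Apply a Pauli-X on `qubit` (0 = leftmost) to an n-qubit statevector."""
    out = np.empty_like(state)
    mask = 1 << (n - 1 - qubit)
    for i in range(2 ** n):
        out[i ^ mask] = state[i]
    return out

# Encode a|0> + b|1> into the 3-qubit repetition code a|000> + b|111>
a, b = 0.6, 0.8
logical = np.zeros(8, complex)
logical[0b000], logical[0b111] = a, b

corrupted = x_on(logical, 1)                # single bit-flip error on qubit 1

# Syndrome from the Z0Z1 and Z1Z2 parities of any supported basis state
idx = np.flatnonzero(np.abs(corrupted) > 1e-12)[0]
q0, q1, q2 = (idx >> 2) & 1, (idx >> 1) & 1, idx & 1
s1, s2 = q0 ^ q1, q1 ^ q2
error_qubit = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s1, s2)]

# Recovery (the step the paper shows can be folded into the decoding unitary)
recovered = x_on(corrupted, error_qubit) if error_qubit is not None else corrupted
print(np.allclose(recovered, logical))      # True: the logical state is restored
```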