The purpose of this paper is to obtain the expression of the variance of the sample mean difference for the Student's distribution model. In 2007, after some decades, the study of the mean difference variance was resumed by Campobasso [1]. Using Nair's [2] and Lomnicki's [3] general results, he obtained the variance of the sample mean difference for several distribution models (Laplace's, triangular, power, logit, Pareto's and Gumbel's), extending the results already known for other distribution models (normal, rectangular and exponential).
The Gobi spans a large area of China, surpassing the combined expanse of mobile and semi-fixed dunes, and its presence significantly influences the movement of sand and dust. However, the complex origins and diverse materials constituting the Gobi result in notable differences in saltation processes across various Gobi surfaces, which are difficult to describe with a uniform morphology. It therefore becomes imperative to characterize the surface through parameters such as the three-dimensional (3D) size and shape of gravel; collecting morphological information on Gobi gravels is essential for studying their genesis and sand saltation. To improve the efficiency and information yield of gravel parameter measurements, this study conducted field experiments in March 2023 in the Gobi region across Dunhuang City, Guazhou County, and Yumen City (administered by Jiuquan City), Gansu Province, China. A research framework and methodology for measuring 3D parameters of gravel from point clouds were developed, alongside improved calculation formulas for 3D parameters including gravel grain size, volume, flatness, roundness, sphericity, and equivalent grain size. Multi-view geometry for 3D reconstruction allowed an optimal data acquisition scheme with high point cloud reconstruction efficiency and quality to be established. The methodology further incorporated point cloud clustering, segmentation, and filtering techniques to isolate individual gravel point clouds, and then deployed advanced point cloud algorithms, including the Oriented Bounding Box (OBB), point cloud slicing, and point cloud triangulation, to calculate the 3D parameters of individual gravels. These systematic processes allow precise and detailed characterization of individual gravels. For gravel grain size and volume, the correlation coefficients between point cloud and manual measurements all exceeded 0.9000, confirming the feasibility of the proposed methodology. The workflow yields accurate calculations of the relevant parameters for Gobi gravels, providing essential data support for subsequent studies of Gobi environments.
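The OBB-based size computation described in this abstract can be sketched with a PCA-aligned bounding box. This is a generic illustration, not the authors' algorithm: the synthetic "gravel" is a made-up box of points, and the flatness index shown is one common definition rather than the paper's improved formula.

```python
import numpy as np

def obb_axes(points):
    """Approximate the three grain-size axes (long, intermediate, short)
    of a gravel point cloud via a PCA-aligned oriented bounding box."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Principal directions of the point cloud give the OBB orientation.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    aligned = centered @ vt.T          # rotate into the OBB frame
    extents = aligned.max(axis=0) - aligned.min(axis=0)
    return np.sort(extents)[::-1]      # a >= b >= c

# Synthetic "gravel": a 40 x 20 x 10 box of points, arbitrarily rotated.
rng = np.random.default_rng(0)
box = rng.uniform([-20, -10, -5], [20, 10, 5], size=(5000, 3))
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
a, b, c = obb_axes(box @ rot.T)
flatness = (a + b) / (2 * c)           # one common flatness index, not the paper's
```

On a real scan, the same routine would run on each segmented single-gravel point cloud produced by the clustering and filtering steps.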
In general, the accuracy of the mean estimator can be improved by stratified random sampling. In this paper, we show, in contrast to purely empirical methods, that the accuracy can be further improved through the bootstrap resampling method under some conditions. The determination of sample size by the bootstrap method is also discussed, and a simulation is carried out to verify the accuracy of the proposed method. The simulation results show that the sample size based on bootstrapping is smaller than that based on the central limit theorem.
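The combination of a stratified mean estimator with within-stratum bootstrap resampling can be sketched as follows. The population, stratum sizes and sample sizes are invented for illustration; this is not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(42)

# Three hypothetical strata with different means and spreads.
strata = [rng.normal(10, 1, 200), rng.normal(20, 2, 300), rng.normal(35, 4, 500)]
N = sum(len(s) for s in strata)
weights = [len(s) / N for s in strata]

def stratified_mean(samples):
    # Weighted combination of per-stratum sample means.
    return sum(w * s.mean() for w, s in zip(weights, samples))

def bootstrap_se(samples, B=2000):
    """Resample within each stratum to estimate the SE of the stratified mean."""
    est = np.empty(B)
    for i in range(B):
        resampled = [rng.choice(s, size=len(s), replace=True) for s in samples]
        est[i] = stratified_mean(resampled)
    return est.std(ddof=1)

# Draw a stratified sample of n_h = 30 from each stratum, then bootstrap.
sample = [rng.choice(s, size=30, replace=False) for s in strata]
point = stratified_mean(sample)
se = bootstrap_se(sample)
```

A sample-size rule in this spirit would shrink or grow the n_h until the bootstrap standard error falls below a target, rather than relying on a normal-approximation formula.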
In this paper, auxiliary information is used to determine an estimator of the finite population total using nonparametric regression under stratified random sampling. To achieve this, a model-based approach is adopted, using local polynomial regression estimation to predict the nonsampled values of the survey variable y. The performance of the proposed estimator is investigated against some design-based and model-based regression estimators. The simulation experiments show that the resulting estimator exhibits good properties: good confidence intervals are generally seen for the nonparametric regression estimators, and use of the proposed estimator leads to relatively smaller values of RE than the other estimators.
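The model-based idea, predicting the nonsampled y values with a local polynomial fit and adding them to the observed ones, can be sketched with a local linear (degree-1) smoother. The population, sample size and bandwidth here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def local_linear(x_train, y_train, x_eval, h=0.05):
    """Local linear (degree-1 local polynomial) regression, Gaussian kernel."""
    preds = np.empty(len(x_eval))
    for i, x0 in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)   # kernel weights
        sw = np.sqrt(w)
        # Weighted least squares of y on (1, x - x0); intercept = fit at x0.
        X = np.column_stack([np.ones_like(x_train), x_train - x0])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y_train * sw, rcond=None)
        preds[i] = beta[0]
    return preds

rng = np.random.default_rng(1)
N = 1000
x = rng.uniform(0, 1, N)                                # auxiliary variable
y = np.sin(2 * np.pi * x) + 2 + rng.normal(0, 0.1, N)   # survey variable
sampled = rng.choice(N, size=200, replace=False)
mask = np.zeros(N, dtype=bool)
mask[sampled] = True

# Model-based total: observed y's plus predictions at nonsampled x's.
y_hat = local_linear(x[mask], y[mask], x[~mask])
total_hat = y[mask].sum() + y_hat.sum()
```

Under stratified sampling the same prediction step would simply be carried out stratum by stratum before summing.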
In this paper, the problem of nonparametric estimation of the finite population quantile function using a multiplicative bias correction technique is considered. A robust estimator of the finite population quantile function based on multiplicative bias correction is derived with the aid of a superpopulation model. Most studies have concentrated on kernel smoothers for the estimation of regression functions, and this technique has also been applied to the various methods of nonparametric estimation of the finite population quantile already reviewed. A major problem with nonparametric kernel-based regression over a finite interval, such as the estimation of finite population quantities, is bias at the boundary points. By correcting the boundary problems associated with previous model-based estimators, the multiplicative bias corrected estimator produces better results in estimating the finite population quantile function. Furthermore, the asymptotic behavior of the proposed estimator is presented: it is asymptotically unbiased and statistically consistent when certain conditions are satisfied. The simulation results show that the suggested estimator performs quite well in terms of relative bias, mean squared error, and relative root mean error. As a result, the multiplicative bias corrected estimator is strongly recommended for survey sampling estimation of the finite population quantile function.
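The multiplicative bias correction idea, fitting a pilot kernel smoother and then smoothing the ratios of observations to pilot fits, can be sketched generically. The Nadaraya-Watson smoother, bandwidth, and test function below are assumptions for illustration; the paper's estimator targets the quantile function rather than a regression curve.

```python
import numpy as np

def nw(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d * d)
    return (w @ y_train) / w.sum(axis=1)

def mbc(x_train, y_train, x_eval, h):
    """Multiplicative bias correction: pilot fit, then smooth the ratios
    y_i / m_pilot(x_i) and multiply back (requires positive y)."""
    pilot_eval = nw(x_train, y_train, x_eval, h)
    pilot_train = nw(x_train, y_train, x_train, h)
    ratio = nw(x_train, y_train / pilot_train, x_eval, h)
    return pilot_eval * ratio

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 400))
true_m = lambda t: 2 + t**2            # a positive regression function
y = true_m(x) + rng.normal(0, 0.05, 400)
grid = np.linspace(0, 1, 50)
plain = nw(x, y, grid, h=0.1)
corrected = mbc(x, y, grid, h=0.1)
truth = true_m(grid)
```

The point of the sketch is visible at the right boundary: the plain smoother averages only over points to the left of x = 1 and is pulled down, while the ratio step largely cancels that boundary bias.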
This research aims to develop a model to enhance the diagnosis of lymphatic diseases using a random forest ensemble machine-learning method trained with a simple sampling scheme. The study was carried out in two major phases: feature selection and classification. In the first stage, a number of discriminative features out of 18 were selected using PSO and several other feature selection techniques to reduce the feature dimension. In the second stage, we applied the random forest ensemble classification scheme to diagnose lymphatic diseases. In experiments with the selected features, we used both the original and resampled distributions of the dataset to train the random forest classifier. Experimental results demonstrate that the proposed method achieves a remarkable improvement in classification accuracy.
Image matching refers to the process of matching two or more images obtained at different times, from different sensors, or under different conditions through a large number of feature points in the images. At present, image matching is widely used in target recognition and tracking, as well as indoor positioning and navigation. Local features, however, are often missing in color images taken in dark light, greatly reducing the number of extracted feature points, degrading image matching, and even causing target recognition to fail. An unsharp masking (USM) based denoising model is established and a local adaptive enhancement algorithm is proposed to achieve feature point compensation by strengthening the local features of the dark image, effectively increasing the amount of image information. The fast library for approximate nearest neighbors (FLANN) and random sample consensus (RANSAC) algorithms are used for image matching. Experimental results show that the number of effective feature points obtained by the proposed algorithm from images in dark environments is increased, and the accuracy of image matching is improved markedly.
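The RANSAC consensus step used here to reject mismatches can be illustrated on the simplest model, a 2D line, rather than the homography used in real image matching. The data, thresholds and iteration count are made up for the sketch.

```python
import numpy as np

def ransac_line(pts, iters=200, tol=0.05, rng=None):
    """Fit y = a*x + b by RANSAC: repeatedly fit a line to 2 random points
    and keep the model with the largest inlier consensus, then refit."""
    rng = rng or np.random.default_rng(0)
    best_inliers, best_model = None, None
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:
            continue                       # degenerate minimal sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        resid = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (a, b)
    # Refit by least squares on the consensus set only.
    a, b = np.polyfit(pts[best_inliers, 0], pts[best_inliers, 1], 1)
    return a, b, best_inliers

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 300)
y = 2.0 * x + 1.0 + rng.normal(0, 0.02, 300)            # true correspondences
outliers = rng.uniform([0, -20], [10, 20], size=(100, 2))  # gross mismatches
pts = np.vstack([np.column_stack([x, y]), outliers])
a, b, inliers = ransac_line(pts)
```

In feature matching, the minimal sample is 4 point pairs for a homography instead of 2 points for a line, but the sample-score-refit loop is the same.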
Unlike the height-diameter equations for standing trees commonly used in forest resources modelling, tree height models for cut-to-length (CTL) stems tend to produce prediction errors whose distributions are not conditionally normal but rather leptokurtic and heavy-tailed. This feature was merely noticed in previous studies but never thoroughly investigated. This study characterized the prediction error distribution of a newly developed tree height model of this kind for Pinus radiata (D. Don) through the three-parameter Burr Type XII (BXII) distribution. The model's prediction errors (ε) exhibited heteroskedasticity conditional mainly on the small-end relative diameter of the top log and, to a minor extent, on DBH. Structured serial correlations were also present in the data. A total of 14 candidate weighting functions were compared to select the best two for weighting ε in order to reduce its conditional heteroskedasticity. The weighted prediction errors (εw) were shifted by a constant into the positive range supported by the BXII distribution. The distribution of the weighted and shifted prediction errors (εw+) was then characterized by the BXII distribution using maximum likelihood estimation through 1000 repetitions of random sampling, fitting and goodness-of-fit testing, each time randomly taking only one observation from each tree to circumvent the potential adverse impact of serial correlation on parameter estimation and inference. The nonparametric two-sample Kolmogorov-Smirnov (KS) goodness-of-fit test and the closely related Kuiper's (KU) test showed that the fitted BXII distributions provided a good fit to the highly leptokurtic and heavy-tailed distribution of ε. Random samples generated from the fitted BXII distributions of εw+ derived from the best two weighting functions, when back-shifted and unweighted, exhibited distributions that were, in about 97% and 95% of the 1000 cases respectively, not statistically different from the distribution of ε. Our results for cut-to-length P. radiata stems represent the first case for any tree species in which a non-normal error distribution in tree height prediction has been described by an underlying probability distribution. The fitted BXII prediction error distribution will help to unlock the full potential of the new tree height model in forest resources modelling of P. radiata plantations, particularly when uncertainty assessments, statistical inference and error propagation are needed in research and practical applications through harvester data analytics.
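The Burr XII machinery used above, its CDF, inverse-transform sampling from it, and a KS-type comparison of the empirical and theoretical CDFs, can be sketched with arbitrary illustrative parameters (c, k, scale chosen for the demo, not fitted to the paper's data).

```python
import numpy as np

def burr12_cdf(x, c, k, scale=1.0):
    """Burr Type XII CDF: F(x) = 1 - (1 + (x/s)^c)^(-k) for x > 0."""
    z = np.maximum(x, 0.0) / scale
    return 1.0 - (1.0 + z**c) ** (-k)

def burr12_rvs(c, k, scale, size, rng):
    """Inverse-transform sampling: solve F(x) = u for uniform u."""
    u = rng.uniform(size=size)
    return scale * ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

rng = np.random.default_rng(4)
n = 2000
x = np.sort(burr12_rvs(c=2.0, k=3.0, scale=1.5, size=n, rng=rng))

# One-sample Kolmogorov-Smirnov statistic against the theoretical CDF.
F = burr12_cdf(x, 2.0, 3.0, 1.5)
emp_hi = np.arange(1, n + 1) / n
emp_lo = np.arange(0, n) / n
D = max(np.max(emp_hi - F), np.max(F - emp_lo))
```

The study's procedure additionally shifts the weighted errors into the positive support before fitting, and fits c, k and scale by maximum likelihood; only the distributional plumbing is shown here.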
Traditional models for semantic segmentation of point clouds primarily focus on smaller scales. In real-world applications, however, point clouds often exhibit larger scales, leading to heavy computational and memory requirements. The key to handling large-scale point clouds lies in leveraging random sampling, which offers higher computational efficiency and lower memory consumption than other sampling methods. Nevertheless, random sampling can result in the loss of crucial points during the encoding stage. To address these issues, this paper proposes the cross-fusion self-attention network (CFSA-Net), a lightweight and efficient network architecture designed specifically for directly processing large-scale point clouds. At the core of this network is the combination of random sampling with a local feature extraction module based on cross-fusion self-attention (CFSA). This module effectively integrates long-range contextual dependencies between points by employing hierarchical position encoding (HPC). Furthermore, it enhances the interaction between each point's coordinates and feature information through cross-fusion self-attention pooling, enabling the acquisition of more comprehensive geometric information. Finally, a residual optimization (RO) structure is introduced to extend the receptive field of individual points by stacking hierarchical position encoding and cross-fusion self-attention pooling, thereby reducing the impact of the information loss caused by random sampling. Experimental results on the Stanford Large-Scale 3D Indoor Spaces (S3DIS), Semantic3D, and SemanticKITTI datasets demonstrate the superiority of this algorithm over advanced approaches such as RandLA-Net and KPConv. These findings underscore the excellent performance of CFSA-Net in large-scale 3D semantic segmentation.
Climate change has become a global phenomenon and is adversely affecting agricultural development across the globe. Developing countries like Pakistan, where 18.9% of GDP (gross domestic product) comes from the agriculture sector and 42% of the labor force is engaged in agriculture, are directly and indirectly affected by climate change through increases in the frequency and intensity of climatic extremes such as floods, droughts and extreme weather events. In this paper, we focus on the impact of climate change on farm households and their adaptation strategies for coping with climatic extremes. For this purpose, we selected farm households by multistage stratified random sampling from four districts of the Potohar region, i.e. Attock, Rawalpindi, Jhelum and Chakwal; these districts were selected by dividing the Potohar region into rain-fed areas. We employed logistic regression to assess the determinants of adaptation to climate change and its impact, and calculated the marginal effect of each independent variable of the logistic regression to measure the immediate rate of change in the model. To check the significance of the suggested model, we used hypothesis testing.
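The logistic regression plus marginal-effects workflow can be sketched on synthetic data. The two covariates and their coefficients are hypothetical placeholders, not the study's survey variables, and the marginal effect shown is the common "at the means" version β_j·p̄·(1 − p̄).

```python
import numpy as np

def fit_logit(X, y, lr=0.1, iters=5000):
    """Logistic regression by gradient ascent on the log-likelihood;
    X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += lr * X.T @ (y - p) / len(y)
    return beta

rng = np.random.default_rng(5)
n = 4000
x1 = rng.normal(size=n)          # hypothetical standardized covariate 1
x2 = rng.normal(size=n)          # hypothetical standardized covariate 2
X = np.column_stack([np.ones(n), x1, x2])
true_beta = np.array([-0.5, 1.0, -0.7])
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

beta = fit_logit(X, y)
# Marginal effect at the means: dP/dx_j = beta_j * p_bar * (1 - p_bar).
p_bar = 1 / (1 + np.exp(-X.mean(axis=0) @ beta))
marginal = beta[1:] * p_bar * (1 - p_bar)
```

The marginal effect rescales each coefficient into a change in adoption probability per unit change in the covariate, which is the quantity the paper reports alongside the raw logit coefficients.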
The relationship between the fatigue crack growth rate and the stress intensity factor amplitude is an important fatigue property in the design of damage tolerance limits and the prediction of the life of metallic components. In order to make more reasonable use of testing data, samples from the population were stratified as suggested by the stratified random sample model (SRAM), with the data in each stratum corresponding to the same experimental conditions. A suitable weight was assigned to each stratified sample according to the actual working states of the pressure vessel, so that the estimated fatigue crack growth rate equation is more accurate in practice. An empirical study shows that the SRAM estimate based on fatigue crack growth rate data from different furnaces is clearly better than the estimate from a simple random sample model.
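A weighted, stratified fit of the growth-rate curve can be sketched with the Paris law da/dN = C·(ΔK)^m, which is the standard form of this relationship; the two strata, their weights, and the material constants below are invented for the demonstration and are not the paper's data.

```python
import numpy as np

def weighted_paris_fit(dk, dadn, w):
    """Weighted least-squares fit of the Paris law da/dN = C * (dK)^m
    in log-log coordinates: log(da/dN) = log C + m * log(dK)."""
    X = np.column_stack([np.ones_like(dk), np.log(dk)])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(X * sw[:, None], np.log(dadn) * sw, rcond=None)
    log_c, m = coef
    return np.exp(log_c), m

rng = np.random.default_rng(6)
# Two "strata" tested under different conditions, pooled with weights
# reflecting (hypothetically) the vessel's actual operating states.
dk1 = rng.uniform(10, 30, 40)
dk2 = rng.uniform(25, 60, 40)
c_true, m_true = 1e-11, 3.0
dadn1 = c_true * dk1**m_true * np.exp(rng.normal(0, 0.05, 40))
dadn2 = c_true * dk2**m_true * np.exp(rng.normal(0, 0.05, 40))
dk = np.concatenate([dk1, dk2])
dadn = np.concatenate([dadn1, dadn2])
w = np.concatenate([np.full(40, 0.7 / 40), np.full(40, 0.3 / 40)])  # stratum weights
c_hat, m_hat = weighted_paris_fit(dk, dadn, w)
```

Equal weights recover an ordinary pooled fit; the stratified weights tilt the estimate toward the operating regime that matters for the vessel.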
To address the problems of large measurement error, limited image information and poor real-time performance in binocular vision ranging, a binocular ranging method based on ORB (oriented FAST and rotated BRIEF) features is proposed. Video frames are median-filtered, ORB features are extracted from the images, and the Hamming distance with the best matching performance is selected experimentally. RANSAC (random sample consensus) model estimation is applied to the screened matching points to remove mismatches, the model relationship between disparity and true distance is analyzed, and an optimal ranging model is constructed and verified on an experimental platform. The results show that, compared with other binocular ranging methods, the proposed method offers accurate ranging, fast running speed and strong robustness, and can display the distance information of the image features in real time.
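The disparity-to-distance relationship at the heart of binocular ranging is the pinhole stereo model Z = f·B/d. The focal length and baseline below are hypothetical rig parameters, not the paper's experimental setup.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo model: depth Z = f * B / d, with the focal length f
    in pixels, baseline B in meters, and disparity d in pixels."""
    d = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / d

# Hypothetical rig: 700 px focal length, 12 cm baseline.
f, B = 700.0, 0.12
true_depth = np.array([1.0, 2.5, 5.0])      # meters
disparity = f * B / true_depth               # ideal matched-point disparities
z = depth_from_disparity(disparity, f, B)

# A one-pixel matching error hurts far targets much more than near ones,
# which is why mismatch removal (RANSAC) matters for ranging accuracy.
err = depth_from_disparity(disparity - 1.0, f, B) - true_depth
```

The inverse relationship between disparity and depth is also why the paper fits a model between disparity and true distance rather than assuming the ideal formula holds exactly on real hardware.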
3D reconstruction using deep learning-based intelligent systems can greatly help in measuring an individual's height and shape quickly and accurately from 2D motion-blurred images. During real-time image acquisition, motion blur caused by camera shake or human motion commonly appears, and deep learning-based intelligent control applied to vision can help solve this problem. To this end, we propose a 3D reconstruction method for motion-blurred images using deep learning. First, we develop a BF-WGAN algorithm that combines bilateral filtering (BF) denoising theory with a Wasserstein generative adversarial network (WGAN) to remove motion blur: the bilateral filter removes the noise while retaining the details of the blurred image, and the blurred image and the corresponding sharp image are then input into the WGAN, which distinguishes the motion-blurred image from the corresponding sharp image according to the WGAN loss and perceptual loss functions. Next, we use the deblurred images generated by the BF-WGAN algorithm for 3D reconstruction. We propose a threshold-optimized random sample consensus (TO-RANSAC) algorithm that can remove incorrect relationships between two views in the 3D reconstructed model relatively accurately. Compared with the traditional RANSAC algorithm, TO-RANSAC adjusts the threshold adaptively, which improves the accuracy of the 3D reconstruction results. The experimental results show that our BF-WGAN algorithm has a better deblurring effect and higher efficiency than other representative algorithms, and that TO-RANSAC yields considerably higher calculation accuracy than the traditional RANSAC algorithm.
Airborne Light Detection And Ranging (LiDAR) can provide high-quality three-dimensional information for the safety inspection of electricity corridors. However, the robust extraction of transmission lines from airborne point cloud data remains greatly challenging. This paper therefore proposes a robust transmission line extraction method based on model fitting of airborne point cloud data. First, a candidate power line generation method based on height information is used to reduce the computational complexity of the subsequent steps and the false positives in the extracted results. Then, on the basis of block-and-slice-constrained Euclidean clustering, a linear structure recognition method based on RANdom SAmple Consensus (RANSAC) is proposed to produce the initial individual transmission line components. Finally, a robust nonlinear least-squares fitting method is developed for each individual transmission line to generate the parameters of its mathematical model and further optimize the extraction. Experiments were performed on LiDAR point cloud data captured from helicopter and Unmanned Aerial Vehicle (UAV) platforms. The results indicate that the proposed method can efficiently extract the different types of transmission lines along electricity corridors, with an average precision of approximately 98.1%, an average recall of approximately 95.9%, and an average quality of approximately 94.2%.
During environmental testing, the estimation of random vibration signals (RVS) is an important technique for the safety and reliability of airborne platforms. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution; moreover, the frequency-varying characteristic of RVS is usually not taken into account. A gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS from small samples. First, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. GBM is then applied to estimating a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in test analysis. The results show that GBM is superior for estimating dynamic signals from small samples, and the estimated reliability is shown to be 100% at the given confidence level.
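The bootstrap half of such a small-sample scheme, resampling a short record to get an estimated value and an estimated interval, can be sketched as a plain percentile bootstrap; the gray-model (GM(1,1)) half that GBM combines with it is omitted here, and the 12-point "vibration" sample is synthetic.

```python
import numpy as np

def bootstrap_interval(x, stat=np.mean, B=5000, alpha=0.05, rng=None):
    """Percentile bootstrap interval for a statistic from a small sample."""
    rng = rng or np.random.default_rng(0)
    boot = np.array([stat(rng.choice(x, size=len(x), replace=True))
                     for _ in range(B)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return lo, hi, boot.mean()

rng = np.random.default_rng(7)
x = rng.normal(5.0, 1.0, size=12)      # a "small sample" of vibration amplitudes
lo, hi, center = bootstrap_interval(x, rng=rng)
```

An estimated reliability in the abstract's sense would then be the fraction of observed values falling inside such intervals across frequency bins.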
The theory of compressed sensing (CS) provides a new opportunity to reduce the data acquisition time and improve the data usage factor of stepped-frequency radar systems. In light of the sparsity of radar target reflectivity, two CS-based imaging methods, termed the CS-based 2D joint imaging algorithm and the CS-based 2D decoupled imaging algorithm, are proposed. These methods incorporate the coherent mixing operation into the sparse dictionary and take random measurements in both the range and azimuth directions to obtain high-resolution radar images; they can thus remarkably reduce the data rate and simplify the hardware design of the radar system while maintaining imaging quality. Experiments on both simulated data and data measured in an anechoic chamber show that the proposed imaging methods produce more focused images than the traditional fast Fourier transform method. The joint algorithm has stronger robustness and provides clearer inverse synthetic aperture radar images, while the decoupled algorithm is computationally more efficient but has slightly degraded imaging quality, which can nevertheless be improved by increasing the number of measurements or using a more robust recovery algorithm.
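The core CS claim, that a sparse reflectivity vector can be recovered from far fewer random measurements than its length, can be sketched in 1D with orthogonal matching pursuit as a stand-in recovery algorithm (the paper's dictionary, which embeds the coherent mixing operation, is replaced by a generic Gaussian measurement matrix, and all sizes are illustrative).

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k dictionary atoms
    by correlation with the residual, least-squares refitting each step."""
    resid = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(8)
n, m, k = 256, 80, 5                        # signal length, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(3, 1, k)
y = A @ x_true                              # m << n noiseless measurements
x_rec = omp(A, y, k)
```

The 80-from-256 compression here mirrors, in miniature, how random range/azimuth measurements let the radar system cut its data rate while keeping a focused image.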
Fishery-independent surveys are often used to collect high-quality biological and ecological data to support fisheries management. Careful optimization of the fishery-independent survey design is necessary to improve the precision of survey estimates with cost-effective sampling effort. We developed a simulation approach to evaluate and optimize the stratification scheme for a fishery-independent survey with multiple goals, including the estimation of abundance indices of individual species and of species diversity indices. We compared the performance of sampling designs with different stratification schemes for different goals over different months. Compared with a simple random sampling design, the stratification schemes gained precision in the survey estimates for most indices, and the scheme with five strata performed best. This study showed that the loss of precision in survey estimates due to reduced sampling effort can be compensated by improved stratification schemes, which would reduce the cost and the negative impact of survey trawling on species with low abundance in the fishery-independent survey. The study also suggests that the optimal survey design differs with different survey objectives; a post-survey analysis can improve the stratification scheme of fishery-independent survey designs.
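The simulation logic, repeatedly surveying a known population under competing designs and comparing the spread of the resulting estimates, can be sketched with an invented five-stratum population of catch rates (the strata, allocations, and gamma-distributed abundances are assumptions, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical population: 5 habitat strata with very different abundance.
means = [2, 5, 10, 20, 40]
sizes = [400, 300, 150, 100, 50]
pop = [rng.gamma(shape=4, scale=m / 4, size=s) for m, s in zip(means, sizes)]
all_fish = np.concatenate(pop)
N = len(all_fish)
n = 100                                    # total stations per simulated survey

def srs_mean():
    """Simple random sampling estimate of the mean catch rate."""
    return rng.choice(all_fish, n, replace=False).mean()

def stratified_mean():
    """Stratified estimate with proportional allocation of the n stations."""
    est = 0.0
    for stratum, s in zip(pop, sizes):
        n_h = round(n * s / N)
        est += (s / N) * rng.choice(stratum, n_h, replace=False).mean()
    return est

srs = np.array([srs_mean() for _ in range(2000)])
strat = np.array([stratified_mean() for _ in range(2000)])
```

Both estimators are unbiased; the gain from stratification shows up as a smaller standard deviation across the 2000 simulated surveys, which is exactly the precision comparison the study runs for each index and month.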
Direct measurement of snow water equivalent (SWE) in snow-dominated mountainous areas is difficult, so its prediction is essential for water resources management in such areas. In addition, because of the nonlinear trend of the snow spatial distribution and the multiple factors influencing the spatial distribution of SWE, statistical models are not usually able to produce acceptable results, and applicable methods that can predict nonlinear trends are necessary. In this research, the Sohrevard Watershed in northwest Iran was selected as the case study for SWE prediction. A database was assembled and the required maps were derived. Snow depth (SD) was measured at 150 points with two sampling patterns, systematic random sampling and Latin hypercube sampling (LHS), snow density was randomly measured at 18 points, and SWE was then calculated. SWE was predicted using artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and regression methods. The results showed that the ANN and ANFIS models performed better than the regression method under both sampling patterns. Moreover, by most of the efficiency criteria, the efficiency of the ANN, ANFIS and regression methods was higher under the LHS pattern than under the systematic random sampling pattern, although there were no significant differences between ANN and ANFIS in SWE prediction. The data of both sampling patterns were most sensitive to elevation, and the LHS and systematic random sampling patterns were least sensitive to the profile curvature and the plan curvature, respectively.
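The LHS pattern used for the snow depth points can be sketched in its basic unit-cube form: divide each covariate axis into n equal bins and place exactly one sample per bin per axis. Mapping the unit cube onto actual terrain covariates (elevation, curvature, etc.) is a separate step not shown.

```python
import numpy as np

def latin_hypercube(n, dims, rng=None):
    """Latin hypercube sample in [0, 1)^dims: each of the n equal-width
    bins along every dimension contains exactly one point."""
    rng = rng or np.random.default_rng(0)
    u = rng.uniform(size=(n, dims))          # jitter within each bin
    samples = np.empty((n, dims))
    for d in range(dims):
        perm = rng.permutation(n)            # shuffle bin order per dimension
        samples[:, d] = (perm + u[:, d]) / n
    return samples

pts = latin_hypercube(10, 2, rng=np.random.default_rng(10))
```

Compared with systematic or simple random sampling, this stratification of every marginal guarantees the 150 points cover the full range of each influencing factor, which is consistent with the higher efficiency the study reports under LHS.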
Some basic equations and the relations among various Markov chains are established. These results form the basis for investigating the theory of Markov chains in random environments.
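Among the basic equations alluded to, the Chapman-Kolmogorov relation P^(m+n) = P^m P^n and the stationarity equation πP = π can be verified numerically for an ordinary (fixed-environment) chain; the 3-state transition matrix below is an arbitrary example.

```python
import numpy as np

# A 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Chapman-Kolmogorov: the (m+n)-step kernel factors through any split.
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
P5 = np.linalg.matrix_power(P, 5)

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()
```

In a random environment the one-step kernel itself becomes a random sequence P_1, P_2, ..., and the analogous relations hold for the products of these kernels.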
Funding (Gobi gravel study): National Natural Science Foundation of China (42071014).
Funding (bootstrap stratified sampling study): Science Research Start-up Foundation for Young Teachers of Southwest Jiaotong University (No. 2007Q091).
Abstract: In this paper, the problem of nonparametric estimation of the finite population quantile function using a multiplicative bias correction technique is considered. A robust estimator of the finite population quantile function based on multiplicative bias correction is derived with the aid of a superpopulation model. Most studies have concentrated on kernel smoothers in the estimation of regression functions, and this technique has also been applied to the various methods of nonparametric estimation of the finite population quantile reviewed here. A major problem with the use of nonparametric kernel-based regression over a finite interval, such as the estimation of finite population quantities, is bias at the boundary points. By correcting the boundary problems associated with previous model-based estimators, the multiplicative bias corrected estimator produces better results in estimating the finite population quantile function. Furthermore, the asymptotic behavior of the proposed estimators is presented. It is observed that the estimator is asymptotically unbiased and statistically consistent when certain conditions are satisfied. The simulation results show that the suggested estimator performs quite well in terms of relative bias, mean squared error, and relative root mean error. As a result, the multiplicative bias corrected estimator is strongly recommended for survey sampling estimation of the finite population quantile function.
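The multiplicative bias correction idea can be sketched for an ordinary kernel regression, with a Nadaraya-Watson smoother standing in for the paper's estimator; the function names and the test function are hypothetical:

```python
import numpy as np

def nw(x_eval, x, y, h):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((np.asarray(x_eval)[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def mbc(x_eval, x, y, h):
    """Multiplicative bias correction: the pilot fit is multiplied by a
    smoothed ratio of observations to the pilot, which largely cancels
    the pilot's boundary bias. Requires a positive regression function."""
    pilot_at_data = nw(x, x, y, h)
    correction = nw(x_eval, x, y / pilot_at_data, h)
    return nw(x_eval, x, y, h) * correction

# Noise-free illustration on m(x) = 2 + x^2 over [0, 1]
x = np.linspace(0.0, 1.0, 200)
y = 2.0 + x ** 2
pilot_err = np.abs(nw(x, x, y, 0.05) - y)
mbc_err = np.abs(mbc(x, x, y, 0.05) - y)
```

On this example the plain smoother's largest errors sit at the interval endpoints, and the multiplicative correction shrinks exactly those, which is the boundary effect the abstract describes.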
Abstract: This research aims to develop a model that enhances lymphatic disease diagnosis using the random forest ensemble machine-learning method trained with a simple sampling scheme. The study was carried out in two major phases: feature selection and classification. In the first stage, a number of discriminative features out of 18 were selected using PSO and several other feature selection techniques to reduce the feature dimension. In the second stage, we applied the random forest ensemble classification scheme to diagnose lymphatic diseases. In experiments with the selected features, we used both the original and resampled distributions of the dataset to train the random forest classifier. Experimental results demonstrate that the proposed method achieves a remarkable improvement in classification accuracy.
Funding: Supported by the National Natural Science Foundation of China (No. 61771186), the Heilongjiang Provincial Natural Science Foundation of China (No. YQ2020F012), and the University Nursing Program for Young Scholars with Creative Talents in Heilongjiang Province (No. UNPYSCT-2017125).
Abstract: Image matching refers to the process of matching two or more images obtained at different times, from different sensors, or under different conditions through a large number of feature points in the images. At present, image matching is widely used in target recognition and tracking, as well as indoor positioning and navigation. Local features, however, are often missing in color images taken in dark light, greatly reducing the number of extracted feature points, which degrades image matching and can even cause target recognition to fail. An unsharp masking (USM) based denoising model is established, and a local adaptive enhancement algorithm is proposed to achieve feature point compensation by strengthening local features of the dark image, effectively increasing the amount of image information. Fast Library for Approximate Nearest Neighbors (FLANN) and random sample consensus (RANSAC) are then used as the image matching algorithms. Experimental results show that the number of effective feature points obtained by the proposed algorithm from images in dark-light environments is increased, and the accuracy of image matching is improved appreciably.
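A bare-bones unsharp masking step like the USM model mentioned above can be written with NumPy alone (separable Gaussian blur); the sigma/amount values are arbitrary, and this is generic USM, not the paper's local adaptive variant:

```python
import numpy as np

def _blur1d(a, k):
    """1-D convolution helper for the separable Gaussian blur."""
    return np.convolve(a, k, mode="same")

def unsharp_mask(img, sigma=2.0, amount=1.0):
    """Unsharp masking: add back the difference between the image and a
    Gaussian-blurred copy, which boosts local contrast around edges."""
    img = np.asarray(img, dtype=float)
    radius = int(3 * sigma)
    xs = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (xs / sigma) ** 2)
    k /= k.sum()
    blurred = np.apply_along_axis(_blur1d, 0, img, k)
    blurred = np.apply_along_axis(_blur1d, 1, blurred, k)
    return img + amount * (img - blurred)

# A synthetic step edge: sharpening produces the classic overshoot
img = np.zeros((32, 32))
img[:, 16:] = 1.0
out = unsharp_mask(img)
```

The overshoot on both sides of the edge (values above 1 and below 0) is exactly the extra local contrast that helps feature detectors fire in flat, dark regions.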
Abstract: Unlike height-diameter equations for standing trees commonly used in forest resources modelling, tree height models for cut-to-length (CTL) stems tend to produce prediction errors whose distributions are not conditionally normal but rather leptokurtic and heavy-tailed. This feature was merely noticed in previous studies but never thoroughly investigated. This study characterized the prediction error distribution of a newly developed tree height model of this kind for Pinus radiata (D. Don) through the three-parameter Burr Type XII (BXII) distribution. The model's prediction errors (ε) exhibited heteroskedasticity conditional mainly on the small-end relative diameter of the top log and, to a minor extent, on DBH. Structured serial correlations were also present in the data. A total of 14 candidate weighting functions were compared to select the best two for weighting ε in order to reduce its conditional heteroskedasticity. The weighted prediction errors (εw) were shifted by a constant into the positive range supported by the BXII distribution. The distribution of the weighted and shifted prediction errors (εw+) was then characterized by the BXII distribution using maximum likelihood estimation through 1,000 repetitions of random sampling, fitting and goodness-of-fit testing, each time randomly taking only one observation from each tree to circumvent the potential adverse impact of serial correlation in the data on parameter estimation and inference. The nonparametric two-sample Kolmogorov-Smirnov (KS) goodness-of-fit test and its closely related Kuiper's (KU) test showed that the fitted BXII distributions provided a good fit to the highly leptokurtic and heavy-tailed distribution of ε. Random samples generated from the fitted BXII distributions of εw+ derived from the best two weighting functions, when back-shifted and unweighted, exhibited distributions that were, in about 97% and 95% of the 1,000 cases respectively, not statistically different from the distribution of ε. Our results for cut-to-length P. radiata stems represent the first case, for any tree species, in which a non-normal error distribution in tree height prediction has been described by an underlying probability distribution. The fitted BXII prediction error distribution will help unlock the full potential of the new tree height model in forest resources modelling of P. radiata plantations, particularly when uncertainty assessments, statistical inferences and error propagation are needed in research and in practical applications through harvester data analytics.
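For readers who want to reproduce the flavor of the fitting step, SciPy ships a Burr Type XII implementation (`scipy.stats.burr12`). The sketch below fits simulated data by maximum likelihood and runs a one-sample KS check; the shape parameters are made up, and this is not the paper's weighted, repeated-sampling protocol:

```python
import numpy as np
from scipy import stats

# Simulate "prediction errors" from a Burr XII distribution (illustrative
# parameters), then recover the parameters by maximum likelihood.
rng = np.random.default_rng(42)
sample = stats.burr12.rvs(c=2.0, d=3.0, scale=1.0, size=5000, random_state=rng)

# Fix loc at 0, mirroring the shift-to-positive-support step in the paper
c_hat, d_hat, loc_hat, scale_hat = stats.burr12.fit(sample, floc=0)

# One-sample Kolmogorov-Smirnov goodness-of-fit check against the fit
ks = stats.kstest(sample, "burr12", args=(c_hat, d_hat, loc_hat, scale_hat))
```

Note that a KS test against parameters estimated from the same data is biased toward acceptance; the paper's two-sample, repeated-subsampling design avoids that issue.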
Funding: Funded by the National Natural Science Foundation of China Youth Project (61603127).
Abstract: Traditional models for semantic segmentation of point clouds primarily focus on smaller scales. However, in real-world applications, point clouds often exhibit larger scales, leading to heavy computational and memory requirements. The key to handling large-scale point clouds lies in leveraging random sampling, which offers higher computational efficiency and lower memory consumption than other sampling methods. Nevertheless, the use of random sampling can result in the loss of crucial points during the encoding stage. To address these issues, this paper proposes the cross-fusion self-attention network (CFSA-Net), a lightweight and efficient network architecture specifically designed to process large-scale point clouds directly. At the core of this network is the incorporation of random sampling alongside a local feature extraction module based on cross-fusion self-attention (CFSA). This module effectively integrates long-range contextual dependencies between points by employing hierarchical position encoding (HPC). Furthermore, it enhances the interaction between each point's coordinates and feature information through cross-fusion self-attention pooling, enabling the acquisition of more comprehensive geometric information. Finally, a residual optimization (RO) structure is introduced to extend the receptive field of individual points by stacking hierarchical position encoding and cross-fusion self-attention pooling, thereby reducing the impact of the information loss caused by random sampling. Experimental results on the Stanford Large-Scale 3D Indoor Spaces (S3DIS), Semantic3D, and SemanticKITTI datasets demonstrate the superiority of this algorithm over advanced approaches such as RandLA-Net and KPConv. These findings underscore the excellent performance of CFSA-Net in large-scale 3D semantic segmentation.
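Random sampling, the operation at the heart of the pipeline above, is trivially cheap compared with farthest-point or inverse-density sampling, which is why it scales to large clouds; a sketch (the ratio and helper name are arbitrary choices for illustration):

```python
import numpy as np

def random_downsample(points, ratio=0.25, seed=None):
    """Uniform random downsampling of an (N, 3) point cloud: keep a random
    subset of the rows, returning both the subset and its indices so that
    per-point features can be gathered alongside the coordinates."""
    rng = np.random.default_rng(seed)
    n_keep = max(1, int(len(points) * ratio))
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[idx], idx

pts = np.random.default_rng(4).random((1000, 3))
sub, idx = random_downsample(pts, ratio=0.25, seed=0)
```

The risk the abstract addresses is visible here: nothing prevents the dropped 75% of points from containing geometrically crucial detail, hence the feature-aggregation modules that restore context after each sampling stage.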
Abstract: Climate change has become a global phenomenon and is adversely affecting agricultural development across the globe. Developing countries like Pakistan, where the agriculture sector contributes 18.9% of GDP (gross domestic product) and employs 42% of the labor force, are directly and indirectly affected by climate change through an increase in the frequency and intensity of climatic extremes such as floods, droughts and extreme weather events. In this paper, we focus on the impact of climate change on farm households and their adaptation strategies for coping with climatic extremes. For this purpose, we selected farm households by multistage stratified random sampling from four districts of the Potohar region, i.e., Attock, Rawalpindi, Jhelum and Chakwal; these districts were selected by dividing the Potohar region into rain-fed areas. We employed logistic regression to assess the determinants of adaptation to climate change and its impact, and calculated the marginal effect of each independent variable of the logistic regression to measure the immediate rate of change in the model. To check the significance of our suggested model, we used hypothesis testing.
Abstract: The curve relating fatigue crack growth rate to the stress intensity factor amplitude represents an important fatigue property in the design of damage tolerance limits and the prediction of the life of metallic component parts. In order to make more reasonable use of testing data, samples from the population were stratified as suggested by the stratified random sample model (SRAM), with the data in each stratum corresponding to the same experimental conditions. A suitable weight was assigned to each stratified sample according to the actual working states of the pressure vessel, so that the estimated fatigue crack growth rate equation is more accurate in practice. An empirical study shows that the SRAM estimate obtained using fatigue crack growth rate data from different stoves is clearly better than the estimate from a simple random sample model.
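The growth-rate curve referred to above is commonly modeled by the Paris law, da/dN = C(ΔK)^m, which is linear in log-log space; a weighted least-squares fit with per-stratum weights (a generic WLS sketch, not the SRAM estimator itself, and the parameter values are invented) looks like:

```python
import numpy as np

def weighted_paris_fit(delta_k, dadn, weights):
    """Weighted least-squares fit of the Paris law da/dN = C * (dK)^m in
    log-log space; `weights` plays the role of per-stratum weights."""
    delta_k = np.asarray(delta_k, dtype=float)
    sw = np.sqrt(np.asarray(weights, dtype=float))
    X = np.column_stack([np.ones_like(delta_k), np.log(delta_k)])
    y = np.log(np.asarray(dadn, dtype=float))
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return np.exp(beta[0]), beta[1]  # C, m

# Synthetic noise-free data with C = 1e-8, m = 3 (illustrative values)
dk = np.array([10.0, 20.0, 30.0, 40.0])
C, m = weighted_paris_fit(dk, 1e-8 * dk ** 3, np.ones(4))
```

Up-weighting the strata whose test conditions match the vessel's actual service state is then a one-line change to the `weights` argument.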
Abstract: To address the problems of large measurement error, limited image information, and poor real-time performance in binocular vision ranging, a binocular ranging method based on ORB (oriented FAST and rotated BRIEF) features is proposed. Video frames are median-filtered, image ORB features are extracted, and the Hamming distance with the best matching performance is selected experimentally. RANSAC (random sample consensus) model estimation is applied to the screened matching points to remove mismatches; the model relationship between disparity and true distance is then analyzed to build an optimal ranging model, which is validated on an experimental platform. The results show that the proposed method outperforms other binocular ranging methods in ranging accuracy, running speed, and robustness, and can display the distance information of image features in real time.
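The disparity-to-distance relationship underlying such a ranging model is the standard pinhole stereo formula Z = f·B/d; a tiny sketch (the focal length and baseline below are made-up calibration values, not from the paper):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo range model Z = f * B / d: focal length f in pixels,
    baseline B in meters, disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical calibration: f = 700 px, B = 0.12 m, matched disparity 35 px
z = depth_from_disparity(35.0, 700.0, 0.12)
```

In practice the paper fits this model (or a refinement of it) to calibrated measurements, since lens distortion and rectification error bend the ideal inverse relationship.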
Funding: Supported by the National Natural Science Foundation of China under Grant 61902311, and in part by the Japan Society for the Promotion of Science (JSPS) Grants-in-Aid for Scientific Research (KAKENHI) under Grant JP18K18044.
Abstract: 3D reconstruction using deep learning-based intelligent systems can help measure an individual's height and shape quickly and accurately from 2D motion-blurred images. Generally, during real-time image acquisition, motion blur caused by camera shake or human motion appears. Deep learning-based intelligent control applied to vision can help solve this problem. To this end, we propose a 3D reconstruction method for motion-blurred images using deep learning. First, we develop a BF-WGAN algorithm that combines bilateral filtering (BF) denoising with a Wasserstein generative adversarial network (WGAN) to remove motion blur. The bilateral filter denoising algorithm removes the noise while retaining the details of the blurred image. Then, the blurred image and the corresponding sharp image are input into the WGAN. This algorithm distinguishes the motion-blurred image from the corresponding sharp image according to the WGAN loss and perceptual loss functions. Next, we use the deblurred images generated by the BF-WGAN algorithm for 3D reconstruction. We propose a threshold optimization random sample consensus (TO-RANSAC) algorithm that can relatively accurately remove incorrect relationships between two views in the 3D reconstructed model. Compared with the traditional RANSAC algorithm, the TO-RANSAC algorithm adjusts the threshold adaptively, which improves the accuracy of the 3D reconstruction results. The experimental results show that our BF-WGAN algorithm achieves a better deblurring effect and higher efficiency than other representative algorithms. In addition, the TO-RANSAC algorithm yields a calculation accuracy considerably higher than that of the traditional RANSAC algorithm.
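To illustrate the adaptive-threshold idea behind TO-RANSAC, the sketch below uses a least-median-of-squares model search whose inlier threshold is derived from a robust scale estimate rather than fixed in advance; it is a simple 2D stand-in with invented data, not the paper's algorithm:

```python
import numpy as np

def robust_line_inliers(points, n_iter=500, seed=0):
    """Least-median-of-squares line search; the inlier threshold is then
    derived from a robust (MAD-style) scale estimate instead of being
    fixed a priori, loosely mirroring TO-RANSAC's adaptive threshold."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_med, best_r = np.inf, None
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), size=2, replace=False)
        d = pts[j] - pts[i]
        norm = np.hypot(d[0], d[1])
        if norm < 1e-12:
            continue
        ux, uy = d / norm
        # Perpendicular distance of every point to the candidate line
        r = np.abs((pts[:, 0] - pts[i, 0]) * uy - (pts[:, 1] - pts[i, 1]) * ux)
        med = np.median(r)
        if med < best_med:
            best_med, best_r = med, r
    sigma = 1.4826 * best_med  # robust scale estimate from the best model
    return best_r < 2.5 * sigma + 1e-12

# 100 noisy points on y = 2x + 0.5 plus 20 gross outliers (synthetic)
rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 100)
line_pts = np.column_stack([x, 2 * x + 0.5 + rng.normal(0, 0.01, 100)])
outliers = np.column_stack([rng.uniform(0, 10, 20), rng.uniform(-5, 25, 20)])
mask = robust_line_inliers(np.vstack([line_pts, outliers]))
```

The appeal of deriving the threshold from the data, as TO-RANSAC does, is that no single hand-tuned distance works across scenes with different noise levels.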
Funding: National Natural Science Foundation of China (No. 41872207).
Abstract: Airborne Light Detection And Ranging (LiDAR) can provide high-quality three-dimensional information for the safety inspection of electricity corridors. However, robust extraction of transmission lines from airborne point cloud data remains greatly challenging. This paper therefore proposes a robust transmission line extraction method based on model fitting from airborne point cloud data. First, a candidate power line generation method based on height information is used to reduce the computational complexity of the subsequent steps and the false positives in the extracted results. Then, building on block-and-slice-constrained Euclidean clustering, a linear structure recognition method based on RANdom SAmple Consensus (RANSAC) is proposed to produce the initial individual transmission line components. Finally, a robust nonlinear least squares-based fitting method is developed to generate the parameters of the mathematical model of each individual transmission line and further optimize the extraction. Experiments were performed on LiDAR point cloud data captured from helicopter and Unmanned Aerial Vehicle (UAV) platforms. Results indicate that the proposed method can efficiently extract the different types of transmission lines along electricity corridors, with average precision, recall, and quality of approximately 98.1%, 95.9%, and 94.2%, respectively.
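The mathematical model of a suspended conductor is typically a catenary; a nonlinear least-squares fit in the spirit of the final step can be sketched with SciPy (the span geometry and parameter values below are invented, and the paper's robust weighting is omitted):

```python
import numpy as np
from scipy.optimize import curve_fit

def catenary(x, a, x0, y0):
    """Catenary y = y0 + a * (cosh((x - x0) / a) - 1), the usual model
    for a conductor hanging between two towers."""
    return y0 + a * (np.cosh((x - x0) / a) - 1.0)

# Synthetic span: noisy samples of a known catenary (invented geometry)
rng = np.random.default_rng(1)
x = np.linspace(-40.0, 40.0, 200)
y = catenary(x, 120.0, 5.0, 30.0) + rng.normal(0.0, 0.02, x.size)
popt, _ = curve_fit(catenary, x, y, p0=[100.0, 0.0, 25.0])
```

Once each cluster's points fit such a model with small residuals, stray vegetation or tower points that slipped into the cluster show up as large residuals and can be pruned, which is the "further optimizing the extraction" step.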
Funding: Supported by the Aviation Science Foundation of China (No. 20100251006) and the Technological Foundation Project (No. J132012C001).
Abstract: During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. GBM is then applied to estimate data from a single flight test of a certain aircraft. Finally, to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is shown to be 100% at the given confidence level.
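The plain (non-gray) bootstrap half of GBM can be sketched as a percentile interval for a small sample; the gray-model prediction step is not reproduced here, and all names and data are illustrative:

```python
import numpy as np

def bootstrap_interval(sample, level=0.95, n_boot=5000, seed=0):
    """Percentile bootstrap interval for the mean of a small sample.
    (The gray-model prediction part of GBM is not reproduced here.)"""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    idx = rng.integers(0, len(sample), size=(n_boot, len(sample)))
    means = sample[idx].mean(axis=1)
    lo, hi = np.percentile(means, [100 * (1 - level) / 2,
                                   100 * (1 + level) / 2])
    return lo, hi

# A small synthetic "vibration feature" sample (n = 30, for illustration)
sample = np.random.default_rng(9).normal(10.0, 1.0, 30)
lo, hi = bootstrap_interval(sample)
```

Applying such an interval per frequency bin, with a gray model extrapolating between bins, is roughly how a frequency-varying estimate with small samples is assembled.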
Funding: Supported by the Prominent Youth Fund of the National Natural Science Foundation of China (61025006).
Abstract: The theory of compressed sensing (CS) provides a new opportunity to reduce the data acquisition time and improve the data usage factor of stepped frequency radar systems. In light of the sparsity of radar target reflectivity, two CS-based imaging methods, termed the CS-based 2D joint imaging algorithm and the CS-based 2D decoupled imaging algorithm, are proposed. These methods incorporate the coherent mixing operation into the sparse dictionary and take random measurements in both the range and azimuth directions to obtain high-resolution radar images; they can thus remarkably reduce the data rate and simplify the hardware design of the radar system while maintaining imaging quality. Experiments on both simulated data and data measured in an anechoic chamber show that the proposed imaging methods produce more focused images than the traditional fast Fourier transform method. The joint algorithm has stronger robustness and provides clearer inverse synthetic aperture radar images, while the decoupled algorithm is computationally more efficient but has slightly degraded imaging quality, which can be improved by increasing the number of measurements or using a more robust recovery algorithm.
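The CS recovery step can be illustrated with a generic ℓ1 solver; the sketch below uses iterative soft-thresholding (ISTA) on random Gaussian measurements of a sparse vector, a simple stand-in for the radar dictionary and measurement scheme described above (sizes, seed, and regularization are arbitrary):

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=3000):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L                           # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # shrinkage
    return x

# Sparse "reflectivity" scene observed through random measurements
rng = np.random.default_rng(3)
n, m, k = 128, 64, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))
x_hat = ista(A, A @ x_true)
```

With m = 64 measurements of a 5-sparse length-128 vector, the scene is recovered far below the Nyquist data rate, which is the data-reduction argument the abstract makes.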
Funding: The Public Science and Technology Research Funds Projects of Ocean under contract No. 201305030, and the Specialized Research Fund for the Doctoral Program of Higher Education under contract No. 20120132130001.
Abstract: Fishery-independent surveys are often used to collect high-quality biological and ecological data to support fisheries management. Careful optimization of fishery-independent survey design is necessary to improve the precision of survey estimates with cost-effective sampling effort. We developed a simulation approach to evaluate and optimize the stratification scheme for a fishery-independent survey with multiple goals, including estimation of abundance indices of individual species and of species diversity indices. We compared the performance of sampling designs with different stratification schemes for different goals over different months. For most indices, the stratification schemes yielded gains in precision of survey estimates relative to a simple random sampling design. The stratification scheme with five strata performed best. This study showed that the loss of precision of survey estimates due to reduced sampling effort can be compensated by improved stratification schemes, which would reduce the cost and the negative impacts of survey trawling on species with low abundance in the fishery-independent survey. This study also suggests that the optimal survey design differs with different survey objectives. A post-survey analysis can improve the stratification scheme of fishery-independent survey designs.
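The kind of simulation comparison described above can be miniaturized: draw repeated stratified and simple random samples from a synthetic population and compare estimator variances. The strata sizes, means, and sample sizes below are invented:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic population with three habitat strata of different abundance
strata = [rng.normal(10, 1, 300), rng.normal(20, 1, 200), rng.normal(40, 1, 100)]
pop = np.concatenate(strata)
w = np.array([300, 200, 100]) / 600.0  # stratum weights

def stratified_mean():
    """Proportionally weighted stratified estimator (10 draws per stratum)."""
    return float(np.dot(w, [rng.choice(s, 10, replace=False).mean()
                            for s in strata]))

def srs_mean():
    """Simple random sample estimator with the same total effort (n = 30)."""
    return float(rng.choice(pop, 30, replace=False).mean())

strat_est = np.array([stratified_mean() for _ in range(1000)])
srs_est = np.array([srs_mean() for _ in range(1000)])
```

Because stratification removes the between-stratum component of variance, the stratified estimator's spread is far smaller at equal effort; conversely, the same precision can be achieved with fewer tows, which is the cost argument made in the abstract.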
Abstract: Direct measurement of snow water equivalent (SWE) in snow-dominated mountainous areas is difficult, so its prediction is essential for water resources management in such areas. In addition, because of the nonlinear trend of snow spatial distribution and the multiple factors influencing the SWE spatial distribution, statistical models are usually unable to produce acceptable results; applicable methods able to predict nonlinear trends are therefore necessary. In this research, the Sohrevard Watershed, located in northwest Iran, was selected as the case study for SWE prediction. A database was collected and the required maps were derived. Snow depth (SD) was measured at 150 points under two sampling patterns, systematic random sampling and Latin hypercube sampling (LHS), and snow density was randomly measured at 18 points; SWE was then calculated. SWE was predicted using artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and regression methods. The results showed that the ANN and ANFIS models outperformed the regression method under both sampling patterns. Moreover, by most of the efficiency criteria, the ANN, ANFIS and regression methods were more efficient under the LHS pattern than under the systematic random sampling pattern. However, there were no significant differences between the ANN and ANFIS methods in SWE prediction. Under both sampling patterns, the predictions were most sensitive to elevation. In addition, the LHS and systematic random sampling patterns were least sensitive to the profile curvature and plan curvature, respectively.
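Latin hypercube sampling, one of the two snow-depth sampling patterns above, can be generated in a few lines (a generic LHS on the unit cube, with no maximin refinement; the helper name is illustrative):

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """Latin hypercube sample of n points in [0, 1]^d: each axis is cut
    into n equal bins, and every bin on every axis is hit exactly once."""
    rng = np.random.default_rng(seed)
    # One point per bin along each axis, then shuffle bins independently
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

pts = latin_hypercube(10, 3, seed=0)
```

The one-point-per-bin property is what spreads the sample evenly across the range of each covariate (elevation, curvature, and so on once the unit cube is rescaled), which plausibly explains the efficiency gains reported over the systematic random pattern.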
Funding: The National Natural Science Foundation of China (No. 10071058-2) and the Doctoral Programme Foundation of China.
Abstract: Some basic equations and the relations among various Markov chains are established. These results form the basis for investigating the theory of Markov chains in random environments.
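As a concrete instance of the basic equations involved, the stationary distribution of an ergodic chain in a fixed environment solves πP = π; a sketch (the transition matrix is made up, and random environments are not modeled here):

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi of an ergodic transition matrix P,
    i.e., the normalized left eigenvector of P for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

# A fixed-environment two-state chain (illustrative numbers)
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = stationary_distribution(P)
```

In the random-environment setting studied in the paper, the transition matrix itself is drawn by an environmental process, so the single fixed-point equation above is replaced by families of such equations indexed by the environment.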