AIM: To establish the minimum clinically important difference (MCID) for measurements in an orthopaedic patient population with joint disorders.
METHODS: Adult patients aged 18 years and older seeking care for joint conditions at an orthopaedic clinic took the Patient-Reported Outcomes Measurement Information System Physical Function (PROMIS® PF) computerized adaptive test (CAT), the hip disability and osteoarthritis outcome score for joint reconstruction (HOOS JR), and the knee injury and osteoarthritis outcome score for joint reconstruction (KOOS JR) from February 2014 to April 2017. MCIDs were calculated using anchor-based and distribution-based methods. Patient reports of meaningful change in function since their first clinic encounter were used as an anchor.
RESULTS: There were 2226 patients who participated, with a mean age of 61.16 (SD = 12.84) years; 41.6% were male and 89.7% Caucasian. Mean change ranged from 7.29 to 8.41 for the PROMIS® PF CAT, from 14.81 to 19.68 for the HOOS JR, and from 14.51 to 18.85 for the KOOS JR. ROC cut-offs ranged from 1.97 to 8.18 for the PF CAT, from 6.33 to 43.36 for the HOOS JR, and from 2.21 to 8.16 for the KOOS JR. Distribution-based methods estimated MCID values ranging from 2.45 to 21.55 for the PROMIS® PF CAT, from 3.90 to 43.61 for the HOOS JR, and from 3.98 to 40.67 for the KOOS JR. The median MCID value in the range was similar to the mean change score for each measure: 7.9 for the PF CAT, 18.0 for the HOOS JR, and 15.1 for the KOOS JR.
CONCLUSION: This is the first comprehensive study providing a wide range of MCIDs for the PROMIS® PF, HOOS JR, and KOOS JR in orthopaedic patients with joint ailments.
Funding: National Institute of Arthritis and Musculoskeletal and Skin Diseases of the National Institutes of Health, No. U01AR067138.
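The two MCID families the abstract combines can be illustrated in a few lines. The sketch below is not the study's code; it uses synthetic change scores and a binary improvement anchor. The anchor-based estimate is the ROC cut-off maximizing Youden's J, and the distribution-based estimates use common rules of thumb (half a standard deviation, and the standard error of measurement under an assumed reliability of 0.9).

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Synthetic data: change in PF score plus a binary anchor
# ("did you meaningfully improve since your first visit?").
change = np.concatenate([rng.normal(8, 6, 300),   # improved patients
                         rng.normal(1, 6, 200)])  # not improved
anchor = np.r_[np.ones(300, dtype=int), np.zeros(200, dtype=int)]

# Anchor-based MCID: ROC cut-off maximizing Youden's J = sens + spec - 1.
fpr, tpr, thresholds = roc_curve(anchor, change)
mcid_anchor = thresholds[np.argmax(tpr - fpr)]

# Distribution-based MCIDs: half an SD, and the standard error of
# measurement (SEM) assuming a reliability coefficient r = 0.9.
sd = change.std(ddof=1)
mcid_half_sd = 0.5 * sd
mcid_sem = sd * np.sqrt(1 - 0.9)

print(f"anchor: {mcid_anchor:.2f}, 0.5*SD: {mcid_half_sd:.2f}, SEM: {mcid_sem:.2f}")
```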
In this study we propose a modified ratio-type estimator for the population variance of the study variable y under simple random sampling without replacement, making use of the coefficient of kurtosis and the median of an auxiliary variable x. The estimator's properties have been derived up to the first order of a Taylor series expansion. Efficiency conditions under which the proposed estimator performs better than existing estimators are derived theoretically. Empirical studies using real populations demonstrate the performance of the developed estimator in comparison with existing estimators. As illustrated by the empirical studies, the proposed estimator performs better than the existing estimators under the specified conditions, i.e., it has the smallest mean squared error and the highest percentage relative efficiency. The developed estimator is therefore suitable for situations in which the variable of interest has a positive correlation with the auxiliary variable.
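The abstract does not reproduce the estimator's exact form, but kurtosis-median ratio-type variance estimators in this literature commonly take the shape sketched below; treat the formula as an assumed, representative form rather than the paper's own.

```python
import numpy as np
from scipy.stats import kurtosis

def ratio_type_variance_estimator(y, x, Sx2, beta2_x, Md):
    """Assumed kurtosis/median ratio-type estimator of S_y^2:
    s_y^2 * (S_x^2*beta2 + Md) / (s_x^2*beta2 + Md),
    where Sx2, beta2_x, Md are the known population variance,
    kurtosis coefficient, and median of the auxiliary variable x."""
    sy2 = np.var(y, ddof=1)
    sx2 = np.var(x, ddof=1)
    return sy2 * (Sx2 * beta2_x + Md) / (sx2 * beta2_x + Md)

# Toy population and an SRSWOR sample of size 100.
rng = np.random.default_rng(1)
X = rng.gamma(2.0, 3.0, 5000)
Y = 2.0 * X + rng.normal(0, 2, 5000)        # positive correlation with x
Sx2, Md = np.var(X, ddof=1), np.median(X)
beta2_x = kurtosis(X, fisher=False)         # Pearson's beta_2
idx = rng.choice(5000, size=100, replace=False)
print(ratio_type_variance_estimator(Y[idx], X[idx], Sx2, beta2_x, Md))
```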
A new method is proposed to predict the fabric shearing property with least squares support vector machines (LS-SVM). A genetic algorithm is investigated to select the parameters of the LS-SVM model as a means of improving the LS-SVM prediction. After the sampling data are normalized, they are fed into the model to obtain the prediction result. The simulation results show that the prediction model gives better forecasting accuracy and generalization ability than a BP neural network and the linear regression method.
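LS-SVM regression replaces the usual SVM quadratic program with a single linear system, which is what makes parameter search (here by a genetic algorithm) cheap. A minimal numpy sketch follows, with the regularization gamma and RBF width sigma as the kind of parameters a GA would tune; the GA itself and the fabric data are omitted, so the values used are illustrative only.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # LS-SVM regression reduces to one linear system:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, y])
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy demo; in the paper a GA would search over (gamma, sigma).
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (60, 3))           # stand-in for normalized fabric features
y = np.sin(X.sum(1)) + 0.05 * rng.normal(size=60)
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=0.5)
print(lssvm_predict(X, b, alpha, X[:5], sigma=0.5))
```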
This paper deals with Bayesian inference and prediction problems for the Burr type XII distribution based on progressive first-failure censored data. We consider Bayesian inference under a squared error loss function. We propose to apply a Gibbs sampling procedure to draw Markov Chain Monte Carlo (MCMC) samples, which in turn are used to compute the Bayes estimates with the help of an importance sampling technique. We have performed a simulation study to compare the proposed Bayes estimators with the maximum likelihood estimators. We further consider two-sample Bayes prediction for predicting future order statistics and upper record values from the Burr type XII distribution based on progressive first-failure censored data. The predictive densities are obtained and used to determine prediction intervals for unobserved order statistics and upper record values. A real-life data set is used to illustrate the derived results.
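Under squared-error loss the Bayes estimate is the posterior mean, which any MCMC scheme can approximate. The paper pairs Gibbs sampling with importance sampling; as a simpler stand-in, the sketch below uses random-walk Metropolis on the log-parameters of the Burr XII density, with vague Gamma(1, 1) priors and complete (uncensored) toy data, so it illustrates the estimation target rather than the paper's exact sampler.

```python
import numpy as np

def burr_loglik(data, c, k):
    # Burr XII: f(x) = c*k*x^(c-1) * (1 + x^c)^(-(k+1)), x > 0
    return np.sum(np.log(c) + np.log(k) + (c - 1) * np.log(data)
                  - (k + 1) * np.log1p(data ** c))

def posterior_mean_mh(data, n_iter=20000, burn=5000, seed=0):
    """Random-walk Metropolis on (log c, log k) with Gamma(1,1) priors;
    the Bayes estimate under squared-error loss is the posterior mean
    of the retained draws."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)                       # (log c, log k)
    def logpost(t):
        c, k = np.exp(t)
        return burr_loglik(data, c, k) - c - k + t.sum()  # prior + log-Jacobian
    lp = logpost(theta)
    draws = []
    for i in range(n_iter):
        prop = theta + 0.1 * rng.normal(size=2)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if i >= burn:
            draws.append(np.exp(theta))
    return np.mean(draws, axis=0)

data = np.random.default_rng(3).pareto(2.0, 200) + 1e-3  # heavy-tailed toy data
print(posterior_mean_mh(data))                            # Bayes estimates of (c, k)
```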
Adaptive cluster sampling (ACS) has been a very important tool for estimating population parameters of rare and clustered populations. The fundamental idea behind this sampling plan is to select an initial sample from a defined population and to keep sampling within the vicinity of units that satisfy the condition that at least one characteristic of interest exists in a unit selected in the initial sample. Despite being an important tool for sampling rare and clustered populations, the adaptive cluster sampling design is unable to control the final sample size when no prior knowledge of the population is available. Adaptive cluster sampling with a data-driven stopping rule (ACS') was therefore proposed to control the final sample size when prior knowledge of the population structure is not available. This study examined the behavior of the Horvitz-Thompson (HT) and Hansen-Hurwitz (HH) estimators under the ACS and ACS' designs using an artificial population designed to have all the characteristics of a rare and clustered population. The efficiencies of the HT and HH estimators were used to determine the most efficient design for estimating the population mean of a rare and clustered population. Results for both the simulated data and the real data show that adaptive cluster sampling with a stopping rule is more efficient for estimating rare and clustered populations than ordinary adaptive cluster sampling.
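For reference, the two ACS estimators being compared have simple closed forms: the HH-type estimator averages the means of the networks intersected by the initial units, while the HT-type estimator weights each distinct network by its intersection probability. A minimal sketch with made-up network data (not the study's artificial population):

```python
import numpy as np
from math import comb

def hh_estimator(network_means):
    """Hansen-Hurwitz type ACS estimator: the average of the network
    means attached to each of the n initial sample units."""
    return float(np.mean(network_means))

def ht_estimator(network_totals, network_sizes, N, n):
    """Horvitz-Thompson type ACS estimator of the population mean.
    A network of size m is intersected by an SRSWOR initial sample
    of size n with probability pi = 1 - C(N-m, n) / C(N, n)."""
    total = 0.0
    for y_star, m in zip(network_totals, network_sizes):
        pi = 1 - comb(N - m, n) / comb(N, n)
        total += y_star / pi
    return total / N

# Toy example: N = 100 units, n = 5 initial draws hitting two distinct
# networks (sizes 4 and 1, totals 20 and 0).
print(ht_estimator([20, 0], [4, 1], N=100, n=5))
print(hh_estimator([5.0, 5.0, 0.0, 0.0, 0.0]))  # per-initial-unit network means
```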
This study presents the design of a modified attribute control chart based on a double sampling (DS) np chart applied in combination with generalized multiple dependent state (GMDS) sampling to monitor the mean life of a product, based on a time-truncated life test employing the Weibull distribution. The control chart developed supports the examination of mean lifespan variation for a particular product in the manufacturing process. Three control limit levels are used: the warning control limit, the inner control limit, and the outer control limit. Together, they enhance the capability for variation detection. A genetic algorithm can be used for optimization during the in-control process, whereby the optimal parameters can be established for the proposed control chart. The control chart performance is assessed using the average run length, while the influence of the model parameters on the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design with multiple linear regression. A comparative study was conducted based on the out-of-control average run length, in which the developed control chart offered greater sensitivity in detecting process shifts while using smaller samples on average than existing control charts. Finally, to exhibit the utility of the developed control chart, this paper presents its application using simulated data with parameters drawn from a real data set.
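The core quantity here, the average run length (ARL), is easy to see on a plain np chart for a time-truncated Weibull test: each item fails before the truncation time with a Weibull probability, the failure count is binomial, and ARL = 1/P(signal). The DS-GMDS chart in the paper is considerably more elaborate; the sketch below (with invented limits and parameters) only shows the underlying ARL calculation.

```python
from math import exp
from scipy.stats import binom

def fail_prob(t0, eta, beta):
    # P(item fails before truncation time t0) under Weibull(shape beta, scale eta)
    return 1 - exp(-((t0 / eta) ** beta))

def arl_np_chart(n, lcl, ucl, t0, eta, beta):
    """ARL of a plain np chart under a time-truncated Weibull test:
    signal when the failure count d falls outside [lcl, ucl]."""
    p = fail_prob(t0, eta, beta)
    p_in = binom.cdf(ucl, n, p) - binom.cdf(lcl - 1, n, p)
    return 1.0 / (1.0 - p_in)

# A 20% drop in the Weibull scale (mean life) sharply shortens the ARL.
print(arl_np_chart(n=50, lcl=3, ucl=20, t0=1.0, eta=2.0, beta=2.0))  # in control
print(arl_np_chart(n=50, lcl=3, ucl=20, t0=1.0, eta=1.6, beta=2.0))  # shifted
```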
This study analyzes the sample influx (samples per case file) into a forensic science laboratory (FSL) and the corresponding analysis costs, and uses arbitrary re-sampling plans to establish the minimum cost function. The demand for forensic analysis increased for all disciplines, especially biology/DNA, between 2014 and 2015. While the average distribution of case files was about 42.5%, 40.6% and 17% for the three disciplines, the distribution of samples was rather different, being 12%, 82.5% and 5.5% for samples requiring forensic biology, chemistry and toxicology analysis, respectively. Results show that most of the analysis workload was in forensic chemistry. The cost of analysis for case files and the corresponding sample influx varied in the ratios 35:6:1 and 28:12:1 for forensic chemistry, biology/DNA and toxicology in 2014 and 2015, respectively. In the two consecutive years, the cost of forensic chemistry analysis was comparatively very high, necessitating re-sampling. The time series of sample influx in all disciplines are strongly stochastic, with higher magnitude for chemistry, biology/DNA and toxicology, in that order. The PDFs of the sample influx data are highly skewed to the right, especially for forensic toxicology and biology/DNA, with peaks at 1 and 3 samples per case file. The arbitrary re-sampling plans were best suited to forensic chemistry case files (where re-sampling conditions apply). The locus of the arbitrary number of samples to take from the submitted forensic samples was used to establish the minimum scientifically acceptable number of samples by applying the minimization function developed in this paper. The cost minimization function was also developed based on the average cost per sample and the choice of re-sampling plan depending on the range of sample influx, from which the savings were determined and maximized. Thus, the study gives a forensic scientist a business model and a scientific decision-making tool on the minimum number of samples to analyze, focusing on savings in analysis cost.
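The abstract does not reproduce the paper's minimization function, but the basic cost-savings mechanics of a re-sampling plan can be sketched generically: cap the number of samples analyzed per case file and compare the capped cost against full analysis. Everything below (the Poisson influx, the per-sample cost, the caps) is invented for illustration.

```python
import numpy as np

def analysis_cost(influx, cost_per_sample, cap=None):
    """Total analysis cost for case files with `influx` samples each,
    optionally re-sampled down to at most `cap` samples per file."""
    n = np.minimum(influx, cap) if cap is not None else np.asarray(influx)
    return float(n.sum() * cost_per_sample)

rng = np.random.default_rng(4)
influx = rng.poisson(6, 500) + 1          # toy right-skewed sample influx
full = analysis_cost(influx, cost_per_sample=50.0)
for cap in (10, 5, 3):
    capped = analysis_cost(influx, 50.0, cap=cap)
    print(f"cap={cap}: cost={capped:.0f}, savings={full - capped:.0f}")
```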
We propose a new framework for the sampling, compression, and analysis of distributions of point sets and other geometric objects embedded in Euclidean spaces. Our approach involves constructing a tensor called the RaySense sketch, which captures nearest neighbors from the underlying geometry of points along a set of rays. We explore various operations that can be performed on the RaySense sketch, leading to different properties and potential applications. Statistical information about the data set can be extracted from the sketch, independent of the ray set. Line integrals on point sets can be efficiently computed using the sketch. We also present several examples illustrating applications of the proposed strategy in practical scenarios.
Funding: Supported by the National Science Foundation (Grant No. DMS-1440415); partially supported by a grant from the Simons Foundation, NSF Grants DMS-1720171 and DMS-2110895, and a Discovery Grant from the Natural Sciences and Engineering Research Council of Canada.
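To make the construction concrete, here is a toy reading of the idea (my interpretation of the abstract, not the authors' code): cast random rays through the data's bounding region, sample points along each ray, and record the nearest data point at each step, yielding a rays-by-steps-by-dimension tensor from which ray-independent statistics can be read off.

```python
import numpy as np
from scipy.spatial import cKDTree

def raysense_sketch(points, n_rays=32, n_steps=64, seed=0):
    """Toy RaySense-style sketch: for random rays through the data's
    bounding box, record the nearest data point at each step along
    the ray, giving an (n_rays, n_steps, dim) tensor."""
    rng = np.random.default_rng(seed)
    tree = cKDTree(points)
    lo, hi = points.min(0), points.max(0)
    dim = points.shape[1]
    sketch = np.empty((n_rays, n_steps, dim))
    for r in range(n_rays):
        origin = rng.uniform(lo, hi)
        direction = rng.normal(size=dim)
        direction /= np.linalg.norm(direction)
        ts = np.linspace(0.0, np.linalg.norm(hi - lo), n_steps)
        ray_pts = origin + ts[:, None] * direction
        _, idx = tree.query(ray_pts)      # nearest neighbors along the ray
        sketch[r] = points[idx]
    return sketch

pts = np.random.default_rng(5).normal(size=(1000, 3))
S = raysense_sketch(pts)
print(S.shape, S.mean(axis=(0, 1)))  # simple statistics extracted from the sketch
```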
The aim of this study is to investigate the impacts of landslide and non-landslide sampling strategies on the performance of landslide susceptibility assessment (LSA). The study area is the Feiyun catchment in Wenzhou City, Southeast China. Two types of landslide samples, combined with seven non-landslide sampling strategies, resulted in a total of 14 scenarios. The corresponding landslide susceptibility map (LSM) for each scenario was generated using the random forest model. The receiver operating characteristic (ROC) curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy. The results showed that higher accuracies were achieved when using the landslide core as positive samples, combined with non-landslide sampling from the very low zone or buffer zone. The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA, which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.
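The evaluation loop for one such scenario is straightforward to sketch: choose positive and negative cells under a given strategy, fit a random forest on conditioning factors, and compare scenarios by AUC. The features and sampling zones below are synthetic stand-ins, not the Feiyun catchment data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
# Toy stand-ins: rows are map cells with conditioning factors
# (slope, elevation, ...); 1 = landslide cell, 0 = non-landslide.
X_pos = rng.normal(1.0, 1.0, (300, 5))        # e.g., landslide "core" cells
X_neg = rng.normal(-1.0, 1.0, (900, 5))       # e.g., cells from a very-low zone
X = np.vstack([X_pos, X_neg])
y = np.r_[np.ones(300), np.zeros(900)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
prob = rf.predict_proba(X_te)[:, 1]           # susceptibility scores for the LSM
print("AUC:", roc_auc_score(y_te, prob))      # ROC comparison across scenarios
```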
There exists a great variety of posturographic parameters, which complicates the evaluation of center of pressure (COP) data. Hence, recommendations have been given to use a set of complementary parameters to explain most of the variance. However, it is unknown whether a dual-task paradigm leads to different parametrization sets. To address this problem, an exploratory factor analysis approach was conducted in a dual-task experiment. Sixteen healthy subjects stood on a force plate performing a posture-cognition dual task (DT, focus of attention on a secondary task) with respect to different sampling durations. The subjects were not aware of being measured, in contrast to the baseline task condition (BT, internal focus of attention) in the previously published part I. In comparison to BT, a different factor loading pattern appears. In addition, factor loadings are strongly affected by different sampling durations. DT reveals a change of factor loading structure with longer sampling durations compared to BT. Specific recommendations concerning a framework of posturographic parametrization are given.
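The analysis step itself, extracting a factor loading pattern from a subjects-by-parameters matrix, can be sketched as follows. The data here are synthetic (two planted latent factors), and the sample size simply mirrors the study's 16 subjects.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Toy COP parameter matrix: 16 subjects x 8 posturographic measures
# driven by two latent factors (e.g., sway amplitude and frequency).
latent = rng.normal(size=(16, 2))
loadings = rng.normal(size=(2, 8))
params = latent @ loadings + 0.3 * rng.normal(size=(16, 8))

Z = StandardScaler().fit_transform(params)      # standardize each parameter
fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)
print(np.round(fa.components_.T, 2))            # loading pattern: parameters x factors
```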
It is a challenge in field sampling to face the conflict between statistical requirements and logistical constraints when explicitly estimating macrobenthos species richness in heterogeneous intertidal wetlands. To solve this problem, this study designed an optimal, efficient and practical sampling strategy by comprehensively focusing on the three main parts of the entire process (optimizing the sampling method, determining the minimum sampling effort, and exploring the proper sampling interval) in a typical intertidal wetland of the Changjiang (Yangtze) Estuary, China. Transect sampling was selected and optimized by stratification based on pronounced habitat types (tidal flat, tidal creek, salt marsh vegetation); this type of sampling is also termed within-transect stratification sampling. The optimal sampling intervals and the minimum sampling effort were determined by two numerical methods: Monte Carlo simulations and accumulative species curves. The results show that within-transect stratification sampling with typical habitat types was effective for encompassing 81% of the species, suggesting that this type of sampling design can largely reduce the sampling effort and labor. The optimal sampling intervals and minimum sampling efforts for the three habitats were determined: sampling effort must exceed 1.8 m² at 10 m intervals in the salt marsh vegetation, 2 m² at 10 m intervals in the tidal flat, and 3 m² at 1 m intervals in the tidal creek habitat. It is suggested that the differences were influenced by the mobility range of the dominant species and the habitats' physical differences (e.g., tidal water, substrate, vegetation cover). The optimized sampling strategy could provide good precision in the richness estimation of macrobenthos and balance the sampling effort. Moreover, the conclusions presented here provide a reference for recommendations to consider before macrobenthic surveys take place in estuarine wetlands. The sampling strategy, focusing on the three key parts of the sampling design, had a good operational effect and could be used as a guide for field sampling for habitat management or ecosystem assessment.
Funding: The Special Scientific Research Funds for Central Non-profit Institutes (East China Sea Fisheries Research Institute) under contract No. 2016T08; the National Natural Science Foundation of China under contract No. 31400410.
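An accumulative species curve of the kind used here is built by Monte Carlo permutation of the sample order: average the cumulative species count over many random orderings, then read off the effort needed to reach a target fraction of the species pool. The quadrat data below are synthetic; the 81% target echoes the figure in the abstract.

```python
import numpy as np

def accumulation_curve(samples, n_perm=500, seed=0):
    """Accumulative species curve by Monte Carlo: permute the sample
    order many times and average the cumulative species count.
    `samples` is a list of per-sample species sets."""
    rng = np.random.default_rng(seed)
    n = len(samples)
    curve = np.zeros(n)
    for _ in range(n_perm):
        seen = set()
        for i, j in enumerate(rng.permutation(n)):
            seen |= samples[j]
            curve[i] += len(seen)
    return curve / n_perm

# Toy habitat: 30 quadrats drawing from a pool of 40 species.
rng = np.random.default_rng(8)
quadrats = [set(rng.choice(40, size=rng.integers(2, 8), replace=False))
            for _ in range(30)]
curve = accumulation_curve(quadrats)
total = len(set().union(*quadrats))
print("samples needed for 81% of species:", int(np.argmax(curve >= 0.81 * total)) + 1)
```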
AIM: To investigate the Chinese version of the Low Vision Quality of Life Questionnaire (CLVQOL) as an instrument for obtaining clinically important changes after cataract surgery.
METHODS: Patients who underwent cataract surgery at Shanghai General Hospital, Shanghai Jiao Tong University, and who met the inclusion criteria were recruited. Two CLVQOLs were administered, a preoperative CLVQOL and a CLVQOL at the end of the 3-month follow-up period, completed in face-to-face interviews or phone interviews conducted by trained investigators. The minimal clinically important difference (MCID) was calculated using an anchor-based method and a distribution-based method. In addition, the responsiveness of the questionnaire was measured.
RESULTS: A total of 155 residents were enrolled. The average visual acuity (VA) was 0.08 (SD = 0.05) preoperatively and increased to 0.47 (SD = 0.28) at the end of follow-up. Statistically significant positive changes in the CLVQOL scores indicated significant improvement of vision-related quality of life after cataract surgery. Taking the larger value between the two results as the final value, the MCID values of the CLVQOL (scores of the four scales as well as the total score) were 8.94, 2.61, 4.34, 3.10 and 17.63, respectively. The CLVQOL has both good internal and external responsiveness.
CONCLUSION: CLVQOL scores are an appropriate instrument for obtaining clinically important changes after cataract surgery. This study is an effective exploration toward establishing cataract surgery efficacy standards, which helps clinical and scientific research workers in ophthalmology gain a more in-depth understanding when using the CLVQOL.
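The MCID mechanics are the same as in the first abstract above; what is distinctive here is the responsiveness assessment. Two standard internal-responsiveness indices are sketched below on synthetic pre/post scores (the numbers only loosely echo the abstract's follow-up figures and are not the study data).

```python
import numpy as np

def responsiveness(pre, post):
    """Internal responsiveness of a questionnaire: Cohen's effect size
    (mean change / SD of baseline scores) and the standardized
    response mean, SRM (mean change / SD of change scores)."""
    change = post - pre
    es = change.mean() / pre.std(ddof=1)
    srm = change.mean() / change.std(ddof=1)
    return es, srm

rng = np.random.default_rng(9)
pre = rng.normal(60, 12, 155)              # toy preoperative CLVQOL scores
post = pre + rng.normal(17.6, 10, 155)     # toy 3-month follow-up scores
es, srm = responsiveness(pre, post)
print(f"effect size = {es:.2f}, SRM = {srm:.2f}")
```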
To address the problems of slow search and tortuous paths in the Rapidly-exploring Random Tree (RRT) algorithm, a feedback-biased sampling RRT, called FS-RRT, is proposed based on RRT. First, to shorten the search time, the search area of the random tree is restricted to improve the sampling efficiency. Second, to obtain better information about obstacles and shorten the path length, a feedback-biased sampling strategy is used instead of traditional random sampling: a collision between an expanding node and an obstacle generates feedback information so that the next expanding node avoids expanding within a specific angle range. Third, this paper proposes an inverse optimization strategy to remove redundant points from the initial path, making the path shorter and more accurate. Finally, to ensure smooth operation of the robot in practice, auxiliary points are used to optimize the cubic Bezier curve so that the path avoids crossing obstacles during Bezier curve optimization. The experimental results demonstrate that, compared to the traditional RRT algorithm, the proposed FS-RRT algorithm performs favorably against mainstream algorithms regarding running time, number of search iterations, and path length. Moreover, the improved algorithm also performs well in narrow obstacle environments, and its effectiveness is further confirmed by experimental verification.
Funding: Provided by Shaanxi Province's Key Research and Development Plan (No. 2022NY-087).
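The smoothing stage is the easiest piece to illustrate: a cubic Bezier curve rounds a corner of the piecewise-linear path, with auxiliary control points pulled toward the corner waypoint. The 0.5 pull factor below is an illustrative tuning choice, not the paper's value.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Sample a cubic Bezier curve B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1
    + 3(1-t) t^2 p2 + t^3 p3, used to smooth a piecewise-linear path."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Smooth one corner of a path with auxiliary control points pulled
# toward the corner waypoint.
waypoints = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
corner = waypoints[1]
aux1 = corner + 0.5 * (waypoints[0] - corner)   # auxiliary control points
aux2 = corner + 0.5 * (waypoints[2] - corner)
curve = cubic_bezier(aux1, corner, corner, aux2)
print(curve[0], curve[-1])    # curve enters and leaves near the corner
```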
Dispersion fuels, known for their excellent safety performance, are widely used in advanced reactors, such as high-temperature gas-cooled reactors. Compared with deterministic methods, the Monte Carlo method has more advantages in the geometric modeling of stochastic media. The explicit modeling method has high computational accuracy but high computational cost. The chord length sampling (CLS) method can improve computational efficiency by sampling the chord length during neutron transport using the matrix chord length's probability density function. This study shows that the excluded-volume effect in realistic stochastic media can introduce certain deviations into CLS. A chord length correction approach is proposed to obtain the chord length correction factor by developing the Particle code based on an equivalent transmission probability. Through numerical analysis against reference solutions from explicit modeling in the RMC code, it is demonstrated that CLS with the proposed correction method provides good accuracy in addressing the excluded-volume effect in realistic infinite stochastic media.
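The core CLS move is to replace explicit particle geometry with a draw from a chord-length distribution; an exponential distribution for the matrix chords is the common modeling assumption, and a multiplicative correction factor of the kind the paper derives rescales its mean. The sketch below is generic (the exponential form, 0.2 mean chord, and 1.15 factor are all illustrative assumptions, not the paper's values).

```python
import numpy as np

def sample_chord(mean_chord, correction, rng):
    """CLS step: distance to the next matrix/particle interface, drawn
    from an exponential chord-length distribution. `correction`
    rescales the mean chord (the paper derives such a factor from an
    equivalent transmission probability to handle excluded volume)."""
    return rng.exponential(mean_chord * correction)

# Toy loop: mean flight length through matrix before hitting a fuel
# particle, with and without an illustrative correction factor.
rng = np.random.default_rng(10)
for corr in (1.0, 1.15):
    flights = [sample_chord(0.2, corr, rng) for _ in range(100000)]
    print(f"correction={corr}: mean matrix chord = {np.mean(flights):.4f}")
```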
This paper is an extension of the Hanif, Hamad and Shahbaz estimator [1] to two-phase sampling. The aim of this paper is to develop a regression-type estimator with two auxiliary variables for two-phase sampling when no information about the auxiliary variables is available at the population level. To avoid multicollinearity, it is assumed that the two auxiliary variables have minimal correlation. The mean square error and bias of the proposed estimator in two-phase sampling are derived. The mean square error of the proposed estimator shows an improvement over other well-known estimators under the same conditions.
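The abstract does not state the estimator's exact form, but a generic regression-type two-phase estimator with two auxiliaries illustrates the setting: a large first-phase sample observes only the auxiliaries, a second-phase subsample also observes y, and first-phase auxiliary means calibrate the second-phase mean. Treat the form below as a textbook stand-in, not the paper's estimator.

```python
import numpy as np

def two_phase_regression(y2, x2, z2, x1_mean, z1_mean):
    """Generic regression-type two-phase estimator with two auxiliaries:
    ybar = ybar2 + b_x*(xbar1 - xbar2) + b_z*(zbar1 - zbar2),
    with b_x, b_z the sample regression slopes from phase two."""
    b_x = np.cov(y2, x2, ddof=1)[0, 1] / np.var(x2, ddof=1)
    b_z = np.cov(y2, z2, ddof=1)[0, 1] / np.var(z2, ddof=1)
    return y2.mean() + b_x * (x1_mean - x2.mean()) + b_z * (z1_mean - z2.mean())

rng = np.random.default_rng(11)
# Phase one: large sample observing only the auxiliaries x and z.
x1, z1 = rng.normal(10, 2, 1000), rng.normal(5, 1, 1000)
# Phase two: subsample where y is also measured.
idx = rng.choice(1000, 100, replace=False)
x2, z2 = x1[idx], z1[idx]
y2 = 3 * x2 + 1.5 * z2 + rng.normal(0, 1, 100)
print(two_phase_regression(y2, x2, z2, x1.mean(), z1.mean()))
```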
Wideband spectrum sensing with a high-speed analog-to-digital converter (ADC) presents a challenge for practical systems. The Nyquist folding receiver (NYFR) is a promising scheme for achieving cost-effective real-time spectrum sensing, but it is subject to the complexity of processing the modulated outputs. Here, a multipath NYFR architecture with a step-sampling rate for the different paths is proposed. The different numbers of digital channels for each path are designed based on the Chinese remainder theorem (CRT). Then, the detectable frequency range is divided into multiple frequency grids, and the Nyquist zone (NZ) of the input can be obtained by sensing these grids. Thus, high-precision parameter estimation is performed by utilizing the NYFR characteristics. Compared with existing methods, the scheme proposed in this paper overcomes the challenges of NZ estimation, information damage, heavy computation, low accuracy, and high false-alarm probability. Comparative simulation experiments verify the effectiveness of the proposed architecture.
Funding: Supported by the Key Projects of the 2022 National Defense Science and Technology Foundation Strengthening Plan 173 (Grant No. 2022-173ZD-010) and the Equipment Pre-Research Foundation of The State Key Laboratory (Grant No. 6142101200204).
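The CRT's role in resolving Nyquist-zone ambiguity can be shown with an idealized residue model: each path with a (pairwise coprime) sample rate observes only the input frequency modulo that rate, and the CRT recombines the residues into the true frequency. Real NYFR aliasing folds rather than wraps, so this integer-grid sketch only illustrates the number-theoretic step.

```python
from math import prod

def crt(residues, moduli):
    """Chinese remainder theorem: recover x mod prod(moduli) from the
    residues x mod m_i, for pairwise-coprime moduli m_i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

# Toy ambiguity resolution on an integer frequency grid: each path
# with step-sampled rate fs sees only f mod fs.
f_true = 8191                           # Hz, unknown to the receiver
rates = (251, 257, 263)                 # coprime per-path rates (illustrative)
aliases = [f_true % fs for fs in rates]
print(crt(aliases, rates))              # -> 8191, identifying the Nyquist zone
```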