In fishery resource surveys, the sampling design directly affects the accuracy of abundance estimation; it is therefore necessary to optimize the sampling design to increase the quality of fishery surveys. The distribution and abundance of fishery resources estimated from bottom trawl survey data collected in 2007 in the Changjiang River (Yangtze River) Estuary-Hangzhou Bay and its adjacent waters were used to simulate the "true" situation. The abundance index of Portunus trituberculatus was then calculated and compared with its true index to evaluate the impacts of different sampling designs on abundance estimation. Four sampling methods (fixed-station sampling, simple random sampling, stratified fixed-station sampling, and stratified random sampling) were simulated. Three numbers of stations (9, 16, and 24) were assumed for the unstratified fixed-station and simple random sampling scenarios, while 16 stations were assumed for the stratified scenarios. Three reaction distances (1.5 m, 3 m, and 5 m) of P. trituberculatus to the bottom line of the trawl were also assumed, reflecting the movement ability of P. trituberculatus at different ages, in different seasons, and over different substrate conditions. In general, stratified sampling designs produced more accurate abundance estimates of P. trituberculatus than unstratified designs, and simple random sampling performed better than fixed-station sampling. The accuracy of the simulated results improved as the number of stations increased: for the fixed-station scenario, the relative estimation error (REE) ranged from a maximum of 163.43% to a minimum of 49.40% with 9 stations, versus a maximum of 38.62% and a minimum of 4.15% with 24 stations. As the reaction distance increased, the relative absolute bias (RAB) and REE gradually decreased. Resource-intensive areas and seasons with high density variance had significant impacts on the simulation results, so prior information or pre-survey results on density distribution would be helpful. This study can serve as a reference for future bottom trawl sampling designs for P. trituberculatus and other species.
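The abstract reports REE and RAB but does not spell out the formulas. The sketch below uses the definitions commonly adopted in this kind of design-simulation study (RMSE and mean absolute deviation of replicate estimates, scaled by the "true" index); the toy density surface and station counts are invented for illustration only.

```python
import numpy as np

def ree(estimates, true_value):
    """Relative estimation error (%): RMSE of replicate estimates relative to
    the 'true' abundance index (a common definition, assumed here)."""
    estimates = np.asarray(estimates, dtype=float)
    rmse = np.sqrt(np.mean((estimates - true_value) ** 2))
    return 100.0 * rmse / true_value

def rab(estimates, true_value):
    """Relative absolute bias (%): mean absolute deviation of replicate
    estimates relative to the 'true' abundance index."""
    estimates = np.asarray(estimates, dtype=float)
    return 100.0 * np.mean(np.abs(estimates - true_value)) / true_value

# Toy usage: 1000 simple-random-sampling replicates of 9 stations drawn from a
# hypothetical patchy "true" density surface (all numbers invented).
rng = np.random.default_rng(0)
true_density = rng.lognormal(mean=1.0, sigma=1.2, size=(30, 30))
true_index = true_density.mean()
reps = [true_density.ravel()[rng.choice(true_density.size, 9, replace=False)].mean()
        for _ in range(1000)]
print(f"REE = {ree(reps, true_index):.1f}%, RAB = {rab(reps, true_index):.1f}%")
```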
Fishery-independent surveys can provide high-quality data to support fishery assessment and management, and optimization of the sampling design is crucial to increasing the quality of such surveys. Crab pots are important fishing gears used to catch crabs. We analyzed the impacts of crab-pot sampling designs on the abundance estimation of Portunus trituberculatus in the waters from the Changjiang (Yangtze) River estuary to Hangzhou Bay and the adjacent East China Sea. The crab pots were cylindrical, 240 mm in height and 600 mm in diameter at the iron ring. Four sampling designs (fixed-station sampling, simple random sampling, stratified fixed-station sampling, and stratified random sampling), three numbers of stations (9, 16, and 24), and three numbers of crab pots per station (500, 1000, and 3000) were simulated and compared against the "true" abundance obtained from bottom trawl surveys conducted in the study area in 2007. The stratified scenarios were set with 16 stations as a control group for comparison with the unstratified designs. The results show that simple random sampling produced more stable abundance estimates of P. trituberculatus than fixed-station sampling, and that stratified sampling produced more accurate abundance estimates than unstratified sampling. The accuracy of the simulated results improved as the number of stations increased, whereas no remarkable differences were found among scenarios with different numbers of crab pots at each station. Resource-intensive areas, however, exerted great impacts on the simulation results, so prior information or pre-survey results on resource abundance and density distribution are necessary. This study may serve as a reference for future crab-pot sampling designs for P. trituberculatus and other species.
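The comparison between stratified and unstratified designs rests on the standard area-weighted stratified estimator of mean abundance. A minimal sketch follows; the strata, weights, and catch values are invented and the helper names are illustrative, not the paper's.

```python
import numpy as np

def srs_mean(sample_values):
    """Simple random sampling estimator: the unweighted sample mean."""
    return float(np.mean(sample_values))

def stratified_mean(samples_by_stratum, stratum_weights):
    """Stratified estimator: weight each stratum mean by its share of the
    survey area (weights must sum to 1)."""
    means = np.array([np.mean(s) for s in samples_by_stratum])
    w = np.asarray(stratum_weights, dtype=float)
    return float(np.sum(w * means))

# Illustrative numbers only: three strata covering 50%, 30%, and 20% of the
# area, with catch-per-pot samples drawn in each stratum.
samples = [np.array([2.0, 3.5, 1.0, 4.0]),   # stratum A
           np.array([10.0, 12.5, 9.0]),      # stratum B (dense area)
           np.array([0.5, 0.0, 1.0])]        # stratum C
weights = [0.5, 0.3, 0.2]
print(stratified_mean(samples, weights))   # area-weighted abundance index
print(srs_mean(np.concatenate(samples)))   # naive pooled mean, for comparison
```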
Fixed-station sampling designs are widely used in fishery-independent surveys because sampling stations are convenient to set, but their non-probabilistic (fixed) nature introduces more uncertainty when drawing inferences about the population. The performance of fixed-station sampling for multispecies surveys has not been evaluated, and it is unclear whether the design can detect the temporal trends of different populations in a multispecies fishery-independent survey. In this study, the spatial distributions of abundance indices for three species with different spatial distribution patterns, small yellow croaker (Larimichthys polyactis), whitespotted conger (Conger myriaster), and Fang's blenny (Enedrias fangi), were simulated using ordinary kriging interpolation as the "true" population distributions. The performance of the fixed-station sampling design was compared with that of simple random sampling by resampling the simulated "true" populations. The results showed that fixed-station sampling had the power to detect seasonal trends in species abundance, but its effectiveness differed among species distribution patterns. When a species was evenly distributed, fixed-station sampling could produce high-quality abundance data; when the distribution was uneven, heterogeneous, or patchy, fixed-station sampling tended to underestimate or overestimate abundance. Consequently, abundance indices estimated from fixed-station sampling must be calibrated cautiously before being used in fisheries stock assessment and management. This study suggests that fixed-station sampling can capture the temporal dynamics of population abundance, but its abundance estimates should not be treated as absolute estimates of the populations.
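The core of the simulation is resampling a "true" surface with a fixed grid versus fresh random stations. The sketch below illustrates that contrast under assumed inputs: a synthetic patchy field stands in for the kriged surfaces (the ordinary-kriging step itself is not reproduced), and it shows how a fixed design has no replicate-to-replicate variance but can carry a persistent bias.

```python
import numpy as np

rng = np.random.default_rng(1)

# A patchy synthetic field stands in for the kriged "true" abundance surface.
field = rng.gamma(shape=0.5, scale=20.0, size=(40, 40))
true_mean = field.mean()

n_st, n_rep = 16, 500
# Fixed-station design: the same systematic 4 x 4 grid in every replicate.
rows, cols = np.meshgrid(np.arange(5, 40, 10), np.arange(5, 40, 10), indexing="ij")
fixed_idx = (rows * field.shape[1] + cols).ravel()
fixed_est = np.full(n_rep, field.ravel()[fixed_idx].mean())

# Simple random design: a fresh draw of 16 cells in every replicate.
random_est = np.array([field.ravel()[rng.choice(field.size, n_st, replace=False)].mean()
                       for _ in range(n_rep)])

for name, est in [("fixed-station", fixed_est), ("simple random", random_est)]:
    bias = 100.0 * (est.mean() - true_mean) / true_mean
    print(f"{name:14s} relative bias {bias:+6.1f}%, replicate SD {est.std():.2f}")
```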
In this paper, a joint analysis consisting of goodness-of-fit tests and Markov chain Monte Carlo simulations is used to assess the performance of several ranked set sampling designs. The Markov chain Monte Carlo simulations are conducted for Bayesian methods with Jeffreys priors on the unknown parameters of the Weibull distribution, while the goodness-of-fit analysis is conducted for the likelihood estimators and the corresponding empirical distributions. The ranked set sampling designs considered in this research are the usual ranked set sampling, extreme ranked set sampling, median ranked set sampling, and neoteric ranked set sampling designs. An intensive Monte Carlo simulation study is conducted using Lindley's approximation algorithm to compute the estimators based on the different designs. The study shows that the dependent design, the neoteric ranked set sampling design, is superior to the other ranked set designs, and its total relative efficiency is higher than that of the other designs.
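For readers unfamiliar with the sampling schemes being compared, the sketch below generates usual, median, and extreme ranked set samples from a Weibull population (the abstract's parametric setting); the Bayesian/Lindley estimation step is not reproduced, the extreme-RSS variant shown is one common version, and the neoteric design is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def rss(draw, m):
    """Usual ranked set sampling: from the i-th ranked set take the i-th order statistic."""
    sets = np.sort(draw((m, m)), axis=1)        # m sets of m units, each ranked
    return sets[np.arange(m), np.arange(m)]      # diagonal elements

def median_rss(draw, m):
    """Median RSS: take the sample median of every ranked set."""
    sets = np.sort(draw((m, m)), axis=1)
    return sets[:, m // 2]

def extreme_rss(draw, m):
    """Extreme RSS (one common variant): minima from the first half of the
    sets and maxima from the remainder."""
    sets = np.sort(draw((m, m)), axis=1)
    half = m // 2
    return np.concatenate([sets[:half, 0], sets[half:, -1]])

# Illustrative population: Weibull with shape 1.5 and scale 2 (values invented).
draw = lambda size: 2.0 * rng.weibull(1.5, size=size)
m = 5
print(rss(draw, m), median_rss(draw, m), extreme_rss(draw, m), sep="\n")
```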
Sampling design (SD) plays a crucial role in providing reliable input for digital soil mapping (DSM) and in increasing its efficiency. Sampling design, with a predetermined sample size and consideration of budget and spatial variability, is a selection procedure for identifying a set of sample locations spread over a geographical space or with good feature-space coverage. Good feature-space coverage ensures accurate estimation of regression parameters, while spatial coverage contributes to effective spatial interpolation. First, we review several statistical and geometric SDs that mainly optimize the sampling pattern in geographical space and illustrate their strengths and weaknesses in terms of spatial coverage, simplicity, accuracy, and efficiency. Furthermore, Latin hypercube sampling, which obtains a full representation of the multivariate distribution in geographical space, is described in detail with respect to its development, improvement, and application. In addition, we discuss fuzzy k-means sampling, response surface sampling, and Kennard-Stone sampling, which optimize sampling patterns in a feature space. We then discuss practical applications that are mainly addressed by conditioned Latin hypercube sampling, which offers the flexibility and feasibility of adding multiple optimization criteria. We also discuss different methods of validation, an important stage of DSM, and conclude that an independent dataset selected by probability sampling is superior because it is free of model assumptions. For future work, we recommend: 1) exploring SDs with both good spatial coverage and good feature-space coverage; 2) uncovering the real impacts of an SD on the entire DSM procedure; and 3) testing the feasibility and contribution of SDs in three-dimensional (3D) DSM with variability across multiple layers.
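As a concrete anchor for the Latin hypercube idea the review centres on, here is a minimal plain LHS generator (not the conditioned cLHS variant discussed in the text); the covariate names and ranges in the usage example are invented.

```python
import numpy as np

def latin_hypercube(n, k, rng=None):
    """Plain Latin hypercube sample of n points in k dimensions on [0, 1)^k:
    each variable's range is split into n equal strata and every stratum is
    hit exactly once per dimension."""
    rng = rng or np.random.default_rng()
    # One random point inside each of the n strata, per dimension ...
    u = (np.arange(n)[:, None] + rng.random((n, k))) / n
    # ... then shuffle the strata independently in every dimension.
    for j in range(k):
        u[:, j] = u[rng.permutation(n), j]
    return u

# Example: 20 sample locations described by two covariates (say, elevation in
# metres and slope in degrees); scale the unit-cube sample onto those ranges.
rng = np.random.default_rng(3)
lo, hi = np.array([100.0, 0.0]), np.array([450.0, 35.0])   # illustrative ranges
design = lo + latin_hypercube(20, 2, rng) * (hi - lo)
print(design[:5])
```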
Non-agricultural lands are generally surveyed sparsely, yet soils in these areas usually exhibit strong spatial variability, which requires more samples to produce acceptable estimates. Capulin Volcano National Monument, a typical sparsely surveyed area, was chosen to assess the spatial variability of a range of soil properties and to investigate its implications for sampling design. One hundred and forty-one composited soil samples were collected across the Monument and the surrounding areas. Soil properties including pH, organic matter content, extractable elements such as calcium (Ca), magnesium (Mg), potassium (K), sodium (Na), phosphorus (P), sulfur (S), zinc (Zn), and copper (Cu), as well as sand, silt, and clay percentages, were analyzed for each sample. Semivariograms of all properties were constructed, standardized, and compared to estimate the spatial variability of soil properties in the area. Based on the similarity among the standardized semivariograms, we found that the semivariograms could be generalized separately for physical and chemical properties. The generalized semivariogram for physical properties had a much greater sill (2.635) and effective range (7500 m) than that for chemical properties. Optimal sampling density (OSD), derived from the generalized semivariogram and defining the relationship between sampling density and expected error percentage, was proposed to represent, interpret, and compare soil spatial variability and to guide sample scheme design. The OSDs showed that chemical properties exhibit stronger local spatial variability than soil texture parameters, implying that more samples or analyses are required to achieve a similar level of precision.
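The analysis is built on empirical semivariograms. The sketch below is the classical (Matheron) estimator only; the standardization, model fitting, and OSD derivation described in the abstract are not reproduced, and the sample coordinates and pH values are invented.

```python
import numpy as np

def empirical_semivariogram(coords, values, n_bins=12, max_lag=None):
    """Classical (Matheron) semivariogram estimator: for each lag bin,
    gamma(h) = 0.5 * mean of squared differences over all point pairs whose
    separation distance falls in that bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    max_lag = max_lag or d.max() / 2
    edges = np.linspace(0, max_lag, n_bins + 1)
    lags, gammas = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (d >= lo) & (d < hi)
        if mask.any():
            lags.append(d[mask].mean())
            gammas.append(sq[mask].mean())
    return np.array(lags), np.array(gammas)

# Toy usage with made-up sample locations (metres) and a soil property (pH).
rng = np.random.default_rng(4)
xy = rng.uniform(0, 7500, size=(141, 2))
ph = 6.5 + 0.0001 * xy[:, 0] + rng.normal(0, 0.3, 141)
lag, gamma = empirical_semivariogram(xy, ph)
print(np.column_stack([lag, gamma]))
```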
The world needs around 150 Pg of negative carbon emissions to mitigate climate change. Global soils may provide a stable, sizeable reservoir to help achieve this goal by sequestering atmospheric carbon dioxide as soil organic carbon (SOC). In turn, SOC can support healthy soils and provide a multitude of ecosystem benefits. To support SOC sequestration, researchers and policy makers must be able to precisely measure the amount of SOC in a given plot of land. SOC measurement is typically accomplished by taking soil cores selected at random from the plot under study, mixing (compositing) some of them together, and analyzing (assaying) the composited samples in a laboratory. Compositing reduces assay costs, which can be substantial; taking samples is also costly. Given the uncertainties and costs of both sampling and assay, along with a desired estimation precision, there is an optimal composite size that minimizes the budget required to achieve that precision. Conversely, given a fixed budget, there is a composite size that minimizes uncertainty. In this paper, we describe and formalize sampling and assay for SOC and derive the optima for three commonly used assay methods: dry combustion in an elemental analyzer, loss-on-ignition, and mid-infrared spectroscopy. We demonstrate the utility of this approach using data from a soil survey conducted in California. We give recommendations for practice and provide software to implement our framework.
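The trade-off the paper formalizes can be illustrated with a textbook composite-sampling error model, which is not necessarily the paper's exact derivation: with n composites of k cores each, Var(mean) ≈ σ_s²/(nk) + σ_a²/n and cost = n·k·c_core + n·c_assay, giving an optimal k = (σ_s/σ_a)·√(c_assay/c_core). All numbers in the example are invented.

```python
import numpy as np

def optimal_composite(sigma_s, sigma_a, cost_core, cost_assay, target_se):
    """Optimal composite size under a standard two-stage error model
    (a textbook formulation, assumed here):
        Var(mean) = sigma_s^2 / (n*k) + sigma_a^2 / n
        Cost      = n*k*cost_core + n*cost_assay
    with n composites assayed and k cores per composite."""
    k = (sigma_s / sigma_a) * np.sqrt(cost_assay / cost_core)   # cores per composite
    k = max(1, round(k))
    n = (sigma_s**2 / k + sigma_a**2) / target_se**2            # composites needed
    n = int(np.ceil(n))
    cost = n * k * cost_core + n * cost_assay
    return k, n, cost

# Illustrative inputs (all invented): SOC spatial SD 6 g/kg, assay SD 1.5 g/kg,
# $5 per core, $20 per dry-combustion assay, target standard error 1 g/kg.
print(optimal_composite(sigma_s=6.0, sigma_a=1.5, cost_core=5.0,
                        cost_assay=20.0, target_se=1.0))
```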
The preparation of samples containing bovine serum albumin (BSA), e.g., those used in transdermal Franz diffusion cell (FDC) solutions, was evaluated using an analytical quality-by-design (QbD) approach. Traditional precipitation of BSA by adding an equal volume of organic solvent, often used successfully with conventional HPLC-PDA, was found to be insufficiently robust when novel fused-core HPLC and/or UPLC-MS methods were used. In this study, three factors (acetonitrile (%), formic acid (%), and boiling time (min)) were included in the experimental design to determine an optimal and more suitable sample treatment for BSA-containing FDC solutions. Using a QbD and Derringer desirability (D) approach that combined BSA loss, dilution factor, and variability, we constructed an optimal working space with the edge of failure defined as D < 0.9. The design space was modelled and confirmed to have an ACN range of 83 ± 3% and an FA content of 1 ± 0.25%.
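The Derringer approach combines individual desirabilities (each scaled to [0, 1]) into an overall D via a geometric mean. A minimal sketch follows; the acceptance limits, response values, and "smaller-is-better" choices are illustrative assumptions, not the study's actual settings.

```python
import numpy as np

def d_smaller_is_better(y, low, high, s=1.0):
    """Derringer desirability for a response to be minimised: 1 at or below
    `low`, 0 at or above `high`, with a power-s transition in between."""
    y = np.asarray(y, float)
    return np.clip((high - y) / (high - low), 0.0, 1.0) ** s

def overall_desirability(*ds):
    """Overall D = geometric mean of the individual desirabilities."""
    ds = np.stack(ds)
    return np.prod(ds, axis=0) ** (1.0 / len(ds))

# Illustrative only: combine BSA loss (%), dilution factor, and RSD (%) into D
# and flag factor settings inside the D >= 0.9 working space. Limits invented.
bsa_loss = np.array([2.0, 8.0, 20.0])
dilution = np.array([2.0, 3.0, 6.0])
rsd      = np.array([1.0, 4.0, 9.0])
D = overall_desirability(d_smaller_is_better(bsa_loss, 1, 25),
                         d_smaller_is_better(dilution, 2, 8),
                         d_smaller_is_better(rsd, 1, 10))
print(D, D >= 0.9)
```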
In digital soil mapping (DSM), a fundamental assumption is that the spatial variability of the target variable can be explained by the predictors, or environmental covariates. Strategies for adequately sampling the predictors have been well documented, with the conditioned Latin hypercube sampling (cLHS) algorithm receiving the most attention in the DSM community. Despite advances in sampling design, a critical gap remains in determining the number of samples required for DSM projects. We propose a simple workflow, with a function coded in the R language, to determine the minimum sample size for the cLHS algorithm based on histograms of the predictor variables, using the Freedman-Diaconis rule to determine the optimal bin width. Data preprocessing was included to correct for multimodal and non-normally distributed data, as these can affect sample size determination from the histogram. Based on a user-selected quantile range (QR) for the sample plan, the densities of the histogram bins at the upper and lower bounds of the QR were used as a scaling factor to determine the minimum sample size. This technique was applied to a field-scale set of environmental covariates for a well-sampled agricultural study site near Guelph, Ontario, Canada, and tested across a range of QRs. The results showed that the minimum sample size increased with the QR selected: it rose from 44 to 83 as the QR increased from 50% to 95%, and then increased exponentially to 194 for the 99% QR. This technique provides an estimate of the minimum sample size that can be used as an input to the cLHS algorithm.
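The paper's workflow is in R; as a language-agnostic sketch of its two building blocks, the Python snippet below computes the Freedman-Diaconis bin width and the histogram densities at the bounds of a chosen quantile range. The final scaling from those densities to a minimum sample size is not reproduced, and the function names and toy covariate are hypothetical.

```python
import numpy as np

def fd_bins(x):
    """Freedman-Diaconis rule: bin width = 2 * IQR * n^(-1/3); returns the
    width and the implied number of bins for one covariate."""
    x = np.asarray(x, float)
    q75, q25 = np.percentile(x, [75, 25])
    width = 2.0 * (q75 - q25) * len(x) ** (-1.0 / 3.0)
    n_bins = int(np.ceil((x.max() - x.min()) / width))
    return width, n_bins

def density_at_quantiles(x, qr=0.95):
    """Histogram densities at the lower/upper bounds of a central quantile
    range (QR); the paper scales minimum sample size with these densities,
    but that scaling step is not reproduced here."""
    lo, hi = np.percentile(x, [50 * (1 - qr), 100 - 50 * (1 - qr)])
    _, n_bins = fd_bins(x)
    dens, edges = np.histogram(x, bins=n_bins, density=True)
    idx = np.clip(np.searchsorted(edges, [lo, hi]) - 1, 0, len(dens) - 1)
    return dens[idx[0]], dens[idx[1]]

# Toy covariate (e.g. a terrain attribute); numbers are illustrative only.
rng = np.random.default_rng(5)
cov = rng.gamma(3.0, 2.0, size=5000)
print(fd_bins(cov))
print(density_at_quantiles(cov, qr=0.95))
```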
Species-area relationships (SARs), also known as species-area curves, are fundamental scaling tools for biodiversity research. Sampling design and taxonomic group affect the widely cited forms of species-area curves; however, the influence of sampling design and the associated environmental heterogeneity on SAR curves is rarely considered. Here, we investigated the SARs of different plant life forms (herbaceous plants, shrubs, and trees) in a 25.2-ha ForestGEO plot, the Wanglang Plot, in Sichuan, southwestern China, using a non-contiguous quadrat sampling method and a power-law model. We compared the estimated parameters (the intercept c and the slope z) of the power-law models among plant life forms, tested whether the SAR curve forms varied with the sampling starting location, and assessed the effect of environmental heterogeneity accumulating with sampling area on curve variation. We found a wide range of variation in the SARs: the estimated c and z values of the power-law SAR were higher, and varied more widely, for the herbaceous plants than for the woody plants. The selection of the sampling starting location affected the SAR curve forms because of the roles of soil and topographic heterogeneity. We conclude that environmental heterogeneity regulates SAR curves sampled from different starting locations through the spatial distribution of plant life forms. We therefore recommend considering the sampling starting location when constructing SAR curves, especially in heterogeneous habitats where species are non-randomly distributed.
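The power-law SAR, S = c·A^z, is conventionally fitted by least squares on log-transformed data, which is presumably how c and z were estimated here; the quadrat areas and species counts below are invented for illustration.

```python
import numpy as np

def fit_power_sar(area, richness):
    """Fit the power-law species-area model S = c * A^z by ordinary least
    squares on log10-transformed data; returns (c, z)."""
    logA, logS = np.log10(area), np.log10(richness)
    z, logc = np.polyfit(logA, logS, 1)
    return 10 ** logc, z

# Illustrative quadrat data (areas in m^2, species counts invented).
area = np.array([25, 100, 400, 1600, 6400, 25600])
richness = np.array([8, 13, 21, 30, 44, 61])
c, z = fit_power_sar(area, richness)
print(f"S = {c:.2f} * A^{z:.3f}")
```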
In July 1987, the Sampling Survey of Children's Situation was conducted in nine provinces and autonomous regions of China. A stratified two-stage cluster sampling plan was designed for the survey. This paper presents the methods of stratification, of selecting n = 2 PSUs (cities/counties) with unequal probabilities without replacement in each stratum, and of selecting residents'/village committees in each sampled city/county. All formulae for estimating population characteristics (especially population totals and ratios of two totals) and for estimating the variances of those estimators are given. Finally, we give a preliminary analysis of the precision of the survey based on the results of data processing.
Monodisperse aerosols are essential in many applications, such as filter testing, aerosol instrument calibration, and experiments for validating models. This paper describes the design principle, construction, and performance of a monodisperse-aerosol generation system that comprises an atomizer, a virtual impactor, a microcontroller-based isokinetic probe, a wind tunnel, and a velocity measurement device. The size distribution of the produced monodisperse aerosols was determined with an optical particle counter. The effects of atomizer characteristics, the minor and major flow rates, and the solution criteria were investigated, and all of these parameters were found to affect the generation of monodisperse aerosol. The expected geometric standard deviation (<1.25) of the monodisperse aerosol particles was obtained with the most suitable atomizer using a 10% oleic acid in ethyl alcohol solution with 5%-15% minor flow, where the ratio of the nozzle-to-probe distance to the acceleration-nozzle-exit diameter was 0.66. The constructed monodisperse-aerosol generation system can be used for instrument calibration and aerosol research.
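The monodispersity criterion is the geometric standard deviation of the measured size distribution. As a brief illustration of how that figure is computed from optical-particle-counter channel data, here is a count-weighted sketch; the channel diameters and counts are invented.

```python
import numpy as np

def geometric_stats(diameters, counts=None):
    """Count-weighted geometric mean diameter and geometric standard deviation
    (GSD) of a particle size distribution; a GSD below about 1.25 is the usual
    working criterion for a monodisperse aerosol."""
    d = np.asarray(diameters, float)
    w = np.ones_like(d) if counts is None else np.asarray(counts, float)
    ln_d = np.log(d)
    ln_dg = np.average(ln_d, weights=w)
    ln_sg = np.sqrt(np.average((ln_d - ln_dg) ** 2, weights=w))
    return np.exp(ln_dg), np.exp(ln_sg)

# Illustrative OPC channel data (midpoint diameters in um; counts invented).
d_mid = np.array([0.8, 0.9, 1.0, 1.1, 1.2])
counts = np.array([120, 850, 2400, 700, 90])
dg, gsd = geometric_stats(d_mid, counts)
print(f"geometric mean = {dg:.2f} um, GSD = {gsd:.3f}")
```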
Rational design of the sample cell may greatly improve the sensitivity of surface-enhanced Raman scattering (SERS) detection. Finite-difference time-domain (FDTD) simulations of an Ag film-Ag particle configuration illuminated by a plane wave and by an evanescent wave were performed to provide physical insight for the design of the sample cell. The numerical solutions indicate that the sample cell can provide more "hot spots" and that massive field-intensity enhancement occurs in these "hot spots". More information on the nanometre-scale character of the sample can be obtained because of the gradient-field Raman (GFR) effect of the evanescent wave. OCIS codes: 290.5860, 240.0310, 240.6680, 999.9999 (surface-enhanced Raman scattering).
Adaptive cluster sampling (ACS) has been widely used for data collection on the environment and natural resources; however, the randomness of its final sample size often impedes its use. To control the final sample size, a k-step ACS based on the Horvitz-Thompson (HT) estimator was developed in this study, and an unbiased estimator was derived. The k-step ACS-HT was assessed first with a simulated example and then with a real survey of plant counts for three species characterized by clustered and patchy spatial distributions. The effectiveness of the design was assessed by comparing the k-step ACS-HT estimator with the ACS Hansen-Hurwitz (ACS-HH) and ACS-HT estimators, and the effectiveness of different k-step sizes was also compared. The results showed that the k-step ACS-HT estimator was the most effective and ACS-HH the least. Moreover, stable estimates of the sample mean and variance could be obtained after a certain number of steps, and, depending on the plant species, k-step ACS without replacement was slightly more effective than k-step ACS with replacement. In k-step ACS, the variance estimate of one-step ACS is much larger than that of other k-step ACS (k > 1) but smaller than that of ACS. This implies that k-step ACS is more effective than traditional ACS and, in addition, that the final sample size can be controlled easily in populations with large clusters.
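The HT estimator the k-step design builds on is the standard modified Horvitz-Thompson estimator for ACS with an initial simple random sample; the k-step extension itself is not reproduced here, and the network totals and sizes in the example are invented.

```python
from math import comb
import numpy as np

def acs_ht_mean(network_totals, network_sizes, N, n):
    """Horvitz-Thompson estimator of the population mean for adaptive cluster
    sampling with an initial SRS (without replacement) of n units from N:
    each distinct network k intersected by the initial sample contributes its
    total y_k weighted by its inclusion probability
        pi_k = 1 - C(N - x_k, n) / C(N, n),
    where x_k is the number of units in network k."""
    y = np.asarray(network_totals, float)
    x = np.asarray(network_sizes, int)
    pi = np.array([1.0 - comb(N - xk, n) / comb(N, n) for xk in x])
    return float(np.sum(y / pi) / N)

# Toy example: a 400-unit frame, initial SRS of n = 20, and three distinct
# networks intersected (totals and sizes invented).
print(acs_ht_mean(network_totals=[35, 12, 4], network_sizes=[6, 3, 1], N=400, n=20))
```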
Recent noteworthy developments in the field of two-dimensional (2D) correlation spectroscopy are reviewed. 2D correlation spectroscopy has become a very popular tool because of its versatility and relative ease of use. The technique applies a spectroscopic or other analytical probe, chosen from a number of options, to a broad range of sample systems, employing different types of external perturbations to induce systematic variations in spectral intensities. These intensity variations are then converted into 2D spectra by a form of correlation analysis for subsequent interpretation. Many different types of 2D correlation approaches have been proposed. In particular, 2D hetero-correlation and multiple-perturbation correlation analyses, including the orthogonal sample design scheme, are discussed in this review. Additional references to other important developments in the field of 2D correlation spectroscopy, such as projection correlation and codistribution analysis, are also provided.
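The "form of correlation analysis" referred to is, in the standard (Noda) formulation, a pair of synchronous and asynchronous maps computed from mean-centred dynamic spectra; the hetero-correlation and multiple-perturbation variants discussed in the review are not shown. A minimal homo-correlation sketch, with an invented two-band test signal:

```python
import numpy as np

def two_d_correlation(spectra):
    """Generalized 2D correlation (Noda): `spectra` is an (m x n) array of m
    perturbation-ordered spectra over n channels. Returns the synchronous and
    asynchronous maps computed from the mean-centred dynamic spectra."""
    Y = spectra - spectra.mean(axis=0)          # dynamic spectra
    m = Y.shape[0]
    sync = Y.T @ Y / (m - 1)
    # Hilbert-Noda transformation matrix: N[j, k] = 1 / (pi * (k - j)), zero diagonal.
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    asyn = Y.T @ noda @ Y / (m - 1)
    return sync, asyn

# Toy usage: 11 spectra of 200 channels with two bands responding differently
# to the perturbation (all numbers illustrative).
t = np.linspace(0, 1, 11)[:, None]
x = np.linspace(0, 1, 200)[None, :]
band = lambda c, w: np.exp(-((x - c) / w) ** 2)
spectra = t * band(0.3, 0.05) + (t ** 2) * band(0.7, 0.05)
sync, asyn = two_d_correlation(spectra)
print(sync.shape, asyn.shape)
```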
In an interception engagement, if the target movement information is not accurate enough for the mid-course guidance of the intercepting missiles, the interception mission may fail because of large handover errors. This paper proposes a novel cooperative mid-course guidance scheme for multiple missiles intercepting a target under large detection errors. Under this scheme, the launch and interception moments are staggered across the missiles: the earlier-launched missiles obtain a relatively accurate detection of the target during their terminal guidance, on the basis of which the later missiles can eliminate the handover error in their mid-course guidance. A significant merit of this scheme is that the available resources are fully exploited and fewer missiles are needed to accomplish the interception mission. To this end, the design of the cooperative handover parameters is first formulated as an optimization problem. An algorithm based on Monte Carlo sampling and stochastic approximation is then proposed to solve this optimization problem, and the convergence of the algorithm is proved. Finally, simulation experiments are carried out to validate the effectiveness of the proposed cooperative scheme and algorithm.
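As a generic illustration of the "Monte Carlo sampling plus stochastic approximation" idea, the sketch below uses simultaneous-perturbation stochastic approximation (SPSA) to minimise a noisy objective; this is not the paper's specific handover-parameter algorithm, and the toy cost function is invented.

```python
import numpy as np

def spsa_minimize(loss, x0, n_iter=500, a=0.2, c=0.1, seed=0):
    """Simultaneous-perturbation stochastic approximation: minimise a loss
    that can only be evaluated noisily (e.g. via Monte Carlo simulation),
    using two evaluations per iteration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                      # standard SPSA gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        ghat = (loss(x + ck * delta) - loss(x - ck * delta)) / (2.0 * ck * delta)
        x = x - ak * ghat
    return x

# Toy noisy objective standing in for a simulated handover-error cost.
rng = np.random.default_rng(1)
noisy_cost = lambda p: float(np.sum((p - np.array([2.0, -1.0])) ** 2)
                             + 0.01 * rng.normal())
print(spsa_minimize(noisy_cost, x0=[0.0, 0.0]))   # should approach [2, -1]
```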