The longitudinal dispersion of the projectile in shooting tests of two-dimensional trajectory correction fuses with fixed canards is so large that it sometimes exceeds the correction ability of the fuse actuator. The impact point easily deviates from the target, and thus the correction result cannot be readily evaluated. However, the cost of shooting tests is too high to conduct many tests for data collection. To address this issue, this study proposes an aiming method for shooting tests based on a small sample size. The proposed method uses the Bootstrap method to expand the test data; repeatedly iterates and corrects the position of the simulated theoretical impact points through an improved compatibility test method; and dynamically adjusts the weight of the prior distribution of simulation results based on Kullback-Leibler divergence, which to some extent prevents the real data from being "submerged" by the simulation data and achieves a fused Bayesian estimation of the dispersion center. The experimental results show that when the simulation accuracy is sufficiently high, the proposed method yields a smaller mean-square deviation in estimating the dispersion center and higher shooting accuracy than the three comparison methods. It is thus better suited to reflecting the effect of the control algorithm and helps test personnel iterate their proposed structures and algorithms. In addition, this study provides a knowledge base for further comprehensive studies in the future.
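The Bootstrap expansion step can be illustrated with a minimal sketch; the 1-D impact deviations below are illustrative values, not data from the study:

```python
import random
import statistics

def bootstrap_center(points, n_resamples=1000, seed=42):
    """Estimate the dispersion center of a small sample of impact points
    by bootstrap resampling: draw with replacement, record each
    resample's mean, and report the mean and spread of those centers."""
    rng = random.Random(seed)
    centers = []
    for _ in range(n_resamples):
        resample = [rng.choice(points) for _ in points]
        centers.append(statistics.fmean(resample))
    return statistics.fmean(centers), statistics.stdev(centers)

# Hypothetical longitudinal impact deviations (m) from a small test series
impacts = [12.1, -3.4, 5.0, 8.2, -1.1, 4.6, 7.9]
center, spread = bootstrap_center(impacts)
```

The spread of the bootstrap centers gives a sampling-error estimate that a fusion scheme can weigh against the simulated prior.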
Sample size determination typically relies on a power analysis based on a frequentist conditional approach. The latter can be seen as a particular case of the two-priors approach, which allows one to build four distinct power functions to select the optimal sample size. We revise this approach when the focus is on testing a single binomial proportion. We consider exact methods and introduce a conservative criterion to account for the typically non-monotonic behavior of the power functions when dealing with discrete data. The main purpose of this paper is to present a Shiny App providing a user-friendly, interactive tool to apply these criteria. The app also provides specific tools to elicit the analysis and design prior distributions, which are the core of the two-priors approach.
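The exact power computation that such criteria build on can be sketched as follows for a one-sided test of a single binomial proportion; this shows only the exact power, not the paper's conservative selection criterion:

```python
from math import comb

def exact_power(n, p0, p1, alpha=0.05):
    """Exact power of a one-sided binomial test of H0: p = p0 vs H1: p > p0.
    Find the smallest critical value k with P(X >= k | p0) <= alpha,
    then return P(X >= k | p1). With discrete data this power is
    non-monotonic in n, which motivates conservative criteria."""
    def sf(k, n, p):  # survival function P(X >= k)
        return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k, n + 1))
    k = next(k for k in range(n + 1) if sf(k, n, p0) <= alpha)
    return sf(k, n, p1)

p20 = exact_power(20, 0.3, 0.5)
p100 = exact_power(100, 0.3, 0.5)
```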
Precise and accurate knowledge of genetic parameters is a prerequisite for making efficient selection strategies in breeding programs. A number of heritability estimates for important economic traits in many marine mollusks are available in the literature; however, very few studies have evaluated the accuracy of genetic parameters estimated with different family structures. Thus, in the present study, the effect of parent sample size on the precision of genetic-parameter estimates for four growth traits in the clam M. meretrix under factorial designs was analyzed through restricted maximum likelihood (REML) and Bayesian inference. The results showed that the average estimated heritabilities of growth traits obtained from REML were 0.23-0.32 for 9 and 16 full-sib families and 0.19-0.22 for 25 full-sib families. With Bayesian inference, the average estimated heritabilities were 0.11-0.12 for 9 and 16 full-sib families and 0.13-0.16 for 25 full-sib families. Compared with REML, Bayesian inference yielded lower heritabilities, but they still remained at a medium level. When the number of parents increased from 6 to 10, the estimated heritabilities moved closer to 0.20 in REML and 0.12 in Bayesian inference. Genetic correlations among traits were positive and high and showed no significant difference between designs of different sizes. The estimated breeding values from the 9 and 16 families were less precise than those from 25 families. Our results provide a basic genetic evaluation for growth traits and should be useful for the design and operation of a practical selective breeding program in the clam M. meretrix.
Sample size can be a key design feature that not only affects the probability of a trial's success but also determines the duration and feasibility of a trial. If an investigational drug is expected to be effective and to address unmet medical needs of an orphan disease, where the accrual period may require many years with a large sample size to detect a minimal clinically relevant treatment effect, a minimum sample size may be set to maintain nominal power. In limited situations such as this, there may be a need for flexibility in the initial and final sample sizes; thus, it is useful to consider the utility of adaptive sample size designs that use sample size re-estimation or a group sequential design. In this paper, we propose a new adaptive performance measure to assess the utility of an adaptive sample size design in a trial simulation. Considering that previously proposed sample size re-estimation methods do not take into account errors in estimation based on interim results, we propose Bayesian sample size re-estimation criteria that take into account prior information on the treatment effect, and we then assess their operating characteristics in a simulation study. We also present a review example of sample size re-estimation based mainly on a published paper and a review report from the Pharmaceuticals and Medical Devices Agency (PMDA).
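The weakness the abstract points at can be seen in the naive plug-in form of re-estimation, sketched below under illustrative assumptions (normal endpoint, per-group fixed-design formula); the interim estimate is treated as if known exactly, which is what the proposed Bayesian criteria are meant to correct:

```python
from statistics import NormalDist

def reestimated_n(n_planned, delta_hat, sigma, alpha=0.025, power=0.9):
    """Naive sample size re-estimation: plug the interim effect estimate
    delta_hat into the fixed-design per-group formula
    n = 2 * ((z_alpha + z_beta) * sigma / delta)**2.
    Estimation error in delta_hat is ignored entirely."""
    z = NormalDist()
    z_a, z_b = z.inv_cdf(1 - alpha), z.inv_cdf(power)
    n_new = 2 * ((z_a + z_b) * sigma / delta_hat) ** 2
    return max(n_planned, n_new)   # adjust only upward from the planned size

n = reestimated_n(50, delta_hat=0.5, sigma=1.0)
```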
The development of a core collection could enhance the utilization of germplasm collections in crop improvement programs and simplify their management. Selection of an appropriate sampling strategy is an important prerequisite for constructing a core collection of appropriate size that adequately represents the genetic spectrum and maximally captures the genetic diversity of the available crop collections. The present study was initiated to construct nested core collections to determine the appropriate sample size to represent the genetic diversity of a rice landrace collection, based on 15 quantitative traits and 34 qualitative traits of 2,262 rice accessions. The results showed that nested core collections of 50-225 accessions, corresponding to sampling rates of 2.2%-9.9%, were sufficient to maintain the maximum genetic diversity of the initial collection. Of these, 150 accessions (6.6%) could capture the maximal genetic diversity of the initial collection. Three data types, i.e., qualitative traits (QT1), quantitative traits (QT2), and integrated qualitative and quantitative traits (QTT), were compared for their efficiency in constructing core collections based on the weighted pair-group average method combined with stepwise clustering and preferred sampling on adjusted Euclidean distances. Every combining scheme constructed eight rice core collections (225, 200, 175, 150, 125, 100, 75 and 50 accessions). The results showed that the QTT data were the best for constructing a core collection, as indicated by the genetic diversity of the resulting collections. A core collection constructed only on the information of QT1 could not represent the initial collection effectively; QTT should therefore be used to construct a productive core collection.
To investigate the effect of sample size on the dynamic torsional behaviour of the 2A12 aluminium alloy, torsional split Hopkinson bar tests were conducted on this alloy with different sample dimensions. It is found that the measured yield strength increases with decreasing gauge length and thickness. However, the sample inner/outer diameter has little effect on the dynamic torsional behaviour. Based on the finite element method, the stress states in the alloy with different sample sizes are analysed. Owing to the effect of the stress concentration zone (SCZ), a shorter sample has a higher yield stress. Furthermore, the stress is distributed more uniformly in a thinner sample, which leads to a higher measured yield stress. Based on the experimental and simulation analyses, some suggestions on choosing the sample size are given as well.
Reliability assessment of the braking system in a high-speed train under small sample size and zero-failure data is very important for safe operation. Traditional reliability assessment methods perform well only under conditions of large sample size and complete failure data, which leads to large deviations under conditions of small sample size and zero-failure data. To improve on this, a new Bayesian method is proposed. Based on the characteristics of the solenoid valve in the braking system of a high-speed train, the modified Weibull distribution is selected to describe the failure rate over the entire lifetime. Based on the assumption of a binomial distribution for the failure probability at the censored time, a concave method is employed to obtain the relationships between accumulated failure probabilities. A numerical simulation is performed to compare the results of the proposed method with those obtained from maximum likelihood estimation, and to illustrate that the proposed Bayesian model exhibits better accuracy for the expectation value when the sample size is less than 12. Finally, the robustness of the model is demonstrated by obtaining the reliability indicators for a numerical case involving the solenoid valve of the braking system, which shows that the change in the reliability and failure rate among the different hyperparameters is small. The method avoids being misled by subjective information and improves the accuracy of reliability assessment under conditions of small sample size and zero-failure data.
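Why a Bayesian treatment helps under zero-failure data can be seen in a much simpler setting than the paper's modified-Weibull model: a beta-binomial estimate of failure probability. A uniform Beta(1, 1) prior is assumed here purely for illustration:

```python
def beta_posterior_mean(failures, n, a=1.0, b=1.0):
    """Posterior mean of the failure probability under a Beta(a, b) prior
    and a binomial likelihood. With zero observed failures the estimate
    stays positive, unlike the maximum-likelihood estimate 0/n."""
    return (a + failures) / (a + b + n)

# Zero failures in 12 censored tests: the MLE is 0, the Bayes estimate is not
p_mle = 0 / 12
p_bayes = beta_posterior_mean(0, 12)
```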
Accurate prediction of the internal corrosion rates of oil and gas pipelines could be an effective way to prevent pipeline leaks. In this study, a framework for predicting corrosion rates from a small sample of laboratory metal-corrosion data was developed to provide a new perspective on solving the problem of pipeline corrosion when real samples are insufficient. The approach employs the bagging algorithm to construct a strong learner by integrating several KNN learners. A total of 99 data points were collected and split into training and test sets at a 9:1 ratio. The training set was used to obtain the best hyperparameters by 10-fold cross-validation and grid search, and the test set was used to determine the performance of the model. The results showed that the Mean Absolute Error (MAE) of this framework is 28.06% of that of the traditional model, and that it outperforms other ensemble methods. Therefore, the proposed framework is suitable for metal corrosion prediction under small-sample conditions.
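The bagging-of-KNN idea can be sketched in pure Python; the 1-D toy dataset and parameter values below are illustrative assumptions, not the study's 99-sample dataset or its tuned hyperparameters:

```python
import random

def knn_predict(train, x, k=3):
    """Plain k-nearest-neighbour regression on (feature, target) pairs:
    average the targets of the k training points closest to x."""
    nearest = sorted(train, key=lambda t: abs(t[0] - x))[:k]
    return sum(y for _, y in nearest) / len(nearest)

def bagged_knn_predict(train, x, k=3, n_estimators=25, seed=0):
    """Bagging: fit each KNN learner on a bootstrap resample of the
    training data and average the predictions, which reduces variance
    under small-sample conditions."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_estimators):
        sample = [rng.choice(train) for _ in train]
        preds.append(knn_predict(sample, x, k))
    return sum(preds) / len(preds)

# Hypothetical (feature, corrosion-rate) pairs
data = [(1.0, 0.10), (2.0, 0.12), (3.0, 0.20), (4.0, 0.24), (5.0, 0.31)]
estimate = bagged_knn_predict(data, 3.5)
```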
The hadal zone (ocean depths of 6-11 km) is one of the least-understood habitats on Earth because of its extreme conditions, such as high pressure, darkness, and low temperature. With the development of deep-sea vehicles such as China's 7,000 m manned submersible Jiaolong, abyssal science has received greater attention. For decades, gravity-piston corers have been widely used to collect long cores of loose subsea sediment. However, the weight and length of the gravity sampler cables and the operating environment limit sampling capacity at full ocean depths. Therefore, a new self-floating sediment sampler with a spring-loaded auto-trigger release, incorporating characteristics of traditional gravity-driven samplers, is designed. This study analyzes the process by which a gravity-piston corer penetrates the sediment and the factors that affect it. A formula for the penetration depth is deduced. A method of optimizing the sampling depth is then developed based on structural design and parametric factor modeling. The parameters considered in the modeling include the sampling depth, balance weight, ultimate-stress friction coefficient, dimensions of the sampler, and material properties. Thus, a new deep-sea floating parametric sampler designed through virtual prototyping is proposed. Accurate values for all the design factors are derived from calculations based on the conservation of energy with penetration depth, analyses of the factors affecting the penetration depth, and analyses of pressure-bar stability. Finally, experimental data are used to verify the penetration-depth function and to provide theoretical guidance for the design of sediment samplers.
This paper investigates the tolerable sample size needed for the Ordinary Least Squares (OLS) estimator to be used in the presence of multicollinearity among the exogenous variables of a linear regression model. A regression model with a constant term (β0) and two independent variables (with β1 and β2 as their respective regression coefficients) that exhibit multicollinearity was considered. A Monte Carlo study of 1,000 trials was conducted at eight levels of multicollinearity (0, 0.25, 0.5, 0.7, 0.75, 0.8, 0.9 and 0.99) and eight sample sizes (10, 20, 40, 80, 100, 150, 250 and 500). At each specification, the true regression coefficients were set at unity, while 1.5, 2.0 and 2.5 were taken as the hypothesized values. The power rate was obtained at every multicollinearity level for the aforementioned sample sizes. The results show that, whether or not the hypothesized values depart greatly from the true values, once the multicollinearity level is very high (i.e., 0.99), the sample size needed for reliable estimation and inference must be greater than five hundred.
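The mechanism behind the "greater than five hundred" finding is the inflation of coefficient variance under collinearity. The paper's full Monte Carlo design is not reproduced here; the standard variance inflation factor alone shows the scale of the problem:

```python
def vif(r):
    """Variance inflation factor for a regressor whose correlation with
    the other regressor is r: the factor by which the coefficient's
    sampling variance is inflated, so the sample size needed for a
    given precision grows roughly in proportion."""
    return 1.0 / (1.0 - r * r)

# At the paper's highest level (r = 0.99) the variance is inflated
# roughly fifty-fold, which is why very large n is required.
inflation = vif(0.99)
```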
The size effects of the microstructure of lattice materials on structural analysis and minimum weight design are studied with the extended multiscale finite element method (EMsFEM). With the same volume of base material and the same configuration, the structural displacement and maximum axial stress of the micro-rods of lattice structures with different microstructure sizes are analyzed and compared. It is pointed out that, unlike the traditional mathematical homogenization method, EMsFEM is suitable for analyzing structures that are made of lattice materials and composed of large numbers of finite-sized micro-rods. The minimum weight design of structures composed of lattice material is studied with the downscaling calculation of EMsFEM under stress constraints on the micro-rods. The optimal design results show that the weight of the structure increases as the size of the basic sub-unit cells decreases. The paper presents a new approach for the analysis and optimization of lattice materials in complex engineering constructions.
This study used an Ecopath model of Jiaozhou Bay as an example to evaluate the effect of the stomach sample size of three fish species on the projections of this model. The derived ecosystem indices were classified into three categories: (1) direct indices, like the trophic level of a species, influenced by stomach sample size directly; (2) indirect indices, like the ecotrophic efficiency (EE) of invertebrates, influenced by the multiple prey-predator relationships; and (3) systemic indices, like total system throughput (TST), describing the status of the whole ecosystem. The influences of different stomach sample sizes on these indices were evaluated. The results suggest that the systemic indices of the ecosystem model were robust to stomach sample size, whereas species-specific indices showed low accuracy and precision when stomach samples were insufficient. The indices became more uncertain when the stomach sample sizes varied for more species. This study enhances the understanding of how the quality of diet-composition data influences ecosystem modeling outputs. The results can also guide the design of stomach content analysis for developing ecosystem models.
The preparation of samples containing bovine serum albumin (BSA), e.g., as used in transdermal Franz diffusion cell (FDC) solutions, was evaluated using an analytical quality-by-design (QbD) approach. Traditional precipitation of BSA by adding an equal volume of organic solvent, often used successfully with conventional HPLC-PDA, was found insufficiently robust when novel fused-core HPLC and/or UPLC-MS methods were used. In this study, three factors (acetonitrile (%), formic acid (%), and boiling time (min)) were included in the experimental design to determine an optimal and more suitable sample treatment for BSA-containing FDC solutions. Using a QbD and Derringer desirability (D) approach combining BSA loss, dilution factor, and variability, we constructed an optimal working space with the edge of failure defined as D < 0.9. The design space is modelled and confirmed to have an ACN range of 83 ± 3% and an FA content of 1 ± 0.25%.
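The Derringer desirability combination used to define such a working space can be sketched as follows; the response names and acceptance ranges are illustrative placeholders, not the study's actual limits:

```python
def desirability_smaller_is_better(y, y_min, y_max, s=1.0):
    """One-sided Derringer desirability for a smaller-is-better response:
    d = 1 at or below y_min, 0 at or above y_max, power-law in between."""
    if y <= y_min:
        return 1.0
    if y >= y_max:
        return 0.0
    return ((y_max - y) / (y_max - y_min)) ** s

def overall_desirability(ds):
    """Geometric mean of individual desirabilities; any d = 0 zeroes D,
    so a single failed response rejects the operating point."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses: BSA loss (%) and variability (%RSD)
D = overall_desirability([
    desirability_smaller_is_better(2.0, 0.0, 10.0),   # BSA loss
    desirability_smaller_is_better(1.5, 0.0, 5.0),    # variability
])
```

An edge of failure like the study's D < 0.9 is then just a threshold on this combined score.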
Objective To develop methods for determining a suitable sample size for bioequivalence assessment of generic topical ophthalmic drugs using a crossover design with serial sampling schemes. Methods The power functions of the Fieller-type confidence interval and the asymptotic confidence interval in crossover designs with serial-sampling data are derived. Simulation studies were conducted to evaluate the derived power functions. Results Simulation studies show that the two power functions can provide precise power estimates when normality assumptions are satisfied and yield conservative estimates of power when data are log-normally distributed. The intra-correlation showed a positive correlation with the power of the bioequivalence test. When the expected ratio of the AUCs was less than or equal to 1, the power of the Fieller-type confidence interval was larger than that of the asymptotic confidence interval. If the expected ratio of the AUCs was larger than 1, the asymptotic confidence interval had greater power. Sample size can be calculated through numerical iteration with the derived power functions. Conclusion The Fieller-type power function and the asymptotic power function can be used to determine sample sizes of crossover trials for bioequivalence assessment of topical ophthalmic drugs.
In this study, a design of experiments (DoE) approach was used to develop a PLA open-cell foam morphology using the compression molding technique. The effect of three molding parameters (foaming time, mold opening temperature, and weight concentration of the ADA blowing agent) on the cellular structure was investigated. A regression equation relating the average cell size to the above three processing parameters was developed from the DoE, and analysis of variance (ANOVA) was used to find the best dimensional fitting parameters based on the experimental data. With the help of the DoE technique, we were able to develop various foam morphologies having different average cell size distribution levels, which is important in the development of open-cell PLA scaffolds for bone regeneration, for which the control of cell morphology is crucial for osteoblast proliferation. For example, at a constant ADA weight concentration of 5.95 wt%, we were able to develop a narrow average cell size distribution ranging between 275 and 300 μm by varying the mold opening temperature between 106°C and 112°C while maintaining the foaming time constant at 8 min, or by varying the mold foaming time between 6 and 11 min while maintaining the mold opening temperature at 109°C.
After completing 102 replicate constant-amplitude crack initiation and growth tests on LY12-CZ aluminum alloy plate, a statistical investigation of the fatigue crack initiation and growth process is conducted in this paper. From post-mortem fractographic examination by scanning electron microscopy (SEM), some qualitative observations of the spatial correlation among fatigue striations are developed to reveal the statistical nature of the material's intrinsic inhomogeneity during the crack growth process. From the test data, an engineering division between crack initiation and growth is defined as the upper limit of the small crack. The distributions of the crack initiation life N_i and growth life N, and the statistical characteristics of the crack growth rate da/dN, are also investigated. It is hoped that this work will provide a solid experimental basis for the study of probabilistic fatigue, probabilistic fracture mechanics, fatigue reliability, and their engineering applications.
In the 6th Chinese Space Trajectory Design Competition, held in 2014, a near-Earth asteroid sample-return trajectory design problem was released, in which the motion of the spacecraft is modeled in multi-body dynamics, considering the gravitational forces of the Sun, Earth, and Moon. An electric-propulsion spacecraft initially parked in a circular 200-km-altitude low Earth orbit is expected to rendezvous with an asteroid and carry as much sample as possible back to Earth within a 10-year time frame. The team from the Technology and Engineering Center for Space Utilization, Chinese Academy of Sciences reported a solution with an asteroid sample mass of 328 tons, which was ranked first in the competition. In this article, we present our design and optimization methods, primarily including the overall analysis, target selection, escape from and capture by the Earth-Moon system, and optimization of impulsive and low-thrust trajectories modeled in multi-body dynamics. The orbital resonance concept and lunar gravity assists are the key techniques employed for trajectory design. The reported solution, preliminarily revealing the feasibility of returning a hundreds-of-tons asteroid or asteroid sample, envisions future space missions relating to near-Earth asteroid exploration.
Sample size is very important in statistical research: it should be neither too small nor too large. Given a significance level α, the sample size is calculated from the z-value and a pre-defined error. Such an error is defined based on a previous experiment or another study, or it can be determined subjectively by a specialist, which may cause incorrect estimation. Therefore, this research proposes an objective method to estimate the sample size without pre-defining the error. Given an available sample X = {X1, X2, ..., Xn}, the error is calculated via an iterative process in which the sample X is re-sampled many times. Moreover, after the sample size is estimated, it can be used to collect a new sample in order to estimate a new sample size, and so on.
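The idea of replacing the subjectively pre-defined error with one estimated from the sample itself by resampling can be sketched as follows. This is a rough illustration of the resampling idea under assumed details (bootstrap standard error of the mean plugged into the classical formula), not the paper's exact iterative procedure:

```python
import random
import statistics

def estimate_sample_size(sample, z=1.96, n_resamples=500, seed=1):
    """Estimate the error term by re-sampling the available sample many
    times (bootstrap spread of the resample means), then plug it into
    the classical formula n = (z * s / e)**2 instead of fixing e
    subjectively in advance."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        resample = [rng.choice(sample) for _ in sample]
        means.append(statistics.fmean(resample))
    e = statistics.stdev(means)      # resampling-based error estimate
    s = statistics.stdev(sample)     # sample standard deviation
    return (z * s / e) ** 2

n_est = estimate_sample_size([4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4])
```

The estimated size could then drive collection of a new sample, repeating the cycle the abstract describes.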
Sample size justification is a crucial part of the design of clinical trials. In this paper, the authors derive a new formula to calculate the sample size for a binary outcome given one of the three popular indices of risk difference. The sample size based on the absolute difference is the fundamental one, from which the sample size given the risk ratio or odds ratio (OR) can easily be derived.
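For the absolute-difference case, the classical normal-approximation formula can be sketched as follows; this is the standard textbook formula, not necessarily the refinement derived in the paper:

```python
from math import sqrt

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Per-group sample size to detect an absolute risk difference
    p1 - p2 between two binomial proportions, using the normal
    approximation. z_beta = 0.84 gives about 80% power at a
    two-sided 5% significance level."""
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

n = n_per_group(0.6, 0.4)   # absolute difference of 0.2
```

A risk ratio or OR target can be converted to the implied proportions and fed through the same absolute-difference formula, mirroring the derivation order described above.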
The size and form of sampling units (SUs) have always been variables considered in planning and structuring forest inventories, whether performed in natural forests or in plantations. The experimental work outlined to address this problem was conducted in an area of araucaria forest in Sao Joao de Triunfo, PR, Brazil. Circular, square, and rectangular sampling-unit forms were evaluated, with areas ranging from 200 m2 to 1000 m2. Time was recorded using a stopwatch and computed separately for locomotion and measurement. A power model was used to fit the relations of locomotion and measurement times, and a hyperbolic one for the coefficient of variation, all as functions of SU size. To achieve an analytical solution for the optimum SU size, the behavior of the three functions was simulated up to a size of 10,000 m2. Taking the derivative of the combined function gave the optimum point, which allowed the SU size to be optimized at 600 m2 for the structured experimental conditions. This result confirmed the formulated hypothesis, while also providing a critical analysis of the inclusion of other relevant variables, such as the size of the area to be inventoried, the number of SUs completed in one day of work, the average distance between SUs, and the average speeds for moving between SUs and for taking all the respective measurements.
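The combined-function optimization can be sketched numerically with the same functional forms (power models for times, a hyperbola for the coefficient of variation). The coefficients below are illustrative placeholders, not the paper's fitted values, so the minimizer differs from the 600 m2 obtained in the study:

```python
def total_time(a):
    """Power models for locomotion and measurement time as functions of
    sampling-unit area a (m2); coefficients are illustrative only."""
    return 20.0 * a ** 0.1 + 0.05 * a ** 1.0

def cv(a):
    """Hyperbolic model for the coefficient of variation vs. unit area."""
    return 2.0 + 120.0 / a

def optimal_unit_size(sizes):
    """Pick the unit size minimising variance x cost, i.e. cv**2 * time,
    the usual criterion behind the derivative-based optimum."""
    return min(sizes, key=lambda a: cv(a) ** 2 * total_time(a))

# Grid search over the same 10,000 m2 range simulated in the study
best = optimal_unit_size(range(100, 10001, 100))
```

Small units are penalised by the exploding CV, large units by measurement cost, so the product has an interior minimum.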
Funding for the trajectory-correction aiming study: the National Natural Science Foundation of China (Grant No. 61973033) and Preliminary Research of Equipment (Grant No. 9090102010305).
Funding for the clam genetic-parameters study: the National High Technology Research and Development Program (863 Program) of China under contract No. 2012AA10A410; the Zhejiang Science and Technology Project of Agricultural Breeding under contract No. 2012C12907-4; and the Scientific and Technological Innovation Project financially supported by Qingdao National Laboratory for Marine Science and Technology under contract No. 2015ASKJ02.
文摘The precise and accurate knowledge of genetic parameters is a prerequisite for making efficient selection strategies in breeding programs.A number of estimators of heritability about important economic traits in many marine mollusks are available in the literature,however very few research have evaluated about the accuracy of genetic parameters estimated with different family structures.Thus,in the present study,the effect of parent sample size for estimating the precision of genetic parameters of four growth traits in clam M.meretrix by factorial designs were analyzed through restricted maximum likelihood(REML) and Bayesian.The results showed that the average estimated heritabilities of growth traits obtained from REML were 0.23-0.32 for 9 and 16 full-sib families and 0.19-0.22 for 25 full-sib families.When using Bayesian inference,the average estimated heritabilities were0.11-0.12 for 9 and 16 full-sib families and 0.13-0.16 for 25 full-sib families.Compared with REML,Bayesian got lower heritabilities,but still remained at a medium level.When the number of parents increased from 6 to 10,the estimated heritabilities were more closed to 0.20 in REML and 0.12 in Bayesian inference.Genetic correlations among traits were positive and high and had no significant difference between different sizes of designs.The accuracies of estimated breeding values from the 9 and 16 families were less precise than those from 25 families.Our results provide a basic genetic evaluation for growth traits and should be useful for the design and operation of a practical selective breeding program in the clam M.meretrix.
Abstract: Sample size can be a key design feature that not only affects the probability of a trial's success but also determines the duration and feasibility of a trial. If an investigational drug is expected to be effective and to address unmet medical needs of an orphan disease, where the accrual period may require many years with a large sample size to detect a minimal clinically relevant treatment effect, a minimum sample size may be set to maintain nominal power. In limited situations such as this, there may be a need for flexibility in the initial and final sample sizes; thus, it is useful to consider the utility of adaptive sample size designs that use sample size re-estimation or a group sequential design. In this paper, we propose a new adaptive performance measure to assess the utility of an adaptive sample size design in a trial simulation. Because previously proposed sample size re-estimation methods do not take into account errors in estimation based on interim results, we propose Bayesian sample size re-estimation criteria that incorporate prior information on the treatment effect, and we then assess their operating characteristics in a simulation study. We also present a review example of sample size re-estimation based mainly on a published paper and a review report from the Pharmaceuticals and Medical Devices Agency (PMDA).
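The abstract does not give the re-estimation criterion itself. As a hedged illustration of the general idea only (a conjugate normal prior on the treatment effect expressed as pseudo-observations, combined with the usual two-sample normal formula; the function name, arguments, and the flooring rule are assumptions, not the authors' method), a sketch might look like:

```python
import math

def bayes_ssr(interim_mean, interim_n, sigma, prior_mean, prior_n,
              delta_min, n_max=1000):
    """Re-estimate the per-arm sample size at an interim look using a
    normal prior on the treatment effect. Hypothetical helper, not the
    paper's exact criterion."""
    # Posterior mean under a conjugate normal prior expressed as
    # 'prior_n' pseudo-observations.
    post_n = prior_n + interim_n
    post_mean = (prior_n * prior_mean + interim_n * interim_mean) / post_n
    # Plug the posterior mean (floored at the minimal clinically
    # relevant effect) into the two-sample normal sample-size formula.
    effect = max(post_mean, delta_min)
    z_a, z_b = 1.96, 1.2816  # approx. z for one-sided alpha=0.025, power=0.9
    n = 2 * (sigma * (z_a + z_b) / effect) ** 2
    return min(math.ceil(n), n_max)
```

Capping at `n_max` mirrors the practical constraint in the abstract that accrual in an orphan disease cannot grow without bound.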
Fund: Supported by the National Natural Science Foundation of China (Grant No. 30700494) and the Principal Fund of South China Agricultural University, China (Grant No. 2003K053).
Abstract: The development of a core collection could enhance the utilization of germplasm collections in crop improvement programs and simplify their management. Selection of an appropriate sampling strategy is an important prerequisite for constructing a core collection of appropriate size that adequately represents the genetic spectrum and maximally captures the genetic diversity of the available crop collection. The present study constructed nested core collections to determine the appropriate sample size to represent the genetic diversity of a rice landrace collection, based on 15 quantitative traits and 34 qualitative traits of 2,262 rice accessions. The results showed that nested core collections of 50-225 accessions, corresponding to sampling rates of 2.2%-9.9%, were sufficient to maintain the maximum genetic diversity of the initial collection. Of these, 150 accessions (6.6%) could capture the maximal genetic diversity of the initial collection. Three data types, i.e., qualitative traits (QT1), quantitative traits (QT2), and integrated qualitative and quantitative traits (QTT), were compared for their efficiency in constructing core collections, using the weighted pair-group average method combined with stepwise clustering and preferred sampling on adjusted Euclidean distances. Each combining scheme produced eight rice core collections (225, 200, 175, 150, 125, 100, 75 and 50 accessions). The results showed that the QTT data were the best for constructing a core collection, as indicated by the genetic diversity of the resulting collections, whereas a core collection constructed only from QT1 information could not represent the initial collection effectively. Integrated qualitative and quantitative trait data should therefore be used to construct a productive core collection.
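The paper's stepwise-clustering procedure is only outlined in the abstract. As a stand-in illustration of core selection on standardized trait data, a greedy maximin (Kennard-Stone-style) rule can be sketched; this is explicitly not the authors' WPGMA method, and the toy data are placeholders:

```python
import numpy as np

def maximin_core(traits, core_size):
    """Greedy maximin selection: standardize traits, start from the
    accession closest to the centroid, then repeatedly add the accession
    farthest (Euclidean) from the current core. A Kennard-Stone-style
    stand-in for the paper's stepwise-clustering procedure."""
    z = (traits - traits.mean(axis=0)) / traits.std(axis=0)
    # full pairwise distance matrix between accessions
    d = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
    core = [int(np.argmin(np.linalg.norm(z - z.mean(axis=0), axis=1)))]
    while len(core) < core_size:
        rest = [i for i in range(len(z)) if i not in core]
        # next pick maximizes its minimum distance to the current core
        core.append(max(rest, key=lambda i: d[i, core].min()))
    return sorted(core)

# toy data: 40 accessions x 5 traits (placeholder for the 2,262 landraces)
X = np.random.default_rng(1).normal(size=(40, 5))
core = maximin_core(X, core_size=8)
```

The maximin rule tends to spread the core across the trait space, which is the same goal the clustering-plus-preferred-sampling scheme pursues.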
Fund: Financial support from the NSFC (Grant Nos. 11602257, 11472257, 11272300, 11572299) and from the key subject "Computational Solid Mechanics" of the China Academy of Engineering Physics.
Abstract: In order to investigate the effect of sample size on the dynamic torsional behaviour of the 2A12 aluminium alloy, torsional split Hopkinson bar tests were conducted on this alloy with different sample dimensions. It is found that the measured yield strength increases with decreasing gauge length and thickness, whereas the sample inner/outer diameter has little effect on the dynamic torsional behaviour. Based on the finite element method, the stress states in samples of different sizes are analysed. Owing to the effect of the stress concentration zone (SCZ), a shorter sample shows a higher yield stress; furthermore, the stress is distributed more uniformly in a thinner sample, which also raises the measured yield stress. Based on the experimental and simulation analyses, suggestions on choosing the sample size are given as well.
Fund: Supported by the National Natural Science Foundation of China (Grant No. 51175028), the Great Scholars Training Project (Grant No. CIT&TCD20150312), and the Beijing Recognized Talent Project (Grant No. 2014018).
Abstract: Reliability assessment of the braking system in a high-speed train under small sample size and zero-failure data is very important for safe operation. Traditional reliability assessment methods perform well only with large sample sizes and complete failure data, and show large deviations under small sample size and zero-failure data. To address this problem, a new Bayesian method is proposed. Based on the characteristics of the solenoid valve in the braking system of a high-speed train, a modified Weibull distribution is selected to describe the failure rate over the entire lifetime. Under the assumption of a binomial distribution for the failure probability at each censoring time, a concave method is employed to obtain the relationships between accumulated failure probabilities. A numerical simulation is performed to compare the results of the proposed method with those obtained from maximum likelihood estimation, and to illustrate that the proposed Bayesian model achieves better accuracy for the expectation value when the sample size is less than 12. Finally, the robustness of the model is demonstrated by obtaining the reliability indicators for a numerical case involving the solenoid valve of the braking system, which shows that the change in the reliability and failure rate across different hyperparameters is small. The method avoids being misled by subjective information and improves the accuracy of reliability assessment under conditions of small sample size and zero-failure data.
Fund: Supported by the National Natural Science Foundation of China (Grant No. 52174062).
Abstract: Accurate prediction of the internal corrosion rates of oil and gas pipelines could be an effective way to prevent pipeline leaks. In this study, a framework for predicting corrosion rates from a small sample of laboratory metal-corrosion data was developed, providing a new perspective on the pipeline corrosion problem when real samples are insufficient. The approach employs the bagging algorithm to construct a strong learner by integrating several KNN learners. A total of 99 data points were collected and split into training and test sets at a 9:1 ratio. The training set was used to obtain the best hyperparameters by 10-fold cross-validation and grid search, and the test set was used to determine the performance of the model. The results showed that the Mean Absolute Error (MAE) of this framework is 28.06% of that of the traditional model, and that it outperforms other ensemble methods. The proposed framework is therefore suitable for metal corrosion prediction under small sample conditions.
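As a hedged, self-contained sketch of the bagged-KNN idea (synthetic data in place of the 99 laboratory samples, and plain NumPy in place of whatever library the authors used; the feature names in the comment are assumptions):

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=5):
    """Plain k-nearest-neighbours regression: predict the mean target
    of the k closest training points (Euclidean distance)."""
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

def bagged_knn_predict(Xtr, ytr, Xte, k=5, n_bags=20, seed=0):
    """Bagging: average KNN predictions over bootstrap resamples of the
    training set, mirroring the framework described in the abstract."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_bags):
        boot = rng.integers(0, len(Xtr), size=len(Xtr))  # bootstrap sample
        preds.append(knn_predict(Xtr[boot], ytr[boot], Xte, k))
    return np.mean(preds, axis=0)

# Synthetic stand-in for the 99 corrosion samples (features might be
# temperature, CO2 partial pressure, pH, flow rate; target = corrosion rate)
rng = np.random.default_rng(1)
X = rng.normal(size=(99, 4))
y = X @ np.array([0.8, -0.5, 0.3, 0.1]) + rng.normal(scale=0.1, size=99)
Xtr, Xte, ytr, yte = X[:89], X[89:], y[:89], y[89:]  # 9:1 split
mae = np.abs(bagged_knn_predict(Xtr, ytr, Xte) - yte).mean()
```

In practice `k` and `n_bags` would be tuned by 10-fold cross-validated grid search on the training set, as the abstract describes.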
Fund: Jointly supported by the Stable Supporting Fund of Science and Technology on Underwater Vehicle Technology (No. JCKYS2019604SXJQR-06), the National Natural Science Foundation of China-Marine Science Research Center of Shandong Provincial Government Joint Funding Project (No. U1606401), the National Natural Science Foundation of China (No. 61603108), the Taishan Scholar Project Funding (No. tspd20161007), and the National Key Research and Development Plan (Nos. 2016YFC0300704, 2017YFC030660).
Abstract: The hadal zone (ocean depths of 6-11 km) is one of the least-understood habitats on Earth because of its extreme conditions, such as high pressure, darkness, and low temperature. With the development of deep-sea vehicles such as China's 7000-m manned submersible Jiaolong, abyssal science has received greater attention. For decades, gravity-piston corers have been widely used to collect long cores of loose subsea sediment. However, the weight and length of the gravity sampler cables and the operating environment limit sampling capacity at full ocean depths. Therefore, a new self-floating sediment sampler with a spring-loaded auto-trigger release, incorporating characteristics of traditional gravity-driven samplers, is designed. This study analyzes the process by which a gravity-piston corer penetrates the sediment and the factors that affect it. A formula for the penetration depth is deduced, and a method of optimizing the sampling depth is then developed based on structural design and parametric factor modeling. The parameters considered in the modeling include the sampling depth, balance weight, ultimate stress, friction coefficient, dimensions of the sampler, and material properties. Thus, a new deep-sea floating parametric sampler designed with virtual prototyping is proposed. Accurate values for all the design factors are derived from calculations based on the conservation of energy with penetration depth, analyses of the factors affecting the penetration depth, and analyses of the pressure-bar stability. Finally, experimental data are used to verify the penetration-depth function and to provide theoretical guidance for the design of sediment samplers.
Abstract: This paper investigates the tolerable sample size needed for the Ordinary Least Squares (OLS) estimator in the presence of multicollinearity among the exogenous variables of a linear regression model. A regression model with a constant term (β0) and two multicollinear independent variables (with β1 and β2 as their respective regression coefficients) was considered. A Monte Carlo study of 1000 trials was conducted at eight levels of multicollinearity (0, 0.25, 0.5, 0.7, 0.75, 0.8, 0.9 and 0.99) and eight sample sizes (10, 20, 40, 80, 100, 150, 250 and 500). At each specification, the true regression coefficients were set at unity, while 1.5, 2.0 and 2.5 were taken as the hypothesized values. The power was evaluated at every multicollinearity level for each of these sample sizes. The results show that, whether or not the hypothesized values depart strongly from the true values, once the multicollinearity level is very high (i.e., 0.99) the sample size needed for reliable estimation and inference must be greater than five hundred.
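The simulation design lends itself to a compact sketch. The following illustrative reimplementation (not the authors' code) generates two regressors correlated at level ρ, fits OLS, and counts rejections of a hypothesized β1 at the 5% level:

```python
import numpy as np

def rejection_rate(rho, n, beta_hyp, trials=1000, seed=0):
    """Fraction of trials in which H0: beta1 = beta_hyp is rejected at
    roughly the 5% level, for two regressors correlated at level rho and
    true coefficients set to unity, as in the study. Illustrative only."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(trials):
        x1 = rng.normal(size=n)
        # x2 correlated with x1 at level rho
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        XtX_inv = np.linalg.inv(X.T @ X)
        b = XtX_inv @ X.T @ y                       # OLS estimates
        resid = y - X @ b
        s2 = resid @ resid / (n - 3)                # residual variance
        se_b1 = np.sqrt(s2 * XtX_inv[1, 1])
        t = (b[1] - beta_hyp) / se_b1
        # 1.96 is the large-sample normal critical value; a careful
        # study would use the exact t critical value for n - 3 d.f.
        rejections += abs(t) > 1.96
    return rejections / trials

# power collapses at small n when rho is extreme, recovering only at large n
low = rejection_rate(0.99, 10, 2.0)
high = rejection_rate(0.99, 500, 2.0)
```

Running the function over the full grid of ρ and n values reproduces the shape of the study: at ρ = 0.99 the variance of b1 is inflated by 1/(1 − ρ²) ≈ 50, so only the largest sample sizes yield usable power.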
Fund: Supported by the National Natural Science Foundation of China (11372060, 10902018, 91216201, and 11326005), the National Basic Research Program of China (2011CB610304), and the Major National Science and Technology Project (2011ZX02403-002).
Abstract: The size effects of the microstructure of lattice materials on structural analysis and minimum weight design are studied with the extended multiscale finite element method (EMsFEM). With the same volume of base material and the same configuration, the structural displacement and maximum axial micro-rod stress of lattice structures with different microstructure sizes are analyzed and compared. It is pointed out that, unlike the traditional mathematical homogenization method, EMsFEM is suitable for analyzing structures that are built of lattice materials and composed of large numbers of finite-sized micro-rods. The minimum weight design of structures composed of lattice material is studied with the downscaling calculation of EMsFEM under stress constraints on the micro-rods. The optimal design results show that the weight of the structure increases as the size of the basic sub-unit cells decreases. The paper presents a new approach for the analysis and optimization of lattice materials in complex engineering constructions.
Fund: The National Natural Science Foundation of China under contract No. 31772852, and the Fundamental Research Funds for the Central Universities under contract No. 201612004.
Abstract: This study used an Ecopath model of Jiaozhou Bay as an example to evaluate the effect of the stomach sample size of three fish species on the model's projections. The derived ecosystem indices were classified into three categories: (1) direct indices, such as the trophic level of a species, influenced by stomach sample size directly; (2) indirect indices, such as the ecotrophic efficiency (EE) of invertebrates, influenced through multiple prey-predator relationships; and (3) systemic indices, such as total system throughput (TST), describing the status of the whole ecosystem. The influences of different stomach sample sizes on these indices were evaluated. The results suggest that the systemic indices of the ecosystem model were robust to stomach sample size, whereas species-specific indices showed low accuracy and precision when stomach samples were insufficient. The indices became more uncertain when stomach sample sizes varied for more species. This study enhances the understanding of how the quality of diet-composition data influences ecosystem modeling outputs, and the results can guide the design of stomach content analysis for developing ecosystem models.
基金the Special Research Fund of Ghent University(BOF 01D23812 to Lien Taevernier and BOF O1J22510 to Evelien Wynendaele and Professor Bart De Spiegeleer)the Institute for the Promotion of Innovation through Science and Technology in Flanders(IWT 101529 to Matthias D'Hondt)for their financial funding
Abstract: The preparation of samples containing bovine serum albumin (BSA), e.g., as used in transdermal Franz diffusion cell (FDC) solutions, was evaluated using an analytical quality-by-design (QbD) approach. Traditional precipitation of BSA by adding an equal volume of organic solvent, often used successfully with conventional HPLC-PDA, was found insufficiently robust when novel fused-core HPLC and/or UPLC-MS methods were used. In this study, three factors (acetonitrile (%), formic acid (%), and boiling time (min)) were included in the experimental design to determine an optimal and more suitable sample treatment for BSA-containing FDC solutions. Using a QbD and Derringer desirability (D) approach, combining BSA loss, dilution factor and variability, we constructed an optimal working space with the edge of failure defined as D < 0.9. The design space is modelled and is confirmed to have an ACN range of 83 ± 3% and an FA content of 1 ± 0.25%.
Fund: Supported by a sub-project of the National Major Scientific and Technological Special Project of China for 'Significant New Drugs Development' [2015ZX09501008-004].
Abstract: Objective To develop methods for determining a suitable sample size for bioequivalence assessment of generic topical ophthalmic drugs using a crossover design with serial sampling schemes. Methods The power functions of the Fieller-type confidence interval and the asymptotic confidence interval in crossover designs with serial-sampling data are derived. Simulation studies were conducted to evaluate the derived power functions. Results The simulation studies show that the two power functions provide precise power estimates when normality assumptions are satisfied, and yield conservative estimates of power when the data are log-normally distributed. The intra-correlation was positively associated with the power of the bioequivalence test. When the expected ratio of the AUCs was less than or equal to 1, the power of the Fieller-type confidence interval was larger than that of the asymptotic confidence interval; when the expected ratio of the AUCs was larger than 1, the asymptotic confidence interval had greater power. Sample size can be calculated through numerical iteration with the derived power functions. Conclusion The Fieller-type power function and the asymptotic power function can be used to determine sample sizes of crossover trials for bioequivalence assessment of topical ophthalmic drugs.
Abstract: In this study, a design of experiments (DoE) approach was used to develop a PLA open-cell foam morphology using the compression molding technique. The effect of three molding parameters (foaming time, mold opening temperature, and weight concentration of the ADA blowing agent) on the cellular structure was investigated. A regression equation relating the average cell size to these three processing parameters was developed from the DoE, and analysis of variance (ANOVA) was used to find the best-fitting parameters based on the experimental data. With the help of the DoE technique, we were able to develop various foam morphologies with different average cell size distributions, which is important in the development of open-cell PLA scaffolds for bone regeneration, where control of cell morphology is crucial for osteoblast proliferation. For example, at a constant ADA weight concentration of 5.95 wt%, we were able to obtain a narrow average cell size distribution between 275 and 300 μm by varying the mold opening temperature between 106°C and 112°C at a constant foaming time of 8 min, or by varying the foaming time between 6 and 11 min at a constant mold opening temperature of 109°C.
Fund: The project is supported by the Aeronautic Science Foundation of China.
Abstract: After completing 102 replicate constant-amplitude crack initiation and growth tests on LY12-CZ aluminum alloy plate, a statistical investigation of the fatigue crack initiation and growth process is conducted in this paper. From post-mortem fractographic examination by scanning electron microscopy (SEM), some qualitative observations of the spatial correlation among fatigue striations are developed to reveal the statistical nature of the material's intrinsic inhomogeneity during crack growth. From the test data, an engineering division between crack initiation and growth is defined as the upper limit of small cracks. The distributions of the crack initiation life N_i and growth life N, and the statistical characteristics of the crack growth rate da/dN, are also investigated. It is hoped that this work provides a solid experimental basis for the study of probabilistic fatigue, probabilistic fracture mechanics, fatigue reliability, and their engineering applications.
Fund: Supported by the National Natural Science Foundation of China (Grant No. 11372311) and a grant from the State Key Laboratory of Astronautic Dynamics (2014-ADL-DW0201).
Abstract: In the 6th Chinese Space Trajectory Design Competition, held in 2014, a near-Earth asteroid sample-return trajectory design problem was released, in which the motion of the spacecraft is modeled with multi-body dynamics, considering the gravitational forces of the Sun, Earth, and Moon. An electric-propulsion spacecraft initially parked in a circular 200-km-altitude low Earth orbit is expected to rendezvous with an asteroid and carry as much sample as possible back to the Earth within a 10-year time frame. The team from the Technology and Engineering Center for Space Utilization, Chinese Academy of Sciences reported a solution with an asteroid sample mass of 328 tons, which ranked first in the competition. In this article, we present our design and optimization methods, primarily including overall analysis, target selection, escape from and capture by the Earth-Moon system, and optimization of impulsive and low-thrust trajectories modeled in multi-body dynamics. Orbital resonance and lunar gravity assists are the key techniques employed in the trajectory design. The reported solution, preliminarily revealing the feasibility of returning a hundreds-of-tons asteroid or asteroid sample, envisions future space missions relating to near-Earth asteroid exploration.
Abstract: Sample size is very important in statistical research because it should be neither too small nor too large. Given a significance level α, the sample size is usually calculated from the z-value and a pre-defined error. Such an error is defined based on a previous experiment or another study, or is set subjectively by a specialist, which may cause incorrect estimation. Therefore, this research proposes an objective method to estimate the sample size without pre-defining the error. Given an available sample X = {X1, X2, ..., Xn}, the error is calculated via an iterative process in which the sample X is re-sampled many times. Moreover, after the sample size is estimated, it can be used to collect a new sample from which a new sample size is estimated, and so on.
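The abstract only outlines the iteration. A hedged sketch of the idea, using bootstrap resampling as a stand-in for the paper's re-sampling scheme (the function name and the choice of the resampled-mean spread as the "error" are assumptions):

```python
import numpy as np

def estimate_sample_size(sample, z=1.96, resamples=2000, seed=0):
    """Estimate a sample size without a user-supplied error bound:
    the allowable error is taken as the spread of resampled means,
    a bootstrap-style stand-in for the paper's iterative procedure."""
    rng = np.random.default_rng(seed)
    x = np.asarray(sample, dtype=float)
    n = len(x)
    # bootstrap distribution of the sample mean
    boot_means = np.array([rng.choice(x, size=n, replace=True).mean()
                           for _ in range(resamples)])
    error = boot_means.std(ddof=1)   # data-driven error, not pre-set
    s = x.std(ddof=1)
    # the usual n = (z * s / e)^2 formula, with e estimated from the data
    return int(np.ceil((z * s / error) ** 2))

x = np.random.default_rng(42).normal(loc=10, scale=2, size=30)
n_needed = estimate_sample_size(x)
```

Since the bootstrap spread of the mean scales like s/√n, the estimate effectively returns roughly z² times the current sample size here; the point of the sketch is only that the error term comes from the data rather than from a subjective choice.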
Abstract: Sample size justification is a crucial part of the design of clinical trials. In this paper, the authors derive a new formula to calculate the sample size for a binary outcome given one of the three popular indices of risk difference. The sample size based on the absolute difference is the fundamental one, from which sample sizes given the risk ratio or the odds ratio (OR) can easily be derived.
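The new formula itself is not reproduced in the abstract; what follows is only the standard normal-approximation sample-size calculation it builds on, with an OR design handled by first converting to an absolute difference (the function names are illustrative):

```python
import math

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Standard per-group sample size for comparing two proportions
    (normal approximation, two-sided test)."""
    z_a, z_b = 1.959964, 0.841621  # z for two-sided alpha=0.05, power=0.8
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

def p2_from_or(p1, odds_ratio):
    """Convert a control rate and an odds ratio to the treatment rate."""
    odds = odds_ratio * p1 / (1 - p1)
    return odds / (1 + odds)

n_diff = n_per_group(0.3, 0.4)                  # absolute-difference design
n_or = n_per_group(0.3, p2_from_or(0.3, 2.0))   # OR design via conversion
```

The OR design needs fewer subjects here simply because an OR of 2.0 against a 30% control rate implies an absolute difference larger than 0.1; risk-ratio designs convert the same way via p2 = RR × p1.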
Abstract: The size and form of sampling units (SUs) have always been variables considered in planning and structuring forest inventories, whether performed in native forests or in plantations. The experimental work designed to address this problem was conducted in an area of araucaria forest in Sao Joao de Triunfo, PR, Brazil. Circular, square, and rectangular sampling units were evaluated, with areas ranging from 200 m2 to 1000 m2. Time was recorded with a stopwatch, separately for locomotion and measurement. A power model was used to fit the locomotion and measurement times, and a hyperbolic model the coefficient of variation, all as functions of SU size. To reach an analytical solution for the optimum SU size, the behavior of the three functions was simulated up to a size of 10,000 m2. Taking the derivative of the combined function located the optimum point, which set the optimal SU size at 600 m2 for the structured experimental conditions. This result confirmed the formulated hypothesis, while also prompting a critical analysis of the inclusion of other relevant variables, such as the size of the area to be inventoried, the number of SUs measured in one day of work, the average distance between SUs, and the average speeds of moving between SUs and of taking the measurements.
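The optimization can be illustrated numerically: fit power models for the two time components and a hyperbola for the coefficient of variation, then search a combined time-times-variance criterion over candidate SU areas. All coefficients below are made-up placeholders, not the paper's fitted values, and the criterion is one plausible choice, not necessarily the authors':

```python
import numpy as np

# Hypothetical fitted models (placeholder coefficients)
t_loc  = lambda A: 2.0 * A**0.3       # locomotion time per SU (min)
t_meas = lambda A: 0.05 * A**0.9      # measurement time per SU (min)
cv     = lambda A: 10.0 + 4000.0 / A  # coefficient of variation (%)

# Efficiency criterion: (time per SU) x (variance), to be minimized.
# Small SUs are cheap to measure but highly variable; large SUs are
# precise but slow, so the product has an interior optimum.
A = np.linspace(100.0, 10000.0, 9901)   # candidate SU areas (m^2), 1 m^2 step
cost = (t_loc(A) + t_meas(A)) * cv(A) ** 2
A_opt = float(A[np.argmin(cost)])
```

With the paper's actual fitted coefficients, the same search (or setting the derivative of the combined function to zero) would recover the reported 600 m2 optimum.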