The particle size distribution of rockfill is studied using granular mechanics, mesomechanics, and probability statistics to reveal the relationship between the distribution of particle size and that of the potential energy intensity before fragmentation. The study finds that the potential energy density has a linear relation to the logarithm of particle size, and deduces that the logarithm of particle size follows a normal distribution because the potential energy density does so. Based on this finding, and by incorporating the energy principle of rock fragmentation, a logarithmic distribution model of particle size is formulated, which uncovers the natural statistical character of particle sizes. Examining the average value, the expectation, and the unbiased variance of particle size shows that the expectation does not equal the average value: it is always larger than the average value and increases with particle size and its non-uniformity, while the unbiased variance increases as the non-uniformity and the geometric average value increase. A case study shows that results simulated by the proposed logarithmic distribution model accord with the actual data. It is concluded that both the logarithmic distribution model and the Kuz-Ram model can be used to forecast the particle-size distribution of natural rockfill, whereas for blasted rockfill the Kuz-Ram model is the option; when the two models are applied in combination, field tests are needed to adjust some model parameters.
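A minimal numerical sketch of the model's core claim, using hypothetical numbers rather than the paper's data: if ln(d) is normally distributed, particle size d is lognormal, and the expectation exceeds the geometric average by a factor that grows with the non-uniformity.

```python
import numpy as np

# Hypothetical illustration (not the paper's data): if ln(d) is normally
# distributed, particle size d is lognormal. Draw sizes with a known
# log-mean and log-standard-deviation, then recover them from the
# log-transform, as the model implies.
rng = np.random.default_rng(0)
mu, sigma = np.log(50.0), 0.4                    # assumed geometric mean 50 mm
d = rng.lognormal(mean=mu, sigma=sigma, size=10_000)

log_d = np.log(d)
mu_hat, sigma_hat = log_d.mean(), log_d.std(ddof=1)

# E[d] = exp(mu + sigma^2 / 2) > exp(mu): the expectation exceeds the
# geometric average and grows with the non-uniformity sigma, matching the
# paper's qualitative finding.
print(d.mean() > np.exp(mu_hat))                 # True (AM >= GM always)
```

Here `exp(mu_hat)` is exactly the sample geometric mean, so the final comparison holds deterministically by the AM-GM inequality.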
The use of probability distribution functions for describing tree diameter at breast height provides useful information for forest resource evaluation and quantification. A range of probability distribution functions has been developed and applied to manage forest trees in conventional forest reserves, with little consideration for trees outside those reserves. The aim of this study is to evaluate and propose a suitable probability distribution function for trees in agricultural landscapes. The study examined the 3-parameter lognormal, lognormal, 3-parameter gamma, gamma, 3-parameter Weibull, and Weibull distribution functions, using the maximum likelihood method to fit tree diameter at breast height. Three hundred and thirty-two temporary farmlands were randomly selected, on which the stem diameter of all living trees with diameter ≥ 10.0 cm was measured. The statistical analysis showed that the 3-parameter lognormal distribution gave the best description of stem diameter, with the lowest Anderson-Darling (1.627) and Akaike Information Criterion (5962.0) statistics. Hence, the 3-parameter lognormal distribution function was found suitable for the stem diameter of trees in agricultural landscapes in the study area.
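The model-comparison step described above can be sketched as follows, with a synthetic diameter sample standing in for the study's 332-farmland data set (the data, sample values, and candidate list here are assumptions for illustration):

```python
import numpy as np
from scipy import stats

# Sketch of distribution selection by AIC (synthetic data, not the study's):
# fit candidates by maximum likelihood and rank by AIC, smaller being better.
rng = np.random.default_rng(1)
dbh = 10.0 + rng.lognormal(mean=2.5, sigma=0.6, size=332)   # assumed DBH sample, cm

candidates = {
    "lognormal-3p": stats.lognorm,   # free loc acts as the third (threshold) parameter
    "lognormal-2p": stats.lognorm,
    "gamma-3p": stats.gamma,
    "weibull-3p": stats.weibull_min,
}
aic = {}
for name, dist in candidates.items():
    if name.endswith("-2p"):
        params = dist.fit(dbh, floc=0.0)   # fix loc at 0 -> 2-parameter form
        n_par = len(params) - 1
    else:
        params = dist.fit(dbh)
        n_par = len(params)
    loglik = dist.logpdf(dbh, *params).sum()
    aic[name] = 2 * n_par - 2 * loglik

print(min(aic, key=aic.get))   # candidate with the smallest AIC
```

Fixing `floc=0.0` in SciPy's `fit` is the standard way to drop the location (threshold) parameter and obtain the 2-parameter form of each family.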
The brittle fracture probability and reliability are obtained in terms of the dislocation mechanism of microcrack evolution. The statistical distribution functions and statistical deviations of elongation, strength, plastic work, crack extension force, fracture toughness, and critical crack length can be derived in a unified fashion.
Statistical analysis was performed on simultaneous wave and wind data recorded by a discus-shaped wave buoy. The area is located in the southern Caspian Sea near the Anzali Port. Wave data were obtained through directional spectrum analysis, and wind direction and wind speed from the related time series. For the 12-month measurement period (May 25, 2007 to May 25, 2008), statistical calculations were carried out to determine the nonlinear auto-correlation of wave and wind, using the probability distribution function of wave characteristics and statistical analysis over various time periods. The paper also presents and analyzes the wave energy for the area on the basis of the available database. The analyses allowed a comparison of wave energy between seasons, identifying the period with the largest wave energy. Results showed that over the research period the mean wave and wind auto-correlation time was about three hours. Among the probability distribution functions considered, i.e., Weibull, normal, lognormal, and Rayleigh, the Weibull distribution was most consistent with the empirical distribution function shown in the diagrams for each season. Results also showed that the mean wave energy over the research period was about 49.88 kW/m, and the maximum wave energy density was found in February and March 2010.
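The distribution-comparison step can be sketched as below, with a synthetic wave-height sample standing in for the buoy record (the data and parameter values are assumptions; the candidate list follows the abstract):

```python
import numpy as np
from scipy import stats

# Synthetic sketch (not the buoy record): fit candidate distributions to
# significant wave heights and rank them by Kolmogorov-Smirnov distance,
# mirroring the paper's comparison in which Weibull fit best.
rng = np.random.default_rng(2)
hs = stats.weibull_min.rvs(1.6, scale=0.9, size=2000, random_state=rng)  # assumed Hs, m

candidates = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "rayleigh": stats.rayleigh,
    "normal": stats.norm,
}
ks = {}
for name, dist in candidates.items():
    # fix loc at 0 for the positive-support families; normal keeps a free mean
    params = dist.fit(hs) if name == "normal" else dist.fit(hs, floc=0)
    ks[name] = stats.kstest(hs, dist.cdf, args=params).statistic

print(min(ks, key=ks.get))   # best-fitting candidate on this sample
```

A smaller Kolmogorov-Smirnov statistic means the fitted CDF tracks the empirical distribution function more closely, which is the comparison the seasonal diagrams in the paper express graphically.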
In the present paper we derive, by a direct method, the exact expressions for the sampling probability density function of the Gini concentration ratio for samples from a uniform population of size n = 6, 7, 8, 9, and 10. Moreover, we find some regularities of these distributions that are valid for any sample size.
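For reference, the sample statistic whose sampling density the paper derives can be computed as follows (a standard sample form of the Gini ratio; the population value 1/3 for Uniform(0, 1) is a known result used here only as a sanity check):

```python
import numpy as np

# Minimal sketch: the sample Gini concentration ratio via the order-statistic
# formula  G = sum_i (2i - n - 1) x_(i) / (n * sum x).
def gini(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return ((2 * i - n - 1) * x).sum() / (n * x.sum())

rng = np.random.default_rng(3)
u = rng.uniform(0, 1, size=100_000)
print(round(gini(u), 3))   # close to the population value 1/3 for Uniform(0, 1)
```

For a perfectly equal sample the formula gives exactly 0, and it approaches (n - 1)/n when all mass sits on one observation.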
In modeling reliability data, the exponential distribution is commonly used due to its simplicity. For estimating its parameter, classical estimators, including the maximum likelihood estimator, are the most commonly used and are well known to be efficient. However, the maximum likelihood estimator is highly sensitive to contamination or outliers. In this study, a robust and efficient estimator of the exponential distribution parameter is proposed based on the probability integral transform statistic. To examine the robustness of this new estimator, its asymptotic variance, breakdown point, and gross error sensitivity are derived. The new estimator offers reasonable protection against outliers while being simple to compute. Furthermore, a simulation study compares its performance with that of the maximum likelihood estimator, the weighted likelihood estimator, and the M-scale estimator in the presence of outliers. Finally, a statistical analysis of three reliability data sets demonstrates the performance of the proposed estimator.
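The paper's probability-integral-transform estimator is not reproduced here; the sketch below only illustrates the problem it addresses: the exponential MLE (rate = 1 / sample mean) is dragged by a single gross outlier, while a simple median-based rate estimate (rate = ln 2 / median, a generic robust alternative, not the paper's method) barely moves.

```python
import numpy as np

# Contamination demo with synthetic data: one gross outlier shifts the
# exponential MLE noticeably but leaves a median-based estimate stable.
rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=200)   # true rate 0.5
x_contaminated = np.append(x, 500.0)       # one gross outlier

def mle(s):
    return 1.0 / s.mean()                  # classical MLE of the rate

def med(s):
    return np.log(2) / np.median(s)        # median-based robust alternative

print(round(mle(x), 3), round(mle(x_contaminated), 3))   # MLE shifts noticeably
print(round(med(x), 3), round(med(x_contaminated), 3))   # median-based barely moves
```

This mirrors the sensitivity the paper quantifies via the breakdown point and gross error sensitivity of each estimator.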
Moments and cumulants are commonly used to characterize a probability distribution or an observed data set. The moment method of parameter estimation is also commonly used to construct an appropriate parametric distribution for a given data set, but it does not always produce satisfactory results. It is difficult to determine exactly what information about the shape of the distribution is expressed by moments of the third and higher orders. For small samples in particular, the numerical values of sample moments can differ greatly from the corresponding theoretical moments of the probability distribution from which the random sample comes. Parameter estimates obtained by the moment method are therefore often considerably less accurate than those obtained by other methods, particularly for small samples. The present paper deals with an alternative approach to constructing an appropriate parametric distribution for a given data set using order statistics.
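The small-sample instability of the moment method can be sketched with an assumed gamma model (the data here are synthetic): matching the first two sample moments gives shape = mean² / variance and scale = variance / mean, which are reliable for large n but can wander badly for small n.

```python
import numpy as np

# Method-of-moments fit of a gamma distribution (illustrative data only):
#   shape = mean^2 / variance,  scale = variance / mean.
def gamma_moments(x):
    m, v = np.mean(x), np.var(x, ddof=1)
    return m * m / v, v / m          # (shape, scale)

rng = np.random.default_rng(5)
true_shape, true_scale = 3.0, 2.0

large = rng.gamma(true_shape, true_scale, size=5000)
small = rng.gamma(true_shape, true_scale, size=15)

print(tuple(round(p, 2) for p in gamma_moments(large)))  # near (3.0, 2.0)
print(tuple(round(p, 2) for p in gamma_moments(small)))  # can be far off
```

This is exactly the small-sample discrepancy between sample and theoretical moments that motivates the paper's order-statistics-based alternative.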
The paper presents a critical analysis of the problems arising in matching the classical models of statistical and phenomenological thermodynamics. The analysis shows that some concepts of the statistical and phenomenological methods of describing classical systems do not quite agree with each other. In particular, the two methods employ different caloric ideal-gas equations of state, while the possibility, existing in thermodynamic cyclic processes, of obtaining the same distributions both through a change in particle concentration and through a change in temperature is not allowed for in the statistical methods. This difference between the equations of state disappears when, in the statistical functions corresponding to the canonical Gibbs equations, Planck's constant is replaced by a new scale factor that depends on the parameters of the system and coincides with Planck's constant as the system approaches the degenerate state. Under such an approach, the statistical entropy transforms into one of the forms of heat capacity. In turn, reconciling the two methods on the dependence of the molecular distributions on particle concentration will apparently require further refinement of the physical model of the ideal gas and of the techniques for its statistical description.
This article compares the sizes of subsets selected by nonparametric subset selection rules under two different scoring rules for the observations. The scoring rules are based on the expected values of order statistics of the uniform distribution (yielding rank values) and of the normal distribution (yielding normal score values). The comparison uses state motor vehicle traffic fatality rates, published in a 2016 article, covering fifty-one states (including DC as a state) over a nineteen-year period (1994 through 2012). The earlier study considered four block-design selection rules: two for choosing a subset to contain the “best” population (i.e., the state with the lowest mean fatality rate) and two for the “worst” population (i.e., the highest mean rate), with the probability of correct selection set at 0.90. Two selection rules based on normal scores produced selected subsets substantially smaller than the corresponding rules based on ranks (7 vs. 16 and 3 vs. 12). For the two other selection rules, the subsets chosen were very close in size (within one). A comparison is also made using state homicide rates, published in a 2022 article, covering fifty states over eight years. The results are qualitatively the same as those obtained with the motor vehicle traffic fatality rates.
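The two scoring rules can be sketched as follows. Ranks come straight from the data; exact normal scores are expected values of standard-normal order statistics, for which Blom's approximation Phi^{-1}((i - 0.375) / (n + 0.25)) is used here as a stand-in; this is an assumption, not necessarily the article's exact scores, and the rates below are hypothetical.

```python
import numpy as np
from scipy import stats

# Rank scores versus (approximate) normal scores for a set of observations.
def rank_scores(x):
    return stats.rankdata(x)          # ranks 1..n, ties averaged

def normal_scores(x):
    # Blom's approximation to expected standard-normal order statistics,
    # applied through the ranks (an assumed stand-in for exact scores).
    n = len(x)
    return stats.norm.ppf((rank_scores(x) - 0.375) / (n + 0.25))

rates = np.array([12.5, 9.8, 15.1, 11.2, 20.3])   # hypothetical fatality rates
print(rank_scores(rates))                          # [3. 1. 4. 2. 5.]
print(np.round(normal_scores(rates), 3))
```

Normal scores preserve the ordering of the ranks but space the extreme observations further apart, which is what lets the normal-score selection rules achieve the same correct-selection probability with smaller subsets.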
Funding (rockfill particle-size study): Chongqing Science and Technology Committee basic research program (No. 2001-74-29) and Ministry of Communications Western Communications Construction research item (No. 200231800034).
Funding (robust exponential-parameter estimation study): supported by the Universiti Kebangsaan Malaysia [Grant Number DIP-2018-038].