Abstract: A commonly used statistical procedure for describing observed data sets is to compute their conventional moments or cumulants. When an appropriate parametric distribution is chosen for a data set, its parameters are typically estimated by the method of moments, i.e., by setting up a system of equations in which the sample conventional moments are equated to the corresponding moments of the theoretical distribution. However, the method of moments is not always convenient, especially for small samples. An alternative approach is based on other characteristics, which the author calls L-moments. L-moments are analogous to conventional moments, but they are based on linear combinations of order statistics, i.e., L-statistics. L-moments are theoretically preferable to conventional moments in that they characterize a wider range of distributions, and sample L-moments are more robust to the presence of outliers in the data. Experience also shows that, compared with conventional moments, L-moments are less prone to estimation bias, and parameter estimates obtained from L-moments, especially for small samples, are often even more accurate than maximum likelihood estimates. In the statistical literature, the method of L-moments is known primarily from small data sets, for example in meteorology. This paper deals with the use of L-moments for large data sets of income distributions (individual data) and wage distributions (data ordered into an interval frequency distribution with open extreme intervals). The paper also compares the accuracy of the method of L-moments with that of other methods of point estimation of the parameters of parametric probability distributions, both for large sets of individual data and for data ordered into an interval frequency distribution.
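As a rough illustration of the sample L-moments referred to in the abstract above, the following Python sketch computes the first four sample L-moments and the L-ratios from the standard unbiased probability-weighted-moment estimators (Hosking, 1990). It is not code from the paper, and the example data are invented.

    import numpy as np

    def sample_lmoments(data):
        """First four sample L-moments and L-ratios via unbiased
        probability-weighted moments (Hosking, 1990)."""
        x = np.sort(np.asarray(data, dtype=float))
        n = x.size
        j = np.arange(1, n + 1)  # ranks 1..n
        # Unbiased PWM estimators; terms with zero numerator drop out automatically.
        b0 = x.mean()
        b1 = np.sum((j - 1) / (n - 1) * x) / n
        b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
        b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
        l1 = b0
        l2 = 2 * b1 - b0
        l3 = 6 * b2 - 6 * b1 + b0
        l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
        return {"l1": l1, "l2": l2, "t": l2 / l1,   # L-location, L-scale, L-CV
                "t3": l3 / l2, "t4": l4 / l2}       # L-skewness, L-kurtosis

    # Example: a lognormal-like income sample with a heavy right tail (invented data)
    rng = np.random.default_rng(1)
    incomes = np.exp(rng.normal(10.0, 0.6, size=5000))
    print(sample_lmoments(incomes))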
Abstract: Moments and cumulants are commonly used to characterize a probability distribution or an observed data set. The method of moments is likewise a common way to estimate the parameters of a parametric distribution chosen for a given data set. However, the moment method does not always produce satisfactory results. It is difficult to determine exactly what information about the shape of the distribution is expressed by its moments of third and higher order. In the case of small samples in particular, the numerical values of sample moments can differ greatly from the corresponding theoretical moments of the probability distribution from which the random sample comes. Parameter estimates obtained by the moment method are therefore often considerably less accurate than those obtained by other methods, particularly for small samples. The present paper deals with an alternative approach to the construction of an appropriate parametric distribution for the considered data set, based on order statistics.
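To make the moment-method setup concrete, here is a minimal generic sketch (not taken from the paper): for a gamma distribution, equating the sample mean m and variance s^2 to the theoretical mean k*theta and variance k*theta^2 gives the closed-form moment estimates k = m^2/s^2 and theta = s^2/m.

    import numpy as np

    def gamma_method_of_moments(data):
        """Method-of-moments fit of a gamma distribution: equate the sample
        mean and variance to k*theta and k*theta**2 and solve for k, theta."""
        m = np.mean(data)
        s2 = np.var(data, ddof=1)
        shape = m ** 2 / s2   # k
        scale = s2 / m        # theta
        return shape, scale

    rng = np.random.default_rng(7)
    sample = rng.gamma(shape=2.0, scale=3.0, size=50)   # a small sample on purpose
    print(gamma_method_of_moments(sample))              # estimates can deviate noticeably for n = 50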
Funding: The authors are grateful to the Deanship of Scientific Research at King Khalid University, Kingdom of Saudi Arabia, for funding this study through the research groups program under project number R.G.P.2/67/41. Ibrahim Mufrah Almanjahie received the grant.
Abstract: Variance is one of the most important measures of descriptive statistics and is commonly used in statistical analysis. The traditional variance estimator, based on the second-order central moment, is a widely used methodology, but it is strongly affected by the presence of extreme values. This paper first proposes two classes of calibration estimators based on an adaptation of the estimators recently proposed by Koyuncu, and then presents a new class of L-moments-based calibration variance estimators utilizing L-moment characteristics (L-location, L-scale, L-CV) and auxiliary information. It is demonstrated that the proposed L-moments-based calibration variance estimators are more efficient than the adapted ones. Artificial data are used to assess the performance of the proposed estimators, and an application to apple fruit data is also presented. Using the artificial and real data sets, the percentage relative efficiency (PRE) of the proposed class of estimators with respect to the adapted ones is calculated. The PRE results indicate the superiority of the proposed class over the adapted ones in the presence of extreme values. The proposed class of estimators can therefore be applied to a wide range of sample surveys whenever auxiliary information is available and extreme values are present.
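The paper's specific estimators are not reproduced in the abstract, so the following is only a generic sketch of the calibration idea under a single linear constraint: design weights are adjusted so that a weighted auxiliary characteristic matches a known population value. The numbers and the assumption that the constraint involves an L-moment characteristic (e.g., a known L-scale) are purely illustrative.

    import numpy as np

    def calibrate_weights(d, a, A):
        """Chi-square-distance calibration of design weights d so that the
        weighted total of the auxiliary values a equals the known value A
        (single linear constraint; standard closed-form solution)."""
        d = np.asarray(d, float)
        a = np.asarray(a, float)
        lam = (A - np.sum(d * a)) / np.sum(d * a ** 2)
        return d * (1.0 + lam * a)

    # Hypothetical strata: design weights and auxiliary L-moment-type characteristics
    d = np.array([10.0, 10.0, 15.0, 15.0])
    a = np.array([2.1, 1.8, 3.0, 2.5])
    w = calibrate_weights(d, a, A=125.0)
    print(w, np.sum(w * a))   # the second number reproduces the known total A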
Funding: This research was financially supported by the National Natural Science Foundation of China (Grant No. 50279028).
Abstract: L-moments are defined as linear combinations of probability-weighted moments. They are virtually unbiased for small samples and perform well in parameter estimation, choice of distribution type, and regional analysis. The traditional methods of determining the design wave heights for planning marine structures use data only from the site of interest, whereas regional frequency analysis offers a new approach to quantile estimation that exploits information from a homogeneous neighborhood. A regional frequency analysis based on L-moments is presented with a case study of the California coast, using significant wave height data provided by NDBC. A six-site region excluding station 46023 is considered homogeneous, and its optimal regional distribution is Pearson III; the test is conducted by a simulation process. The regional quantile is compared with the at-site quantile, and it is shown that regional frequency analysis can use neighborhood information efficiently to give a reasonable estimate for a site without enough historical data.
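A minimal sketch of the index-flood logic behind L-moment regional frequency analysis follows. It is not the paper's code or data: the sites, record lengths and L-moment ratios are invented, and a GEV growth curve is fitted for illustration even though the study identified Pearson III as the optimal regional distribution. The GEV formulas are Hosking's standard L-moment estimators and quantile function.

    import numpy as np
    from math import gamma, log

    def gev_from_lmoments(l1, l2, t3):
        """GEV parameters (xi, alpha, k) from L-moments, using Hosking's
        rational approximation for the shape parameter k."""
        c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
        k = 7.8590 * c + 2.9554 * c ** 2
        alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma(1.0 + k))
        xi = l1 - alpha * (1.0 - gamma(1.0 + k)) / k
        return xi, alpha, k

    def gev_quantile(F, xi, alpha, k):
        return xi + alpha * (1.0 - (-np.log(F)) ** k) / k

    # Hypothetical homogeneous region: record lengths, at-site means (index values),
    # at-site L-CV (t) and L-skewness (t3); all numbers are invented.
    n    = np.array([32, 28, 41, 35, 30])
    mean = np.array([2.1, 2.4, 1.9, 2.2, 2.0])   # at-site mean significant wave height (m)
    t    = np.array([0.18, 0.21, 0.17, 0.19, 0.20])
    t3   = np.array([0.12, 0.15, 0.10, 0.13, 0.14])

    # Record-length-weighted regional average L-moment ratios (index-flood method)
    tR  = np.sum(n * t)  / n.sum()
    t3R = np.sum(n * t3) / n.sum()

    # Regional growth curve fitted to data rescaled to a regional mean of 1
    xi, alpha, k = gev_from_lmoments(1.0, tR, t3R)
    growth_100 = gev_quantile(1.0 - 1.0 / 100.0, xi, alpha, k)  # 100-year growth factor
    print(growth_100, mean * growth_100)                         # regional quantile at each site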
Abstract: Changes in the rainfall pattern are a challenge for the filling schedule of a reservoir that must satisfy various demands. In monsoon-fed reservoirs, the target is to attain full reservoir capacity in order to meet the various demands of the non-monsoon period and to provide flood control. Planners therefore watch the inflow trend and the prospective frequency of rainfall in order to counter extreme events. In this study, the Hirakud reservoir in the Mahanadi basin of India is considered, as this reservoir meets various demands as well as controlling devastating floods. The inflow trend is detected using the Mann-Kendall test, and the frequency analysis of monthly rainfall is carried out with an L-moment program to finalize a regional distribution. A falling trend in inflow to the reservoir is observed in July and August. The Wakeby distribution is found suitable for the monthly rainfall of July, September and October, whereas for June and August the Generalized Extreme Value (GEV), Generalized Normal (GN) and Pearson Type-III (PT-III) distributions are found suitable. The regional growth factors for the 20-, 40-, 50- and 100-year return period rainfalls, together with the inflow to the reservoir observed from 1958 to 2010, are calculated in this study as a reference for reservoir operation policy.
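The trend-detection step can be illustrated with a basic Mann-Kendall test. This is a minimal sketch (invented inflow numbers, no correction for ties or serial correlation), not the study's implementation.

    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Two-sided Mann-Kendall trend test (no tie correction):
        returns the S statistic, the normal-approximation Z score and the p-value."""
        x = np.asarray(x, float)
        n = x.size
        s = 0.0
        for i in range(n - 1):
            s += np.sum(np.sign(x[i + 1:] - x[i]))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        return s, z, p

    # Hypothetical July inflow series showing a decline
    inflow = np.array([48, 52, 45, 44, 41, 43, 38, 36, 37, 33, 31, 30], float)
    print(mann_kendall(inflow))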
Funding: The authors thank the Deanship of Scientific Research at King Khalid University, Kingdom of Saudi Arabia, for funding this study through the research groups program under Project Number R.G.P.1/64/42. Ishfaq Ahmad and Ibrahim Mufrah Almanjahie received the grant.
Abstract: Variance is one of the most vital measures of dispersion and is widely employed in practice. A commonly used approach to variance estimation is the traditional method of moments, which is strongly influenced by the presence of extreme values, so its results cannot be relied on. Building on Koyuncu's recent work, the present paper first proposes two classes of variance estimators based on linear moments (L-moments), and then employs them with auxiliary data under double stratified sampling to introduce a new class of calibration variance estimators using important properties of L-moments (L-location, L-CV, L-variance). Three populations are considered to assess the efficiency of the new estimators: the first and second use artificial data, and the third uses real data. The percentage relative efficiency of the proposed estimators over existing ones is evaluated. In the presence of extreme values, the findings show the superiority and high efficiency of the proposed classes over the traditional classes. Hence, when auxiliary data are available along with extreme values, the proposed classes of estimators may be implemented in a wide variety of sample surveys.
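The percentage relative efficiency (PRE) used in both variance-estimation abstracts can be approximated by simulation. The sketch below is only a generic illustration of the PRE computation; it uses simple random sampling and two simple estimators rather than the authors' calibration estimators under double stratified sampling, and all data are artificial.

    import numpy as np

    def pre_by_simulation(population, estimator_new, estimator_base,
                          n=50, reps=2000, seed=0):
        """PRE of one variance estimator over another, approximated by repeated
        simple random sampling; the finite-population variance is the target."""
        rng = np.random.default_rng(seed)
        true_var = np.var(population, ddof=1)
        err_new, err_base = [], []
        for _ in range(reps):
            sample = rng.choice(population, size=n, replace=False)
            err_new.append((estimator_new(sample) - true_var) ** 2)
            err_base.append((estimator_base(sample) - true_var) ** 2)
        return 100.0 * np.mean(err_base) / np.mean(err_new)

    # Toy comparison: the usual sample variance vs. an interquartile-range-based alternative
    usual = lambda s: np.var(s, ddof=1)
    iqr_based = lambda s: ((np.percentile(s, 75) - np.percentile(s, 25)) / 1.349) ** 2
    population = np.concatenate([np.random.default_rng(1).normal(50, 5, 990),
                                 np.full(10, 200.0)])   # a few extreme values
    print(pre_by_simulation(population, iqr_based, usual))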
Abstract: A bandwidth-based nonparametric method and an AR-GARCH model are used to model the conditional mean and conditional volatility of the time series and to obtain the standardized residual series. L-moments and MLE (maximum likelihood estimation) are then used to estimate the GPD parameters of the tail of the standardized residuals, and the risk measures VaR (Value at Risk) and ES (Expected Shortfall) are obtained by experimental methods. Finally, back-testing is used to check the accuracy of the risk measurements. The results show that the bandwidth-based nonparametric estimation model is more reliable than GARCH-family models in measuring ES, and that the risk measurement model based on the nonparametric model and L-moments can effectively measure the dynamic VaR and ES of the Shanghai and Shenzhen stock markets.
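A minimal peaks-over-threshold sketch of the GPD-based VaR and ES computation follows. It is not the paper's model: the AR-GARCH and nonparametric filtering steps are omitted, the GPD tail is fitted by maximum likelihood only (the paper also uses L-moments), and the loss series is simulated.

    import numpy as np
    from scipy.stats import genpareto

    def pot_var_es(losses, u, q=0.99):
        """Peaks-over-threshold VaR and ES: fit a GPD to exceedances over the
        threshold u by maximum likelihood, then apply the standard POT formulas."""
        losses = np.asarray(losses, float)
        exc = losses[losses > u] - u
        xi, loc, beta = genpareto.fit(exc, floc=0)   # shape, location (fixed at 0), scale
        zeta = exc.size / losses.size                # empirical tail fraction P(L > u)
        var_q = u + beta / xi * (((1 - q) / zeta) ** (-xi) - 1)
        es_q = (var_q + beta - xi * u) / (1 - xi)    # valid for xi < 1
        return var_q, es_q

    # Simulated heavy-tailed losses standing in for AR-GARCH standardized residuals
    rng = np.random.default_rng(3)
    losses = rng.standard_t(df=4, size=3000)
    u = np.quantile(losses, 0.95)
    print(pot_var_es(losses, u, q=0.99))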