In this paper we propose an absolute error loss EB estimator for the parameter of one-sided truncation distribution families. Under some conditions we prove that the convergence rate of its Bayes risk is o(·), where 0 < λ, r ≤ 1, Mn ≤ ln ln n (for large n), and Mn → ∞ as n → ∞.
In the constant-stress accelerated life test, estimation issues are discussed for a generalized half-normal distribution under a log-linear life-stress model. The maximum likelihood estimates, together with a fixed-point iterative algorithm for the unknown parameters, are presented, and least squares estimates of the parameters are also proposed. Meanwhile, confidence intervals for the model parameters are constructed using asymptotic theory and the bootstrap technique. A numerical illustration is given to investigate the performance of the methods.
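For illustration, here is a minimal sketch of the least-squares step for a log-linear life-stress relation, log(θ_i) = a + b·s_i, fitted to per-level scale estimates. The stress levels and scale values below are invented stand-ins, not data from the paper.

```python
import numpy as np

stress = np.array([40.0, 55.0, 70.0, 85.0])        # stress levels (illustrative)
theta_hat = np.array([5200., 1900., 760., 310.])   # estimated scale at each level

# Least squares on the log-linear relation: log(theta) = a + b * s
A = np.column_stack([np.ones_like(stress), stress])
(a, b), *_ = np.linalg.lstsq(A, np.log(theta_hat), rcond=None)

# Extrapolate the scale parameter to a hypothetical use-level stress
s_use = 25.0
theta_use = np.exp(a + b * s_use)
print(f"a={a:.3f}, b={b:.4f}, predicted use-level scale={theta_use:.0f}")
```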
In a test of the weak equivalence principle (WEP) with a rotating torsion pendulum, it is important to estimate the amplitude of the modulation signal with high precision. We use a torsional filter to remove the free-oscillation signal and employ the correlation method to estimate the amplitude of the modulation signal. Data analysis of an experiment shows that the uncertainties of the amplitude components of the modulation signal obtained by the correlation method agree with those expected from white noise. The power spectral density of the modulation signal obtained by the correlation method is about one order of magnitude higher than the thermal noise limit. This indicates that the correlation method is an effective way to estimate the amplitude of the modulation signal and is instructive for conducting a high-accuracy WEP test.
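A minimal sketch of the correlation method follows: the in-phase and quadrature amplitudes of a modulation signal at a known angular frequency w are recovered by correlating the record with sin(wt) and cos(wt). The signal parameters are synthetic, not the experiment's values.

```python
import numpy as np

fs, T = 100.0, 2000.0                 # sampling rate (Hz) and record length (s)
t = np.arange(0.0, T, 1.0 / fs)
w = 2 * np.pi * 0.005                 # modulation angular frequency (rad/s)
rng = np.random.default_rng(0)
signal = (3e-3 * np.sin(w * t) + 1e-3 * np.cos(w * t)
          + 5e-3 * rng.standard_normal(t.size))   # synthetic record

# Correlation estimates over an integer number of periods:
# a_hat = (2/T) * integral s(t) sin(wt) dt, and the same with cos
a_hat = 2.0 * np.mean(signal * np.sin(w * t))
b_hat = 2.0 * np.mean(signal * np.cos(w * t))
print(a_hat, b_hat)   # should be close to 3e-3 and 1e-3
```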
Hypothesis testing analysis and unknown parameter estimation for both intermediate frequency (IF) and baseband GPS signal detection are given using the generalized likelihood ratio test (GLRT) approach, applying the model of a GPS signal in white Gaussian noise. It is proved that the test statistic follows a central or noncentral F distribution. It is also pointed out that the test statistic is nearly identical to a central or noncentral chi-squared distribution, because the processing samples in the GPS acquisition problem are large enough to be considered infinite. It is further proved that the probability of false alarm, the probability of detection and the threshold are affected strongly when the hypothesis test refers to the full pseudorandom noise (PRN) code phase and Doppler frequency search space rather than to each individual cell. The performance of the test statistic combined with noncoherent integration is also given.
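A minimal sketch of the thresholding step follows: the detection threshold is set from the central F distribution under the noise-only hypothesis, its large-sample chi-squared approximation is checked, and the per-cell false-alarm rate is tightened when the whole search space is tested. Degrees of freedom, false-alarm rate and grid size are illustrative choices, not values from the paper.

```python
from scipy.stats import f, chi2

p = 2            # signal parameters (e.g. in-phase/quadrature amplitudes)
N = 4000         # processing samples
pfa = 1e-3       # per-cell probability of false alarm

# GLRT threshold: statistic ~ F(p, N - p) under the noise-only hypothesis
thr_f = f.ppf(1.0 - pfa, p, N - p)

# For large N, the scaled statistic is close to chi-squared with p dof
thr_chi2 = chi2.ppf(1.0 - pfa, p) / p
print(thr_f, thr_chi2)   # nearly identical for large N

# Searching K cells raises the overall false-alarm rate; a Bonferroni-style
# per-cell rate keeps the global rate near the target
K = 1023 * 41    # e.g. code-phase bins times Doppler bins (illustrative)
thr_global = f.ppf(1.0 - pfa / K, p, N - p)
print(thr_global)
```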
In this paper, we construct a one-sided empirical Bayes (EB) test for the location parameter in the Gamma distribution by a nonparametric method. Under some mild conditions, we prove that the EB test is asymptotically optimal with rate of order O(n^(-δs/(2s+1))), where 1/2 ≤ δ < 1 and s > 1 is a given natural number. An example is also given to illustrate that the conditions of the main theorems are easily satisfied.
Background: Bivariate count data are commonly encountered in medicine, biology, engineering, epidemiology and many other applications. The Poisson distribution has been the model of choice for analyzing such data. In most cases mutual independence among the variables is assumed; however, this fails to take into account the correlation between the outcomes of interest. A special bivariate form of the multivariate Lagrange family of distributions, named the Generalized Bivariate Poisson Distribution, is considered in this paper. Objectives: We estimate the model parameters using the method of maximum likelihood and show that the model fits the count variables representing components of metabolic syndrome in spousal pairs. We use the local likelihood score to test the significance of the correlation between the counts. We also construct confidence intervals on the ratio of the two correlated Poisson means. Methods: Based on a random sample of pairs of count data, we show that the score test of independence is locally most powerful. We also provide a formula for sample size estimation for a given level of significance and given power. The confidence intervals on the ratio of correlated Poisson means are constructed using the delta method, Fieller's theorem, and the nonparametric bootstrap. We illustrate the methodologies on metabolic syndrome data collected from 4000 spousal pairs. Results: The bivariate Poisson model fitted the metabolic syndrome data quite satisfactorily. Moreover, the three methods of confidence interval estimation gave almost identical intervals of nearly the same width.
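Of the three interval methods, the nonparametric bootstrap is the easiest to sketch: pairs (not individuals) are resampled so the within-pair correlation is preserved. The synthetic correlated counts below are illustrative, not the metabolic syndrome data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
shared = rng.poisson(0.5, n)                   # shared component induces correlation
x = shared + rng.poisson(1.0, n)               # e.g. first spouse's count
y = shared + rng.poisson(1.5, n)               # e.g. second spouse's count

B = 2000
ratios = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)                # resample pairs, not individuals
    ratios[b] = x[idx].mean() / y[idx].mean()

lo, hi = np.percentile(ratios, [2.5, 97.5])
print(f"ratio = {x.mean()/y.mean():.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```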
It remains challenging to effectively estimate the remaining capacity of the secondary lithium-ion batteries that have been widely adopted for consumer electronics, energy storage, and electric vehicles. Herein, by integrating regular real-time current short-pulse tests with a data-driven Gaussian process regression algorithm, an efficient battery estimation method has been developed and validated for batteries with capacity ranging from 100% of the state of health (SOH) to below 50%, reaching an average accuracy as high as 95%. Notably, the proposed pulse-test strategy for battery capacity measurement can reduce test time by more than 80% compared with regular long charge/discharge tests. Short-term features of the current pulse test were selected for an optimal training process. Data at different voltage stages and states of charge (SOC) were collected and explored to find the most suitable estimation model. In particular, we explore the validity of five different machine-learning methods for estimating capacity from pulse features, among which Gaussian process regression with a Matern kernel performs best, providing guidance for future exploration. This strategy of combining short pulse tests with machine-learning algorithms opens a window for efficiently forecasting the remaining capacity of lithium-ion batteries.
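A minimal sketch of the best-performing model class follows: Gaussian process regression with a Matern kernel in scikit-learn, mapping pulse-test features to SOH. The feature matrix and capacity labels are synthetic placeholders for the paper's pulse features and measured SOH.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(300, 4))            # pulse features (placeholders)
soh = 1.0 - 0.5 * X[:, 0] + 0.1 * X[:, 1] + 0.01 * rng.standard_normal(300)

X_tr, X_te, y_tr, y_te = train_test_split(X, soh, test_size=0.3, random_state=0)

# Matern kernel plus a white-noise term for measurement noise
kernel = Matern(length_scale=1.0, nu=1.5) + WhiteKernel(noise_level=1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_tr, y_tr)

pred, std = gpr.predict(X_te, return_std=True)      # mean and uncertainty
print("mean abs error:", np.mean(np.abs(pred - y_te)))
```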
Software testing has become a primary business for a number of IT services companies, and estimation, which remains a challenge in software development, is even more challenging in software testing. This paper presents an overview of the software test estimation techniques surveyed, as well as some of the challenges that must be overcome if the foundations of these techniques are to be improved.
The International Software Benchmarking and Standards Group (ISBSG) database was used to build models for estimating software functional test effort. Analysis of the data revealed three test productivity patterns representing economies or diseconomies of scale, and these patterns served as a basis for investigating the characteristics of the corresponding projects. Three groups of projects related to the three productivity patterns, characterized by domain, team size, elapsed time, and the rigor of verification and validation carried out during development, were found to be statistically significant. Within each project group, variations in test effort can be explained, in addition to functional size, by 1) the processes executed during development, and 2) the processes adopted for testing. Portfolios of estimation models were built using combinations of the three independent variables. The performance of estimation models built using the function point method innovated by the Common Software Measurement International Consortium (COSMIC), known as COSMIC Function Points, and the one advocated by the International Function Point Users Group (IFPUG), known as IFPUG Function Points, was compared to evaluate the impact of these respective sizing methods on test effort estimation.
In the software industry, a major problem encountered during project scheduling is deciding what proportion of the resources should be allocated to the testing phase. In general, it has been observed that about 40%-50% of the resources need to be allocated to testing. However, it is very difficult to predict the exact amount of effort required, and as a result project planning can go haywire. A project that has not been tested sufficiently can cause huge losses to an organization. This paper focuses on finding a method that gives a measure of the effort to be spent on the testing phase, and provides effort estimates during the pre-coding and post-coding phases using a neural network to predict more accurately.
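As one possible realization, here is a minimal sketch of a neural-network test-effort model using scikit-learn's MLPRegressor. The inputs (project attributes) and effort values are invented placeholders; the paper's actual feature set may differ.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.uniform(0, 1, size=(200, 3))                 # project attributes (placeholders)
effort = 100 + 400 * X[:, 0] + 150 * X[:, 1] ** 2 + 20 * rng.standard_normal(200)

model = make_pipeline(
    StandardScaler(),                                # neural nets need scaled inputs
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, effort)
print(model.predict(X[:3]))                          # predicted testing effort
```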
In order to obtain the life information of the vacuum fluorescent display (VFD) in a short time, a model of constant-stress accelerated life tests (CSALT) is established with the filament temperature increased, and four constant-stress tests are conducted. The Weibull function is applied to describe the life distribution of the VFD, and maximum likelihood estimation (MLE) with its iterative flow chart is used to calculate the shape and scale parameters. Furthermore, the accelerated life equation is determined by the least squares method, the Kolmogorov-Smirnov test is performed to verify whether the VFD life follows the Weibull distribution, and self-developed software is employed to predict the average life and the reliable life. Statistical data analysis results demonstrate that the test plans are feasible and versatile, that the VFD life follows the Weibull distribution, and that the VFD acceleration model satisfies the linear Arrhenius equation. The proposed method and the estimated life information of the VFD can provide significant guidance to its manufacturers and customers.
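A minimal sketch of Weibull MLE from complete failure-time data follows: the shape k solves sum(x^k ln x)/sum(x^k) - 1/k = mean(ln x), and the scale then follows in closed form. Instead of the paper's iterative flow chart, a root-finder is applied to the same likelihood equation; the lifetimes are synthetic, not VFD test data.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(3)
x = rng.weibull(2.5, 200) * 1000.0           # synthetic lifetimes
logx = np.log(x)

def score(k):
    # profile likelihood equation for the shape parameter (monotone in k)
    xk = x ** k
    return np.sum(xk * logx) / np.sum(xk) - 1.0 / k - logx.mean()

k_hat = brentq(score, 0.05, 50.0)            # unique root of the monotone score
scale_hat = np.mean(x ** k_hat) ** (1.0 / k_hat)   # MLE of scale given shape
print(f"shape ≈ {k_hat:.3f}, scale ≈ {scale_hat:.1f}")
```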
To estimate percentiles of a response distribution, the transformed response rule of Wetherill and the Robbins-Monro sequential design are proposed under a Log-Logistic model. Based on response data, a necessary and sufficient condition for the existence of the maximum likelihood estimators, together with the corresponding calculation formula, is presented. After a simulation study, the proposed approach was applied to the 65# detonator. Numerical results show that percentile estimators from the proposed approach are robust to the choice of parametric model when information on the original response distribution is lacking.
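A minimal sketch of the Robbins-Monro sequential design follows: after each binary response, the stimulus moves down on a response and up on a non-response, with a step proportional to 1/n. The Log-Logistic parameterization and all numeric values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 2.0, 0.3                       # assumed Log-Logistic parameters
p_target = 0.5                             # percentile of interest
c = 1.0                                    # step-size constant

def prob_response(x):
    # Log-Logistic response probability at stimulus level x > 0
    return 1.0 / (1.0 + np.exp(-(np.log(x) - mu) / sigma))

x = 10.0                                   # initial stimulus level
for n in range(1, 201):
    y = rng.random() < prob_response(x)      # run one trial at level x
    x = x - (c / n) * (float(y) - p_target)  # Robbins-Monro update
    x = max(x, 1e-6)                         # keep the level positive

print(f"estimated {100*p_target:.0f}th percentile level ≈ {x:.3f}")
# the true median of this Log-Logistic is exp(mu) ≈ 7.389
```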
Let fn be a non-parametric kernel density estimator based on a kernel function K and a sequence of independent and identically distributed random variables taking values in R. The goal of this article is to prove moderate deviations and large deviations for the statistic sup_x |fn(x) - fn(-x)|.
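For concreteness, here is a minimal sketch that evaluates the symmetry statistic sup_x |fn(x) - fn(-x)| on a grid, with a Gaussian kernel; the sample, kernel and bandwidth are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
sample = rng.normal(0.5, 1.0, 400)          # shifted normal: asymmetric about 0
n, h = sample.size, 0.3                     # bandwidth h (illustrative)

def fn(x):
    # kernel density estimate fn(x) = (1/(n h)) sum K((x - X_i)/h), Gaussian K
    u = (x[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

grid = np.linspace(0.0, 5.0, 1001)
stat = np.max(np.abs(fn(grid) - fn(-grid)))  # sup over the grid
print(f"sup |fn(x) - fn(-x)| ≈ {stat:.4f}")
```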
Proposed by the Swedish engineer and mathematician Ernst Hjalmar Waloddi Weibull (1887-1979), the Weibull distribution is a probability distribution that is widely used to model lifetime data. Because of its flexibility, several modifications of the Weibull distribution have been proposed by researchers to better fit non-monotonic shapes. This paper studies the performance of two such modifications: the exponentiated Weibull distribution and the additive Weibull distribution.
Ridge-type estimators are used to estimate regression parameters in a multiple linear regression model when multicollinearity exists among the predictor variables. When different estimators are available, a preliminary test estimation procedure is adopted to select a suitable estimator. In this paper, two such estimators, the Stochastic Restricted Liu Estimator and the Liu Estimator, are combined to define a new preliminary test estimator, namely the Preliminary Test Stochastic Restricted Liu Estimator (PTSRLE). The stochastic properties of the proposed estimator are derived, and the performance of PTSRLE is compared with SRLE in terms of the mean square error matrix (MSEM) and scalar mean square error (SMSE) for the two cases in which the stochastic restrictions are correct and incorrect. Moreover, the SMSE of PTSRLE based on the Wald (WA), Likelihood Ratio (LR) and Lagrangian Multiplier (LM) tests is derived, and the performance of PTSRLE under the three tests is compared as a function of the shrinkage parameter d with respect to SMSE. Finally, a numerical example is given to illustrate some of the theoretical findings.
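As background for the shrinkage parameter d, here is a minimal sketch of the ordinary Liu estimator that the restricted variants build on, beta_d = (X'X + I)^(-1) (X'X + d I) beta_OLS with 0 < d < 1, on deliberately collinear simulated data; the data and the choice d = 0.7 are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x1 = rng.standard_normal(n)
X = np.column_stack([x1,
                     x1 + 0.05 * rng.standard_normal(n),  # near-collinear column
                     rng.standard_normal(n)])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

XtX = X.T @ X
beta_ols = np.linalg.solve(XtX, X.T @ y)

d = 0.7                                            # shrinkage parameter
I = np.eye(X.shape[1])
beta_liu = np.linalg.solve(XtX + I, (XtX + d * I) @ beta_ols)
print("OLS:", beta_ols, "\nLiu:", beta_liu)
```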
The parameter estimation problem for an economic model called the Constantinides-Ingersoll model is investigated based on discrete observations. The Euler-Maruyama scheme and an iterative method are applied to obtain the joint conditional probability density function. The maximum likelihood technique is employed to obtain the parameter estimators, and explicit expressions for the estimation error are given. The strong consistency of the estimators is proved using the law of large numbers for martingales and the strong law of large numbers. The asymptotic normality of the estimation error for the diffusion parameter is obtained with the help of the strong law of large numbers and the central limit theorem. Simulations of the absolute error between the estimators and the true values are given, and hypothesis tests are performed to verify the effectiveness of the estimators.
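A minimal sketch of the general recipe follows: the Euler-Maruyama one-step transition of a diffusion dX = a(X; θ) dt + b(X; σ) dW is Gaussian, so a quasi-likelihood can be maximized over the discrete observations. The drift and diffusion below are generic placeholders, not the paper's exact Constantinides-Ingersoll specification.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x, dt):
    theta, sigma = params
    a = theta * x[:-1]              # placeholder drift a(x) = theta * x
    b = sigma * x[:-1]              # placeholder diffusion b(x) = sigma * x
    mean = x[:-1] + a * dt          # Euler-Maruyama one-step mean
    var = b**2 * dt                 # Euler-Maruyama one-step variance
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

# simulate a path with the same scheme, then recover the parameters
rng = np.random.default_rng(6)
dt, N, theta0, sigma0 = 1 / 252, 2000, 0.5, 0.2
x = np.empty(N); x[0] = 1.0
for i in range(N - 1):
    x[i+1] = x[i] + theta0 * x[i] * dt + sigma0 * x[i] * np.sqrt(dt) * rng.standard_normal()

res = minimize(neg_loglik, x0=[0.1, 0.1], args=(x, dt),
               bounds=[(-5, 5), (1e-4, 5)])
print(res.x)   # should be near (0.5, 0.2)
```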
In this paper we compare the recently developed preliminary test estimator called the Preliminary Test Stochastic Restricted Liu Estimator (PTSRLE) with the Ordinary Least Squares Estimator (OLSE) and the Mixed Estimator (ME) in the Mean Square Error Matrix (MSEM) sense for the two cases in which the stochastic restrictions are correct and incorrect. Finally, a numerical example and a Monte Carlo simulation study are presented to illustrate the theoretical findings.
Identifying underground utilities and predicting their depth are fundamental in civil engineering excavations, for example, to install or repair water, sewer, gas, electric and other systems. Accidental rupture of these systems can lead to unplanned repair costs, delays in completing the service, and risk of injury or death to workers. One way to detect underground utilities is the GPR (Ground Penetrating Radar) geophysical method. To estimate depth, the two-way travel time provided by a radargram is used in conjunction with the ground wave velocity, which depends on the dielectric constant of the materials and is usually assumed to be constant for the area under investigation. This procedure provides satisfactory results in most cases. However, wrong depth estimates can result in damage to public utilities, rupturing pipes, cutting lines and so on. Such cases occur mainly in areas with marked variation of water content and/or soil lithology, so greater care is required to determine the depth of the targets. The present work demonstrates how the interval velocity of Dix (1955) can be applied to a radargram to estimate the depth of underground utilities, compared with the conventional constant-velocity technique applied to the same data set. To accomplish this, synthetic and real GPR data were used to verify the applicability of the interval velocity technique and to determine the accuracy of the depth estimates obtained. The studies were carried out at the IAG/USP test site, a controlled environment where metallic drums are buried at known positions and depths, allowing comparison of real to estimated depths. Numerical studies were also carried out to simulate the real environment with variation of the dielectric constant in depth and to validate the results against real data. The results showed that target depths were estimated more accurately by the interval velocity technique than by the constant velocity technique, minimizing the risk of accidents during excavation.
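For reference, here is a minimal sketch of the Dix (1955) conversion from RMS velocities picked at two-way times t1 < t2 to the interval velocity of the layer between them, followed by the resulting reflector depths. The velocity picks and times are invented.

```python
import numpy as np

# RMS velocity picks (m/ns) at two-way travel times (ns) -- illustrative
t = np.array([12.0, 25.0, 40.0])
v_rms = np.array([0.10, 0.09, 0.08])

# Dix formula: v_int^2 = (v2^2 t2 - v1^2 t1) / (t2 - t1) for each interval
v_int = np.sqrt((v_rms[1:]**2 * t[1:] - v_rms[:-1]**2 * t[:-1]) / (t[1:] - t[:-1]))

# Depth: accumulate one-way thickness per interval; the first layer uses v_rms[0]
thickness = np.concatenate([[v_rms[0] * t[0] / 2.0],
                            v_int * (t[1:] - t[:-1]) / 2.0])
depth = np.cumsum(thickness)
print("interval velocities:", v_int, "\nreflector depths (m):", depth)
```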
Traditionally, it is widely accepted that measurement error obeys the normal distribution. However, in this paper a new idea is proposed: the error in digitized data, a major derived data source in GIS, does not obey the normal distribution but rather the p-norm distribution with a determinate parameter. Assuming that the error is random and has the same statistical properties, the probability density functions of the normal distribution, the Laplace distribution and the p-norm distribution are derived based on the arithmetic mean axiom, the median axiom and the p-median axiom respectively, which shows that the normal distribution is only one of these distributions and not the only one. Based on this idea, distribution fitness tests such as the skewness and kurtosis coefficient tests, the Pearson chi-square (χ2) test and the Kolmogorov test are conducted for digitized data. The results show that the error in map digitization obeys the p-norm distribution with a parameter close to 1.60. Least p-norm estimation and least squares estimation of digitized data are further analyzed, showing that least p-norm adjustment is better than least squares adjustment for digitized data processing in GIS.
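A minimal sketch of checking the p-norm hypothesis follows: scipy's generalized normal distribution gennorm has density proportional to exp(-|x/s|^beta), so the fitted shape beta plays the role of the p parameter. The data are simulated with p = 1.6 to mirror the paper's estimate, not real digitizing errors.

```python
from scipy.stats import gennorm, kstest

p_true = 1.6
errors = gennorm.rvs(p_true, loc=0.0, scale=1.0, size=2000, random_state=0)

beta, loc, scale = gennorm.fit(errors)          # MLE of shape, location, scale
print(f"fitted p ≈ {beta:.2f}")                 # should be near 1.6

# Kolmogorov-Smirnov test of the fitted p-norm model
stat, pval = kstest(errors, gennorm(beta, loc, scale).cdf)
print(f"KS statistic = {stat:.4f}, p-value = {pval:.3f}")
```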
In this paper, we explore the properties of a positive-part Stein-like estimator, a stochastically weighted convex combination of a fully correlated parameter model estimator and an uncorrelated parameter model estimator in the Random Parameters Logit (RPL) model. The results of our Monte Carlo experiments show that the positive-part Stein-like estimator provides smaller MSE than the pretest estimator in the fully correlated RPL model. Both outperform the fully correlated RPL model estimator and provide more accurate information on the share of the population placing a positive or negative value on the alternative attributes. The Monte Carlo mean estimates of direct elasticity with the pretest and positive-part Stein-like estimators are closer to the true value and have smaller standard errors than those with the fully correlated RPL model estimator.