Abstract: To address the drawbacks of the traditional Parker test in multivariate linear models, namely that the procedure is cumbersome and computationally intensive, we propose a new heteroskedasticity test. The new test uses the fitted values of the samples as new explanatory variables to reconstruct the regression model, and detects heteroskedasticity through a significance test of the resulting coefficients. It is also compared with an existing Parker test improved using the principal component idea. Numerical simulations and empirical analyses show that the improved Parker test based on the fitted values of the samples proposed in this paper is superior.
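The core idea, an auxiliary regression on the fitted values followed by a coefficient significance test, can be sketched as follows. This is a minimal illustration assuming a Park-type log-variance auxiliary regression, not the authors' exact statistic; the simulated design and all parameter values are invented for the demonstration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -1.0])
mean_y = X @ beta
# heteroskedastic errors: log-variance rises linearly with the mean response
y = mean_y + rng.normal(size=n) * 0.5 * np.exp(0.25 * mean_y)

# stage 1: ordinary least squares fit
bhat, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ bhat
resid = y - yhat

# stage 2: regress log squared residuals on the fitted values,
# then t-test the slope; a significant slope signals heteroskedasticity
Z = np.column_stack([np.ones(n), yhat])
g, *_ = np.linalg.lstsq(Z, np.log(resid**2), rcond=None)
u = np.log(resid**2) - Z @ g
s2 = u @ u / (n - 2)
se = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
t_stat = g[1] / se
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
print(p_value < 0.05)  # heteroskedasticity is detected in this design
```

Because the fitted values compress all regressors into a single index, the auxiliary regression stays low-dimensional regardless of how many explanatory variables the model has, which is what removes the computational burden of running one Parker-type regression per regressor.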
Abstract: The presence of heteroskedasticity in a regression model may bias the standard deviations of the parameters obtained by the Ordinary Least Squares (OLS) method. In this case, several hypothesis tests on the model under consideration may be biased, for example, Chow's coefficient stability test (or structural change test), Student's t-test, and Fisher's F-test. Most heteroskedasticity tests in the literature are based on the comparison of variances. Although many tests of equality of coefficients of variation (CVs) have appeared in the literature, to our knowledge the first and only use of the coefficient of variation to detect heteroskedasticity was offered by Li and Yao in 2017. This paper therefore offers an approach to detecting heteroskedasticity through a test of equality of coefficients of variation. A Monte Carlo study of robustness and performance suggests that our method compares favorably with some existing tests in the literature. The results of this study contribute to the exploitation of the CV as a statistical measure of dispersion. They help technicians and economists to better verify their hypotheses before making a scientific decision in forecasting, in order to contribute effectively to the economic and sustainable development of a company or enterprise.
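The paper's CV-equality statistic is not reproduced here; as a point of reference, the following sketch implements one of the standard variance-comparison tests from the literature that such a method would be benchmarked against, the Breusch-Pagan LM test, on a simulated model whose error variance rises with the regressor (all numbers are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(1, 5, n)
y = 2 + 3 * x + rng.normal(scale=0.4 * x, size=n)  # error sd rises with x

X = np.column_stack([np.ones(n), x])
bhat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ bhat

# Breusch-Pagan: regress squared residuals on the regressors;
# LM = n * R^2 is chi-square with (k - 1) dof under homoskedasticity
e2 = resid**2
g, *_ = np.linalg.lstsq(X, e2, rcond=None)
ss_res = np.sum((e2 - X @ g) ** 2)
ss_tot = np.sum((e2 - e2.mean()) ** 2)
lm = n * (1 - ss_res / ss_tot)
p_value = stats.chi2.sf(lm, df=1)
print(p_value < 0.05)  # strong heteroskedasticity, so the test rejects
```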
Abstract: Unlike the height-diameter equations for standing trees commonly used in forest resources modelling, tree height models for cut-to-length (CTL) stems tend to produce prediction errors whose distributions are not conditionally normal but rather leptokurtic and heavy-tailed. This feature was merely noticed in previous studies but never thoroughly investigated. This study characterized the prediction error distribution of such a newly developed tree height model for Pinus radiata (D. Don) through the three-parameter Burr Type XII (BXII) distribution. The model's prediction errors (ε) exhibited heteroskedasticity conditional mainly on the small-end relative diameter of the top log and, to a minor extent, on DBH. Structured serial correlations were also present in the data. A total of 14 candidate weighting functions were compared to select the best two for weighting ε in order to reduce its conditional heteroskedasticity. The weighted prediction errors (εw) were shifted by a constant into the positive range supported by the BXII distribution. The distribution of the weighted and shifted prediction errors (εw+) was then characterized by the BXII distribution using maximum likelihood estimation through 1000 repetitions of random sampling, fitting and goodness-of-fit testing, each time taking only one randomly selected observation from each tree to circumvent the potential adverse impact of serial correlation in the data on parameter estimation and inference. The nonparametric two-sample Kolmogorov-Smirnov (KS) goodness-of-fit test and the closely related Kuiper (KU) test showed that the fitted BXII distributions provided a good fit to the highly leptokurtic and heavy-tailed distribution of ε. Random samples generated from the fitted BXII distributions of εw+ derived from the best two weighting functions, when back-shifted and unweighted, exhibited distributions that were not statistically different from the distribution of ε in about 97% and 95% of the 1000 cases, respectively. Our results for cut-to-length P. radiata stems represent the first case for any tree species in which a non-normal error distribution in tree height prediction was described by an underlying probability distribution. The fitted BXII prediction error distribution will help unlock the full potential of the new tree height model in forest resources modelling of P. radiata plantations, particularly when uncertainty assessments, statistical inferences and error propagation are needed in research and practical applications through harvester data analytics.
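The shift-fit-test loop described above can be sketched with SciPy's `burr12` distribution. In this illustration a Student's t sample merely stands in for the weighted prediction errors, and the shift constant is chosen ad hoc rather than as in the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# stand-in for weighted prediction errors: leptokurtic, heavy-tailed
eps_w = stats.t.rvs(df=4, size=2000, random_state=rng)
shift = -eps_w.min() + 0.1       # shift into the positive support of Burr XII
eps_w_plus = eps_w + shift

# three-parameter Burr XII fit: two shape parameters and a scale,
# with the location fixed at zero
c, d, loc, scale = stats.burr12.fit(eps_w_plus, floc=0)

# goodness of fit: one-sample Kolmogorov-Smirnov test against the fitted CDF
ks = stats.kstest(eps_w_plus, stats.burr12(c, d, loc=loc, scale=scale).cdf)
print(ks.statistic)
```

The study repeats such a fit 1000 times on one-observation-per-tree subsamples; the single fit above only shows the mechanics of one repetition.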
Funding: The National Natural Science Foundation of China (No. 51108079)
Abstract: To improve the forecasting reliability of travel time, the time-varying confidence interval of travel time on arterials is forecasted using an autoregressive integrated moving average and generalized autoregressive conditional heteroskedasticity (ARIMA-GARCH) model, in which the ARIMA model serves as the mean equation of the GARCH model to capture travel time levels, and the GARCH model captures the conditional variances of travel time. The proposed method is validated and evaluated using actual traffic flow data collected from the traffic monitoring system of Kunshan city. The evaluation results show that, compared with the conventional ARIMA model, the proposed model does not significantly improve the forecasting of travel time levels but has an advantage in forecasting travel time volatility. The proposed model captures the heteroskedasticity of travel time well and forecasts time-varying confidence intervals that better reflect the volatility of observed travel times than the fixed confidence interval provided by the ARIMA model.
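The variance side of such an ARIMA-GARCH scheme reduces to the GARCH(1,1) recursion for the conditional variance, from which the time-varying interval follows directly. The sketch below uses illustrative, not estimated, parameters and a zero mean equation in place of the ARIMA component:

```python
import numpy as np

rng = np.random.default_rng(3)

# GARCH(1,1) innovations standing in for travel-time residuals;
# omega/alpha/beta are illustrative values, not estimates
omega, alpha, beta = 0.1, 0.1, 0.8
n = 1000
h = np.empty(n)                    # conditional variances
e = np.empty(n)                    # mean-equation residuals
h[0] = omega / (1 - alpha - beta)  # unconditional variance
e[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, n):
    h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
    e[t] = np.sqrt(h[t]) * rng.standard_normal()

# time-varying 95% interval around the (here: zero) conditional mean
upper = 1.96 * np.sqrt(h)
lower = -1.96 * np.sqrt(h)
coverage = np.mean((e >= lower) & (e <= upper))
print(round(coverage, 3))  # should be near 0.95
```

In the paper's setting, `e` would be the residuals of the ARIMA mean equation and the interval would be centered on the ARIMA forecast rather than on zero.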
Funding: Supported by the National Natural Science Foundation of China (71631004, 71571152), the Fundamental Research Funds for the Central Universities (20720171002, 20720170090), and the Fok Ying-Tong Education Foundation (151084)
Abstract: This paper highlights some recent developments in testing the predictability of asset returns, focusing on linear mean regressions, quantile regressions, and nonlinear regression models. For these models, when the predictors are highly persistent and their innovations are contemporaneously correlated with the dependent variable, the ordinary least squares estimator has a finite-sample bias, and its limiting distribution depends on an unknown nuisance parameter that is not consistently estimable. Without correcting these issues, conventional test statistics suffer serious size distortion and generate misleading conclusions when testing the predictability of asset returns in real applications. In the past two decades, a sequence of studies has contributed to this subject and proposed various solutions, including, but not limited to, bias-correction procedures, the linear projection approach, the IVX filtering idea, variable addition approaches, the weighted empirical likelihood method, and the double-weight robust approach. To catch up with the fast-growing literature of the recent decade, we offer a selective overview of these methods. Finally, some future research topics are discussed, such as econometric theory for predictive regressions with structural changes, nonparametric predictive models, and predictive models under more general data settings.
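The finite-sample bias referred to above (often called the Stambaugh bias) is easy to reproduce by Monte Carlo: with a persistent predictor whose innovations are negatively correlated with the return innovations, the OLS slope is biased upward even when the true predictive slope is zero. All simulation settings below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
T, reps = 100, 2000
rho, corr_uv = 0.98, -0.9   # persistent predictor, correlated innovations
b_hats = np.empty(reps)
for r in range(reps):
    z = rng.multivariate_normal([0, 0], [[1, corr_uv], [corr_uv, 1]], size=T)
    u, v = z[:, 0], z[:, 1]
    x = np.empty(T)          # AR(1) predictor
    x[0] = v[0]
    for t in range(1, T):
        x[t] = rho * x[t - 1] + v[t]
    y = u[1:]                # true predictive slope is zero
    X = np.column_stack([np.ones(T - 1), x[:-1]])
    b_hats[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]
print(round(b_hats.mean(), 3))  # noticeably above zero: finite-sample bias
```

The correction methods listed in the abstract (bias correction, IVX filtering, variable addition, and so on) are all ways of restoring correct test size in exactly this configuration.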
Abstract: The aim of this paper is to provide some evidence on the relationships among the degree of financial integration, stock exchange markets, and the volatility of national market returns. The authors employ correlation and cluster analyses to investigate the impact of stock exchange consolidation on the volatility of market returns, in terms of the financial integration between the involved stock exchanges before and after the merger. Using the Generalized Autoregressive Conditional Heteroskedasticity (GARCH(1,1)) model, the authors test the change in volatility of the national stock exchange markets involved in the following stock exchange integration case studies: Euronext, Bolsas y Mercados Españoles (BME), and the Swedish-Finnish financial services company OMX. These three case studies are considered completed cases of market consolidation for which sufficient data are available to conduct the current research. Using daily national returns of the engaged European stock markets from 1995 to 2007, the paper investigates the influence of stock exchange consolidation on the volatility of national stock market returns. The results confirm a gradual decrease of volatility in each of the integrated stock markets. However, the extent of the decrease depends on the economic characteristics of each engaged market and its degree of integration with other financial services. The results of the correlation and cluster analyses confirm that stock operators had created significant non-official integration links through cross-memberships and cross-listings even before the consolidations. Thus, the mergers among stock exchanges can be considered rational consequences of the high internal co-movements between the involved markets. Furthermore, stock exchange markets with strong non-official integration links show an immediate decrease of volatility after the merger, whereas for others it takes several years before volatility decreases, as the markets must first reach full integration.
Abstract: This paper uses the estimation of the Self-Excited Multi Fractal (SEMF) model, which holds theoretical promise but has seen mixed results in practice, as a case study to explore the impact of distributional assumptions on the model-fitting process. For the SEMF model, this examination shows that incorporating reasonable distributional assumptions, including a non-zero mean and the leptokurtic Student's t distribution, can have a substantial impact on the estimation results and can mean the difference between parameter estimates that imply unstable and potentially explosive volatility dynamics and ones that describe more reasonable and realistic return dynamics. While the original SEMF specification yields unrealistic results for most of the series of financial returns to which it is applied, the results obtained after incorporating the Student's t distribution and a mean component into the specification suggest that the SEMF model is a reasonable model, implying realistic return behavior, for most, if not all, of the stock and index return series examined in this study. In addition, reflecting the sensitivity of the sample mean to the types of characteristics that the SEMF model is designed to capture, the results also illustrate the value of incorporating the mean component directly into the model and fitting it jointly with the other parameters, rather than simply centering the returns beforehand by subtracting the sample mean.
Abstract: In this paper, a general framework for large-scale modeling of macroeconomic and financial time series is introduced. The proposed approach is characterized by simplicity of implementation, performs well independently of the persistence and heteroskedasticity properties of the data, and accounts for common deterministic and stochastic factors. Monte Carlo results strongly support the proposed methodology, validating its use also for relatively small cross-sectional and temporal samples.
Funding: The authors thank the Guest Editors and the anonymous referees for their helpful and constructive comments. The authors also gratefully acknowledge partial financial support from the National Science Fund for Distinguished Young Scholars #71625001, the Natural Science Foundation of China grants #7, and the scholarship from the China Scholarship Council under Grant CSC N201706310023.
Abstract: To characterize heteroskedasticity, nonlinearity, and asymmetry in tail risk, this study investigates a class of conditional (dynamic) expectile models with partially varying coefficients, in which some coefficients are allowed to be constants while others are unknown functions of random variables. A three-stage estimation procedure is proposed to estimate both the parametric constant coefficients and the nonparametric functional coefficients. Their asymptotic properties are investigated in a time series context, together with a new, simple, and easily implemented test for the goodness of fit of the models and a bandwidth selector based on a newly defined cross-validatory estimate of the expected forecasting expectile errors. The proposed methodology is data-analytic and sufficiently flexible to analyze complex and multivariate nonlinear structures without suffering from the curse of dimensionality. Finally, the proposed model is illustrated on simulated data and applied to the daily data of the S&P 500 return series.
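For readers unfamiliar with expectiles: the τ-expectile minimizes an asymmetrically weighted squared loss, and for a plain sample it can be computed by iteratively reweighted means. This is only the unconditional building block, not the paper's varying-coefficient model:

```python
import numpy as np

def expectile(y, tau, n_iter=100):
    """tau-expectile by iteratively reweighted means (asymmetric least squares)."""
    mu = y.mean()
    for _ in range(n_iter):
        # observations above the current value get weight tau, others 1 - tau
        w = np.where(y > mu, tau, 1 - tau)
        mu = np.sum(w * y) / np.sum(w)
    return mu

rng = np.random.default_rng(5)
y = rng.standard_normal(100_000)

# the 0.5-expectile is the sample mean; larger tau moves into the right tail
print(expectile(y, 0.9) > expectile(y, 0.5) > expectile(y, 0.1))
```

Expectiles share the quantile's ability to describe tails asymmetrically while keeping a squared-error loss, which is what makes the least-squares-style three-stage estimation in the paper tractable.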
Abstract: The present paper studies China's national-level currency exposure since 2005, when the country adopted a new exchange rate regime allowing the renminbi (RMB) to move towards greater flexibility. Using generalized autoregressive conditional heteroskedasticity (GARCH) and constant conditional correlation GARCH (CCC-GARCH) methods to estimate augmented capital asset pricing models with orthogonalized stock returns, we find that China's equity indexes are significantly exposed to exchange rate movements. In a static setting, stock returns are strongly sensitive to movements of China's trade-weighted exchange rate and to the bilateral exchange rates other than the RMB/dollar rate. In a dynamic framework, however, exposure to all the bilateral currency pairs under examination is significant. The results indicate that under the new exchange rate regime, China's gradualist approach to moving towards greater exchange rate flexibility has kept exposure at a moderate level. However, we find evidence that in a dynamic setting the exposure of the RMB to the dollar and other major currencies is significant. For China, the challenge of managing currency risk exposure looms ever larger.
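The static exposure regression amounts to an augmented CAPM estimated by least squares. The sketch below uses simple OLS with homoskedastic errors rather than the GARCH/CCC-GARCH estimation of the paper, and all return series are simulated for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 1500                                  # roughly six years of daily returns
mkt = rng.normal(0, 0.01, n)              # market return factor
fx = rng.normal(0, 0.005, n)              # trade-weighted exchange-rate return
stock = 0.9 * mkt + 0.4 * fx + rng.normal(0, 0.01, n)

# augmented CAPM: stock return on market return and exchange-rate return;
# the coefficient on fx is the currency exposure
X = np.column_stack([np.ones(n), mkt, fx])
b, *_ = np.linalg.lstsq(X, stock, rcond=None)
resid = stock - X @ b
s2 = resid @ resid / (n - 3)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_fx = b[2] / se[2]
p_fx = 2 * stats.t.sf(abs(t_fx), df=n - 3)
print(p_fx < 0.05)  # the simulated exchange-rate exposure is detected
```

In the paper, the factor returns are first orthogonalized and the error variance is modeled with GARCH; the OLS t-test here only illustrates what "significant exposure" means in the static case.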