Funding: Provided by the National Social Science Fund of China (Grant No. 22BJY259), the National Natural Science Foundation of China (Grant Nos. 71971004 and 72271055), and the project "Research on Modeling of Return Rate Based on Mixed Distribution and Its Application in Risk Management" (Grant No. 19YB26).
Abstract: This paper derives a new decomposition of stock returns using price extremes and proposes a conditional autoregressive shape (CARS) model with beta density to predict the direction of stock returns. The CARS model is continuously valued, which distinguishes it from binary classification models. An empirical study is performed on the US stock market, and the results show that the predictive power of the CARS model is not only statistically significant but also economically valuable. We also compare the CARS model with the probit model, and the results demonstrate that the proposed CARS model outperforms the probit model for return direction forecasting. The CARS model thus provides a new framework for return direction forecasting.
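The abstract does not spell out the decomposition or the CARS specification. As a rough, hypothetical illustration of the kind of construction it describes, one can record where the close lies between the period's extremes as a shape variable on [0, 1] and model that variable with a conditional beta density; the notation and the logit recursion below are assumptions for exposition, not the paper's definitions.

```latex
% Hypothetical sketch only: H_t, L_t, C_t are the period high, low, and close;
% r_t is the log return. The shape variable and the beta recursion are
% illustrative assumptions, not the authors' exact model.
\begin{align*}
  s_t &= \frac{\ln C_t - \ln L_t}{\ln H_t - \ln L_t} \in [0,1]
        && \text{(position of the close between the extremes)} \\
  r_t &= \ln\frac{C_t}{C_{t-1}}
       = \ln\frac{L_t}{C_{t-1}} + s_t \ln\frac{H_t}{L_t}
        && \text{(return written in terms of price extremes and shape)} \\
  s_t \mid \mathcal{F}_{t-1} &\sim \mathrm{Beta}\bigl(\mu_t\phi,\,(1-\mu_t)\phi\bigr),
  \quad
  \operatorname{logit}(\mu_t) = \omega + \alpha\,\operatorname{logit}(s_{t-1}) + \beta\,\operatorname{logit}(\mu_{t-1})
\end{align*}
```

The point of a formulation like this, consistent with the abstract, is that the forecast object is a continuous quantity on (0, 1) rather than a binary up/down label.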
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61825305 and U21A20518).
Abstract: In this paper, we propose a structural developmental neural network to address the plasticity-stability dilemma, computational inefficiency, and lack of prior knowledge in continual unsupervised learning. The model uses competitive learning rules and dynamic neurons with information saturation to achieve parameter adjustment and adaptive structure development. Dynamic neurons adjust their information saturation after winning the competition and use this parameter to modulate both the parameter adjustment and the timing of division. By dividing to generate new neurons, the network not only remains sensitive to novel features but can also subdivide classes that have been learned repeatedly. Dynamic neurons with information saturation and a division mechanism can simulate the long short-term memory of the human brain, which enables the network to continually learn new samples while maintaining previous learning results. The parent-child relationship between neurons arising from neuronal division enables the network to simulate the human cognitive process of gradually refining the perception of objects. By setting the clustering-layer parameter, users can choose the desired degree of class subdivision. Experimental results on artificial and real-world datasets demonstrate that the proposed model is feasible for unsupervised learning tasks in instance-increment and class-increment scenarios and outperforms prior structural developmental neural networks.
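The abstract describes the mechanism only verbally. The sketch below is a hypothetical, simplified rendering of that description in Python: a competitive layer of dynamic neurons whose information saturation damps learning (stability) and triggers division when it crosses a threshold (plasticity toward novel features). All class names, update rules, and thresholds are illustrative assumptions rather than the paper's equations.

```python
import numpy as np

# Hypothetical sketch of the ideas in the abstract: competitive learning with
# "dynamic neurons" that track an information-saturation level and divide when
# saturated. Names, update rules, and thresholds are illustrative assumptions.

class DynamicNeuron:
    def __init__(self, weight, saturation=0.0):
        self.weight = np.asarray(weight, dtype=float)  # prototype vector
        self.saturation = saturation                   # accumulated "information saturation"

    def update(self, x, base_lr=0.5, sat_gain=0.05):
        # Saturation damps the learning rate, protecting old knowledge (stability),
        # while a fresh (low-saturation) neuron adapts quickly (plasticity).
        lr = base_lr / (1.0 + self.saturation)
        self.weight += lr * (x - self.weight)
        self.saturation += sat_gain

    def should_divide(self, sat_threshold=1.0):
        return self.saturation >= sat_threshold

class DevelopmentalLayer:
    def __init__(self, dim):
        self.neurons = [DynamicNeuron(np.zeros(dim))]

    def learn(self, x):
        x = np.asarray(x, dtype=float)
        # Competition: the nearest prototype wins and is updated.
        winner = min(self.neurons, key=lambda n: np.linalg.norm(x - n.weight))
        winner.update(x)
        # Division: a saturated winner spawns a child near the new sample, so the
        # layer stays sensitive to novel features without erasing the parent.
        if winner.should_divide():
            self.neurons.append(DynamicNeuron(x.copy(), saturation=0.0))
            winner.saturation = 0.0  # illustrative reset after division

layer = DevelopmentalLayer(dim=2)
for sample in np.random.default_rng(0).normal(size=(200, 2)):
    layer.learn(sample)
print(f"number of neurons after learning: {len(layer.neurons)}")
```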
Funding: This research is supported by the National Natural Science Foundation of China under Grant No. 71401033 and the Program for Young Excellent Talents, UIBE, under Grant No. 15YQ08.
Abstract: By decomposing asset returns into potential maximum gain (PMG) and potential maximum loss (PML) using price extremes, this study empirically investigated the relationship between PMG and PML. We found significant asymmetry between the two: PML significantly contributed to forecasting PMG, but not vice versa. We further explored the power of this asymmetry for predicting asset returns and found that it could significantly improve asset return predictability in both in-sample and out-of-sample forecasting. Investors who incorporate this asymmetry into their investment decisions can obtain substantial utility gains. The asymmetry remains significant even after controlling for macroeconomic variables, technical indicators, market sentiment, and skewness. Moreover, the asymmetry was found to be quite general across different countries.
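The abstract does not state the exact definitions of PMG and PML. One plausible formalization using price extremes, given here only for concreteness, measures both against the previous close, so that the realized return is bounded between the two quantities; the notation is an assumption, not necessarily the authors' formulation.

```latex
% Hypothetical notation: H_t, L_t are the intraperiod high and low, C_{t-1} the
% previous close, r_t the log return. These definitions are for concreteness
% and need not match the paper's exact formulation.
\begin{align*}
  \mathrm{PMG}_t &= \ln\frac{H_t}{C_{t-1}} \quad \text{(largest gain attainable within period } t\text{)},\\
  \mathrm{PML}_t &= \ln\frac{L_t}{C_{t-1}} \quad \text{(largest loss attainable within period } t\text{)},\\
  \mathrm{PML}_t &\le r_t \le \mathrm{PMG}_t .
\end{align*}
```

Under a convention like this, the asymmetry reported in the abstract amounts to lagged PML carrying predictive content for PMG in a forecasting regression, while the reverse does not hold.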
Abstract: Because the number of clustering cores must be set before the K-means algorithm is run, this type of algorithm often fails in applications where data keep growing and their distribution characteristics change. This paper proposes an evolutionary algorithm, DCC, which can dynamically adjust the number of clustering cores as the data change. The DCC algorithm uses a Gaussian function as the activation function of each core. Each clustering core can adjust its center vector and coverage based on its response to the input data and its memory state, so as to better fit the sample clusters in the space. The DCC model can evolve from zero cores. After each new sample is added, the winning dynamic core can be adjusted or split through competitive learning, so that the number of clustering cores always maintains a good adaptation to the existing data. Furthermore, because its clustering cores can split, the algorithm can subdivide densely distributed data clusters. Detailed experimental results show that the evolutionary clustering algorithm DCC, based on the dynamic-core method, achieves excellent clustering performance and strong robustness.
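As with the previous abstract, the description is verbal. The following Python sketch is a hypothetical rendering of one DCC-style update step: Gaussian activations, a winning core that is either adjusted or prompts the creation of a new core, and a model that can start from zero cores. Thresholds, learning rates, and the creation rule are illustrative assumptions, not the algorithm's published equations.

```python
import numpy as np

# Hypothetical sketch of a dynamic-core clustering step in the spirit of the DCC
# description: each core keeps a center and a coverage (Gaussian width), the core
# with the highest Gaussian activation wins, and it is either adjusted or a new
# core is created for poorly covered samples. All names and thresholds are assumptions.

class DynamicCore:
    def __init__(self, center, sigma=1.0):
        self.center = np.asarray(center, dtype=float)  # center vector
        self.sigma = sigma                             # coverage (Gaussian width)
        self.count = 1                                 # simple memory state

    def activation(self, x):
        d2 = np.sum((x - self.center) ** 2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))   # Gaussian response

def dcc_step(cores, x, act_threshold=0.5, lr=0.2):
    x = np.asarray(x, dtype=float)
    if not cores:                      # the model can start from zero cores
        return [DynamicCore(x)]
    winner = max(cores, key=lambda c: c.activation(x))
    if winner.activation(x) >= act_threshold:
        # Adjust: pull the winning core toward the sample and update its memory.
        winner.center += lr * (x - winner.center)
        winner.count += 1
    else:
        # Create: the sample is poorly covered, so add a new core at its location.
        cores.append(DynamicCore(x, sigma=winner.sigma))
    return cores

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(3, 0.3, (100, 2))])
cores = []
for x in data:
    cores = dcc_step(cores, x)
print(f"cores discovered: {len(cores)}")
```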