Funding: Supported by the National Natural Science Foundation of China (Grant No. 10871220) and "Mathematics+X" of DLUT (Grant No. 842328).
Abstract: This paper studies the capability of incremental constructive feedforward neural networks (FNN) with random hidden units to approximate functions in L^2(R^d). Two kinds of three-layered feedforward neural networks are considered: radial basis function (RBF) neural networks, and translation and dilation invariant (TDI) neural networks. In contrast to conventional approximation theories for neural networks, which mainly rely on an existence approach, we follow a constructive approach to prove that one may simply choose the parameters of the hidden units at random and then adjust only the weights between the hidden units and the output unit to make the neural network approximate any function in L^2(R^d) to any accuracy. Our result shows that, given any non-zero activation function g : R^+ → R with g(||x||_{R^d}) ∈ L^2(R^d) for RBF hidden units, or any non-zero activation function g(x) ∈ L^2(R^d) for TDI hidden units, the incremental network function f_n with randomly generated hidden units converges with probability one to any target function in L^2(R^d) as the number of hidden units n → ∞, provided the weights between the hidden units and the output unit are properly adjusted.
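The incremental scheme above can be sketched numerically. The following minimal Python sketch assumes Gaussian RBF activations (a hypothetical choice; the theorem allows any non-zero g with g(||x||_{R^d}) ∈ L^2(R^d)) and a residual-projection rule for the new output weight, one natural way to "properly adjust" the weights that makes the discrete L^2 error non-increasing in n:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function in L^2(R), sampled on a grid (d = 1 for illustration).
xs = np.linspace(-5, 5, 1000)
target = np.exp(-xs**2) * np.sin(3 * xs)

def rbf(center, width):
    # Gaussian RBF hidden unit g(||x - c|| / s); the Gaussian is an
    # illustrative assumption, not the only admissible activation.
    return np.exp(-((xs - center) / width) ** 2)

f_n = np.zeros_like(xs)      # incremental network function f_n
residual = target.copy()     # e_n = f - f_n
errors = []
for n in range(200):
    # Randomly generate the new hidden unit's parameters.
    g = rbf(rng.uniform(-5, 5), rng.uniform(0.2, 2.0))
    # Adjust only the output weight: project the residual onto g,
    # w = <e_{n-1}, g> / ||g||^2, so the L^2 error never increases.
    w = (residual @ g) / (g @ g)
    f_n = f_n + w * g
    residual = target - f_n
    errors.append(np.sqrt(np.mean(residual ** 2)))  # discrete RMS error

print(f"RMS error after 1 unit: {errors[0]:.4f}, after 200: {errors[-1]:.4f}")
```

Because each step subtracts the orthogonal projection of the residual onto the new unit, the error sequence is monotone non-increasing, mirroring the convergence claim of the abstract.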
Abstract: In this paper, we illustrate the use and power of hidden Markov models in analyzing multivariate data over time. The data used in this study were obtained from the Organization for Economic Co-operation and Development (OECD.Stat database, https://stats.oecd.org/) and comprise monthly data on the employment rate of males and females in Canada and the United States (aged 15 years and over; seasonally adjusted, January 1995 to July 2018). Two different underlying patterns of trends in employment over the 23-year observation period were uncovered.
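To make the hidden-Markov-model step concrete, here is a self-contained Python sketch of decoding hidden regimes from a bivariate series standing in for the (male, female) employment rates. The two states, transition matrix, Gaussian emissions, and all parameter values are illustrative assumptions, not estimates fitted to the OECD data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-state HMM; states might represent two underlying
# employment-trend regimes such as "growth" vs. "stagnation".
A = np.array([[0.95, 0.05],        # state transition probabilities
              [0.10, 0.90]])
means = np.array([[70.0, 65.0],    # state 0: (male, female) rates, %
                  [62.0, 58.0]])   # state 1
sigma = 1.0                        # shared isotropic noise (assumption)

# Simulate T months of observations from the chain.
T = 300
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])
obs = means[states] + rng.normal(0, sigma, size=(T, 2))

def viterbi(obs, A, means, sigma, pi=np.array([0.5, 0.5])):
    """Most likely hidden-state path under Gaussian emissions (log domain)."""
    T, K = len(obs), len(A)
    # Per-observation log-likelihood under each state (up to a constant).
    ll = -np.sum((obs[:, None, :] - means[None]) ** 2, axis=2) / (2 * sigma**2)
    delta = np.log(pi) + ll[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)   # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + ll[t]
    path = np.zeros(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):            # backtrack the best path
        path[t] = back[t + 1, path[t + 1]]
    return path

decoded = viterbi(obs, A, means, sigma)
accuracy = (decoded == states).mean()
print(f"fraction of months assigned the true regime: {accuracy:.3f}")
```

In practice one would estimate the transition matrix and emission parameters from the employment series (e.g. via Baum–Welch) rather than fix them; the Viterbi decoding step shown here is what recovers the regime pattern over the observation period.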