Abstract: The novel information criterion (NIC) algorithm can find the principal subspace quickly, but it is not a true principal component analysis (PCA) algorithm and hence cannot find the orthonormal eigenspace corresponding to the principal components of the input vector. This defect limits its application in practice. By weighting the output of the NIC neural network, a modified novel information criterion (MNIC) algorithm is presented. MNIC extracts the principal components and the corresponding eigenvectors in a parallel online learning procedure, overcoming the defect of NIC. It is proved to have a single global optimum and a nonquadratic convergence rate, which is superior to conventional online PCA algorithms such as Oja and LMSER. The relationship among Oja, LMSER, and MNIC is exhibited. Simulations show that MNIC converges to the optimum quickly, confirming its validity.
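The abstract does not give the MNIC update rule itself, but the general idea it describes, weighting the network output so that a subspace rule converges to individual eigenvectors rather than an arbitrary rotated basis, can be illustrated with the classical weighted Oja subspace rule. The sketch below is such an illustration only; the function name `weighted_oja_step`, the weights `theta`, and the toy data are my own assumptions and do not reproduce the MNIC algorithm from the paper.

```python
import numpy as np

def weighted_oja_step(W, x, theta, lr=0.01):
    """One online update of a weighted Oja-style subspace rule (illustrative).

    W     : (n, p) current weight matrix (columns ~ eigenvector estimates)
    x     : (n,)   one zero-mean input sample
    theta : (p,)   distinct positive weights; the asymmetry they introduce
                   pins each column to an individual eigenvector instead of
                   an arbitrary rotation of the principal subspace
    """
    y = W.T @ x                                   # p-dimensional network output
    # Plain Oja subspace rule would be  x y^T - W y y^T  (subspace only).
    # Scaling the decay term of each output by theta breaks the rotational
    # symmetry, so the columns align with individual eigenvectors.
    return W + lr * (np.outer(x, y) - (W @ np.outer(y, y)) * theta)

# Toy usage: correlated 2-D data, extract both principal directions.
rng = np.random.default_rng(0)
C = np.array([[3.0, 1.0], [1.0, 2.0]])            # true covariance
X = rng.multivariate_normal(np.zeros(2), C, size=5000)
W = np.linalg.qr(rng.standard_normal((2, 2)))[0]  # random orthonormal start
theta = np.array([1.0, 1.5])                      # distinct output weights
for x in X:
    W = weighted_oja_step(W, x, theta, lr=0.005)
print(W / np.linalg.norm(W, axis=0))              # approx. eigenvectors of C (up to sign)
```

In this weighted rule the column norms settle near 1/sqrt(theta_i), so the columns are normalized before comparing them with the eigenvectors of the sample covariance; this is only a stand-in for the weighting idea, not the convergence analysis claimed for MNIC.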