Abstract: GENERAL INFORMATION — Neural Regeneration Research (NRR; ISSN 1673-5374) is an open-access (www.nrronline.org), peer-reviewed international journal focusing exclusively on the field of neural regeneration research, with 36 issues published per year. NRR is devoted to publishing basic research, translational medicine and randomized clinical trial papers, as well as prospective reviews written by invited experts and academic discussion papers in the field of neural regeneration. NRR aims to publish timely, innovative and creative basic and clinical research of the highest standard in neural regeneration research. NRR covers a wide range of topics in neural regeneration, including brain, spinal cord and peripheral nerve injury, traditional Chinese medicine, acupuncture and moxibustion, stem cells, tissue engineering, inflammation, glial scar, gene therapy, biological factors, neurorehabilitation, neuroimaging, neurodegenerative diseases, neuroplasticity and neurogenesis.
Abstract: To address the fault-location problem in distribution networks, this paper proposes a location model based on the error back-propagation (BP) neural network, a structurally simple and highly adaptable member of the artificial neural network (ANN) family. A BP network model is built, and both the trained BP model and a BP model improved by a cloud genetic algorithm are applied to the same simple distribution network system, performing feature extraction and pattern recognition on the reflection information of each branch. Comparing the training curves and diagnostic accuracy of the two algorithms demonstrates the efficiency and accuracy of the optimized algorithm; the actual diagnostic outputs are then determined, achieving identification and precise location of the faulted branch.
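The BP training loop underlying the model above can be sketched as follows. This is a minimal illustration only: the network size, the synthetic "reflection feature" vectors, and the three hypothetical branch labels are assumptions for demonstration, not the paper's actual data, and the cloud-genetic-algorithm improvement is not shown.

```python
import numpy as np

# Minimal BP (error back-propagation) sketch for fault-branch classification.
# All data here is synthetic; a real model would use measured reflection
# features per feeder branch.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy training set: 4-dimensional feature vectors, one-hot labels over
# 3 hypothetical feeder branches.
X = rng.normal(size=(30, 4))
branch = rng.integers(0, 3, size=30)
Y = np.eye(3)[branch]

# One hidden layer of 8 units.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 3)); b2 = np.zeros(3)

lr = 0.3
losses = []
for epoch in range(2000):
    H = sigmoid(X @ W1 + b1)           # forward pass, hidden layer
    O = sigmoid(H @ W2 + b2)           # forward pass, output layer
    losses.append(np.mean((O - Y) ** 2))
    dO = (O - Y) * O * (1 - O)         # output-layer delta
    dH = (dO @ W2.T) * H * (1 - H)     # error back-propagated to hidden layer
    W2 -= lr * H.T @ dO / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

print("MSE start -> end:", losses[0], "->", losses[-1])
```

The falling loss curve is exactly the "training curve" the abstract compares between the plain BP model and the genetically optimized one.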
Funding: This work is supported by the Laboratory of Management, Decision and Information Systems, Academia Sinica.
Abstract: This paper proposes compensating methods for feedforward neural networks (FNNs) that are very difficult to train by traditional back-propagation (BP) methods. For an FNN trapped in a local minimum, the compensating methods correct the wrong outputs one by one until all outputs are correct, at which point the network is located at a global optimum. One hidden neuron is added to compensate a binary-input three-layer FNN trapped in a local minimum, and one or two hidden neurons are added to compensate a real-input three-layer FNN. For an FNN with more than three layers, the second-to-last hidden layer is temporarily treated as the input layer during compensation, so the same methods can be applied.
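The key ingredient of the binary-input compensation above can be sketched as follows: because the inputs are binary, a single threshold neuron can be tuned to fire on exactly one pattern, so attaching it with a suitable output weight corrects that one wrong output without disturbing the others. The weight construction below is an illustrative assumption, not necessarily the paper's exact formulas.

```python
import numpy as np

# Sketch of a compensating hidden neuron for a binary-input three-layer FNN.
# Suppose the trained network gets the wrong output on the single pattern
# x_bad; we build an indicator neuron that fires only on x_bad.

def step(z):
    return 1.0 if z >= 0 else 0.0

x_bad = np.array([1.0, 0.0, 1.0])   # hypothetical misclassified pattern

# Indicator construction: weight +1 where x_bad has a 1, -1 where it has a 0,
# threshold just below the number of ones. Then w.x - b >= 0 holds iff every
# bit of x matches x_bad.
w = 2 * x_bad - 1
b = x_bad.sum() - 0.5

def indicator(x):
    return step(w @ x - b)

# Verify over all 8 binary patterns: the neuron fires only on x_bad, so a
# large output weight on it can flip that one output to the correct side.
patterns = [np.array(p, dtype=float) for p in
            [(0,0,0), (0,0,1), (0,1,0), (0,1,1),
             (1,0,0), (1,0,1), (1,1,0), (1,1,1)]]
fires = [indicator(p) for p in patterns]
print(fires)  # 1.0 only at (1,0,1)
```

Repeating this construction pattern-by-pattern is what lets the method correct wrong outputs one at a time, as the abstract describes.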