
Complex-Valued BP Neural Network Method Based on Big Data Analysis (Cited by: 1)

Picking up Complex Valued BP Neural Network Based on Big Data Analysis
Abstract: Neural network analysis has become a highly successful approach to big data analysis and is well known in both academia and industry. Compared with traditional methods, neural networks can use data-driven techniques to extract information from data automatically, and they hold a clear advantage for big data whose structure is unclear or that spans multiple domains. Methods: A complex-valued BP neural network is used to extract and abstract features from the data and to process heterogeneous big data drawn from a large number of sources; the advantage of this approach is that it can extract correlated information among static data. Results: The complex-valued BP neural network method is also well suited to extracting the temporal characteristics of information, thereby strengthening the analysis and prediction of big data. Conclusion: A neural network method based on big data analysis can exploit the spatio-temporal correlations in the data to realize big data prediction, addressing a core problem in big data analysis and providing a scientific basis for big data analysis research.
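The abstract describes the complex-valued BP (back-propagation) network only at a high level. The sketch below is a minimal, illustrative NumPy implementation of a split-complex BP network and is not the authors' code: the split tanh activation, the two-layer sizes, the learning rate, the class name ComplexBPNet, and the toy data are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def split_tanh(z):
    # Split activation: tanh applied to the real and imaginary parts separately.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def split_tanh_grad(z):
    # Element-wise derivative of the split activation, stored as a real/imag pair.
    return (1.0 - np.tanh(z.real) ** 2) + 1j * (1.0 - np.tanh(z.imag) ** 2)

class ComplexBPNet:
    # Hypothetical two-layer split-complex BP network, for illustration only.
    def __init__(self, n_in, n_hidden, n_out, lr=0.05):
        scale = 0.1
        self.W1 = scale * (rng.standard_normal((n_in, n_hidden)) + 1j * rng.standard_normal((n_in, n_hidden)))
        self.W2 = scale * (rng.standard_normal((n_hidden, n_out)) + 1j * rng.standard_normal((n_hidden, n_out)))
        self.lr = lr

    def forward(self, X):
        self.z1 = X @ self.W1
        self.h = split_tanh(self.z1)
        self.z2 = self.h @ self.W2
        self.y = split_tanh(self.z2)
        return self.y

    def backward(self, X, target):
        # Squared error on |y - target|; gradients flow through real and imaginary parts separately.
        err = self.y - target
        g2 = split_tanh_grad(self.z2)
        delta2 = err.real * g2.real + 1j * (err.imag * g2.imag)
        g1 = split_tanh_grad(self.z1)
        back = delta2 @ np.conj(self.W2).T
        delta1 = back.real * g1.real + 1j * (back.imag * g1.imag)
        # Gradient steps use the conjugate of the layer inputs, as in complex LMS-style updates.
        self.W2 -= self.lr * np.conj(self.h).T @ delta2
        self.W1 -= self.lr * np.conj(X).T @ delta1
        return float(np.mean(np.abs(err) ** 2))

# Toy usage: fit a random complex input-output mapping.
net = ComplexBPNet(n_in=4, n_hidden=8, n_out=2)
X = rng.standard_normal((32, 4)) + 1j * rng.standard_normal((32, 4))
T = split_tanh(X @ (rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))))
for epoch in range(200):
    net.forward(X)
    loss = net.backward(X, T)
print("final mean squared error:", loss)

The split activation keeps each partial derivative real-valued, which is a common way to work around the fact that a bounded, non-constant activation cannot be complex-differentiable everywhere.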
Authors: 孙慧婷 (SUN Hui-ting); 闫磊 (YAN Lei), Department of Information Engineering, Bozhou Vocational and Technical College, Bozhou, Anhui 236800, China
Source: 《佳木斯大学学报(自然科学版)》 (Journal of Jiamusi University: Natural Science Edition), CAS, 2018, No. 4, pp. 543-546 (4 pages)
Funding: Key Natural Science Research Projects of the Anhui Provincial Department of Education (KJ2018A0881, KJ2018A0887); college-level projects (BYK1713, BYK1508)
Keywords: big data; complex-valued BP neural network; data analysis; prediction

