
基于混核LSSVM的批特征风功率预测方法 (Cited by: 10)

Wind Power Prediction Method Using Hybrid Kernel LSSVM With Batch Feature
Abstract: For the wind power prediction problem in a wind farm, this paper collects related information such as historical wind power data, meteorological data, and real-time wind speed data sampled by an anemometer tower, and proposes a wind power prediction method with batch feature based on the hybrid kernel least squares support vector machine (HKLSSVM), which is used to establish the wind power prediction model of the wind farm. To enhance the model's adaptability, an improved differential evolution algorithm is designed to optimize the model parameters, and a sparse selection method is used to select an appropriate training sample set, which shortens the modeling time while guaranteeing the accuracy of the prediction model. According to the geographic distribution of the wind turbines in the wind farm, a modeling strategy based on batch partition is proposed: turbines at nearby locations are grouped into batches, replacing the traditional farm-level wind power prediction approach. The proposed model is tested on real data from a wind farm. Experimental results show that, compared with other prediction methods, the proposed method improves the accuracy and efficiency of wind power prediction and reduces the impact of wind power fluctuation on the power grid, thereby improving the grid's safety and reliability.
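
To make the method described above concrete, the following is a minimal sketch in Python, not the authors' implementation. It assumes the hybrid kernel is a convex combination K = w*K_RBF + (1 - w)*K_poly (a common hybrid-kernel choice), fits a standard LS-SVM regressor by solving its linear system, and tunes (gamma, w, sigma) with SciPy's stock differential evolution on toy data. The paper's improved DE variant, sparse sample selection, and geographic batch partitioning are not reproduced here.

# Minimal illustrative sketch (assumptions: hybrid kernel form, stock DE, toy data);
# not the implementation from the paper.
import numpy as np
from scipy.optimize import differential_evolution

def hybrid_kernel(X1, X2, w, sigma, degree=2):
    # Assumed hybrid form: K = w * K_RBF + (1 - w) * K_poly.
    sq_dist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    k_rbf = np.exp(-sq_dist / (2.0 * sigma ** 2))
    k_poly = (X1 @ X2.T + 1.0) ** degree
    return w * k_rbf + (1.0 - w) * k_poly

def lssvm_fit(X, y, gamma, w, sigma):
    # Solve the standard LS-SVM linear system for bias b and coefficients alpha.
    n = len(y)
    K = hybrid_kernel(X, X, w, sigma)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)),  K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # b, alpha

def lssvm_predict(X_train, X_test, b, alpha, w, sigma):
    return hybrid_kernel(X_test, X_train, w, sigma) @ alpha + b

def holdout_rmse(params, X, y):
    # DE fitness: RMSE on a simple hold-out split (placeholder for the paper's criterion).
    gamma, w, sigma = params
    m = int(0.8 * len(y))
    b, alpha = lssvm_fit(X[:m], y[:m], gamma, w, sigma)
    pred = lssvm_predict(X[:m], X[m:], b, alpha, w, sigma)
    return float(np.sqrt(np.mean((pred - y[m:]) ** 2)))

# Toy data standing in for one batch of turbines (features -> power output).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Standard differential evolution over (gamma, kernel weight w, RBF width sigma).
result = differential_evolution(holdout_rmse,
                                bounds=[(0.1, 100.0), (0.0, 1.0), (0.1, 10.0)],
                                args=(X, y), maxiter=20, seed=0)
print("best (gamma, w, sigma):", result.x, " hold-out RMSE:", result.fun)

Under the batch strategy described in the abstract, such a model would be trained once per batch of geographically close turbines rather than once per turbine or once for the entire farm.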
Authors: LIU Chang (刘畅); LANG Jin (郎劲). Affiliations: Key Laboratory of Data Analytics and Optimization for Smart Industry (Northeastern University), Ministry of Education, Shenyang 110819; Institute of Industrial and Systems Engineering, Northeastern University, Shenyang 110819; Liaoning Key Laboratory of Manufacturing System and Logistics, Northeastern University, Shenyang 110819; State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang 110819.
Source: Acta Automatica Sinica (《自动化学报》), 2020, No. 6, pp. 1264-1273 (10 pages). Indexed in EI, CSCD, and the Peking University Core Journal list (北大核心).
Funding: National Key Research and Development Program of China (2016YFB0901900); Key International Cooperation Project of the National Natural Science Foundation of China (71520107004); Fundamental Research Funds of the State Key Laboratory of Synthetical Automation for Process Industries (2013ZCX02); 111 Project (B16009).
Keywords: wind power prediction; batch feature; hybrid kernel least squares support vector machine (HKLSSVM); differential evolution (DE); sparse selection.

