Abstract
To enhance the feature extraction ability of the auto-encoder and make better use of the large amount of unlabeled data generated by transformer faults, this paper introduces batch normalization (BN) into the encoding and decoding process of the traditional stacked auto-encoder (SAE), forming an improved stacked auto-encoder (BN-SAE). With a single-layer AE as the bottom network and the dissolved gas content in transformer oil as the input sample, the structure of the neural network is determined by simulation, and the network is trained on the unlabeled data in an unsupervised manner to extract transformer fault feature information; finally, the labeled data are fed in to fine-tune the network through the back-propagation algorithm. The case study shows that, compared with the traditional SAE and AE, the BN-SAE achieves smaller training error, better feature extraction, and higher accuracy of transformer fault classification, and that minority-class fault samples can also be classified well.
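The abstract describes the BN-SAE only at a high level. The following is a minimal PyTorch sketch (not the authors' code) of one auto-encoder layer with batch normalization in both the encoding and decoding transforms, together with an unsupervised pre-training loop on unlabeled dissolved-gas samples. The layer width, sigmoid activation, optimizer, and hyper-parameters are illustrative assumptions, not details given in the paper.

```python
# Minimal sketch of a BN-AE building block, assuming PyTorch and a
# 5-dimensional dissolved-gas feature vector as input (dimensions and
# hyper-parameters are assumptions for illustration only).
import torch
import torch.nn as nn

class BNAutoEncoder(nn.Module):
    """One auto-encoder layer with batch normalization inserted into
    both the encoding and the decoding step."""
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.BatchNorm1d(hidden_dim),   # BN in the coding process
            nn.Sigmoid(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(hidden_dim, in_dim),
            nn.BatchNorm1d(in_dim),       # BN in the decoding process
            nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)               # extracted feature code
        return self.decoder(h), h         # reconstruction and code

def pretrain(ae: BNAutoEncoder, unlabeled_loader, epochs: int = 50, lr: float = 1e-3):
    """Unsupervised pre-training: minimize reconstruction error on
    unlabeled DGA samples (each batch is a (batch, in_dim) tensor)."""
    opt = torch.optim.Adam(ae.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x in unlabeled_loader:
            recon, _ = ae(x)
            loss = mse(recon, x)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return ae
```

In the workflow described in the abstract, several such layers would be stacked, pre-trained layer by layer on unlabeled data, and then fine-tuned end to end with the labeled fault samples through back-propagation.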
Authors
ZHAO Dongmei
WANG Chuang
MA Taiyi
(School of Electrical and Electronic Engineering, North China Electric Power University, Beijing 102206, China)
Source
Journal of North China Electric Power University (Natural Science Edition)
Indexed in: CAS; PKU Core (北大核心)
2020, No. 6, pp. 61-67 (7 pages)
Keywords
transformer
fault diagnosis
stacked auto-encoder
batch normalization
deep learning