Purpose - The aim of this study is to propose a deep neural network (DNN) method that uses side information to improve clustering results for big datasets; the authors also show that applying this information improves clustering performance and increases the speed of network training convergence.

Design/methodology/approach - In data mining, semisupervised learning is an attractive approach because good performance can be achieved with a small subset of labeled data; one reason is that data labeling is expensive, and semisupervised learning does not need all labels. One type of semisupervised learning is constrained clustering; this type of learning does not use class labels for clustering. Instead, it uses information about pairs of instances (side information): these instances may be in the same cluster (must-link [ML]) or in different clusters (cannot-link [CL]). Constrained clustering has been studied extensively; however, few works have focused on constrained clustering for big datasets. In this paper, the authors present a constrained clustering method for big datasets that uses a DNN. The authors inject the constraints (ML and CL) into this DNN to improve clustering performance and call the method constrained deep embedded clustering (CDEC). An autoencoder is first used to extract informative low-dimensional features in the latent space; the encoder network is then retrained using a proposed Kullback-Leibler divergence objective function, which captures the constraints, in order to cluster the projected samples. The proposed CDEC was compared with the adversarial autoencoder, constrained 1-spectral clustering and autoencoder + k-means on the well-known MNIST, Reuters-10k and USPS datasets, and performance was assessed in terms of clustering accuracy. Empirical results confirmed the statistical superiority of CDEC over its counterparts in terms of clustering accuracy.

Findings - First, this is the first DNN-based constrained clustering method that uses side information to improve clustering performance without using labels on big, high-dimensional datasets. Second, the authors define a formula to inject side information into the DNN. Third, the proposed method improves both clustering performance and network convergence speed.

Originality/value - Few works have focused on constrained clustering for big datasets; moreover, studies of DNNs for clustering with a specific loss function that simultaneously extracts features and clusters the data are rare. The method improves the performance of big data clustering without using labels, which matters because data labeling is expensive and time-consuming, especially for big datasets.
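The clustering stage described above follows the deep embedded clustering (DEC) pattern: soft assignments of latent points to cluster centroids via a Student's t-kernel, a sharpened target distribution, and a KL divergence loss. A minimal NumPy sketch of that objective is shown below, with an illustrative ML/CL penalty term added; the `constraint_penalty` form is a hypothetical stand-in for the authors' injection formula, which the abstract does not specify.

```python
import numpy as np

def soft_assignments(z, centroids, alpha=1.0):
    """Student's t-kernel soft assignment q_ij of embedded point z_i
    to cluster j, as in deep embedded clustering (DEC)."""
    # squared Euclidean distance between each embedding and each centroid
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    """Sharpened auxiliary target p_ij = q_ij^2 / f_j, renormalized per row."""
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)

def kl_clustering_loss(q, p):
    """KL(P || Q): the clustering objective minimized while retraining
    the encoder."""
    return float((p * np.log(p / q)).sum())

def constraint_penalty(q, ml_pairs, cl_pairs):
    """Hypothetical ML/CL penalty (not the authors' exact formula):
    must-link pairs should have similar soft assignments,
    cannot-link pairs dissimilar."""
    pen = 0.0
    for i, j in ml_pairs:  # low penalty when assignments agree
        pen += -np.log(np.dot(q[i], q[j]) + 1e-12)
    for i, j in cl_pairs:  # low penalty when assignments disagree
        pen += -np.log(1.0 - np.dot(q[i], q[j]) + 1e-12)
    return float(pen)
```

In a full implementation, the sum of `kl_clustering_loss` and a weighted `constraint_penalty` would be backpropagated through the encoder so that feature extraction and constrained clustering are optimized simultaneously.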