Funding: Supported by the National Natural Science Foundation of China (No. 51877013, ZJ, http://www.nsfc.gov.cn/), the Jiangsu Provincial Key Research and Development Program (No. BE2021636, ZJ, http://kxjst.jiangsu.gov.cn/), the Science and Technology Project of Changzhou City (No. CE20205056, ZJ, http://kjj.changzhou.gov.cn/), and the Qing Lan Project of Jiangsu Province (no specific grant number, ZJ, http://jyt.jiangsu.gov.cn/).
Abstract: The structure and function of brain networks are altered in patients with end-stage renal disease (ESRD). Manifold regularization (MR) only considers the pairwise relationship between two brain regions and cannot represent functional interactions or higher-order relationships among multiple brain regions. To solve this issue, we developed a method to construct a dynamic brain functional network (DBFN) based on dynamic hypergraph MR (DHMR) and applied it to the classification of ESRD associated with mild cognitive impairment (ESRDaMCI). The construction of the DBFN with Pearson's correlation (PC) was transformed into an optimization model. Node convolution and hyperedge convolution superposition were adopted to dynamically modify the hypergraph structure, yielding a dynamic hypergraph from which the manifold regularization terms were formed. The DHMR and L1 norm regularization were introduced into the PC-based optimization model to obtain the final DHMR-based DBFN (DDBFN). Experimental results demonstrated the validity of the DDBFN method by comparing its classification results with those of several related brain functional network construction methods. Our work not only achieves better classification performance but also reveals the discriminative regions of ESRDaMCI, providing a reference for clinical research and auxiliary diagnosis of concomitant cognitive impairments.
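The abstract does not reproduce the optimization model itself, but the general pattern it describes, a Pearson correlation matrix refined under a hypergraph manifold penalty plus an L1 sparsity term, can be sketched as below. This is a minimal illustrative sketch, not the paper's DDBFN formulation: the objective min_W ||W - P||_F^2 + lam1*||W||_1 + lam2*tr(W^T L_H W), the function names (hypergraph_laplacian, refine_network), and all parameter values are assumptions, and the dynamic node/hyperedge convolution update of the hypergraph is omitted.

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.
    H: (n_nodes, n_edges) incidence matrix; w: optional hyperedge weights."""
    n, m = H.shape
    w = np.ones(m) if w is None else w
    dv = H @ w                                    # node degrees
    de = H.sum(axis=0)                            # hyperedge degrees
    Dv_is = np.diag(1.0 / np.sqrt(np.maximum(dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(de, 1e-12))
    return np.eye(n) - Dv_is @ H @ np.diag(w) @ De_inv @ H.T @ Dv_is

def soft_threshold(X, t):
    """Proximal operator of the L1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def refine_network(P, L, lam1=0.1, lam2=0.1, lr=0.01, iters=500):
    """Proximal gradient on  min_W ||W - P||_F^2 + lam1*||W||_1 + lam2*tr(W^T L W)."""
    W = P.copy()
    for _ in range(iters):
        grad = 2.0 * (W - P) + 2.0 * lam2 * (L @ W)    # gradient of the smooth terms
        W = soft_threshold(W - lr * grad, lr * lam1)   # L1 proximal step
    return W
```

As a hypothetical usage, P could be np.corrcoef(ts) for a regions-by-timepoints BOLD matrix ts, and H could place each region together with its k most correlated regions in one hyperedge.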
Funding: Supported by the National Natural Science Foundation of China project "Mechanism Socialist Method and Higher Intelligence Theory" (No. 60873001).
Abstract: This paper proposes a novel graph-based transductive learning algorithm based on manifold regularization. First, manifold regularization was introduced into a probabilistic discriminant model for the semi-supervised classification task. Then a variant of the expectation-maximization (EM) algorithm was derived to solve the optimization problem, leading to an iterative algorithm. Although our method is developed in a probabilistic framework, there is no need to make assumptions about the specific form of the data distribution. Besides, the crucial updating formula has a closed form. The method was evaluated for text categorization on two standard datasets, 20 Newsgroups and Reuters-21578. Experiments show that our approach outperforms state-of-the-art graph-based transductive learning methods.
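For context, the manifold regularization idea the abstract builds on penalizes the model's non-smoothness over a similarity graph built from labeled and unlabeled instances. A generic penalized-likelihood form (an illustrative sketch only, not the paper's exact discriminant model or its EM updates) is:

```latex
\max_{\theta}\;\; \sum_{i=1}^{l}\log p(y_i \mid x_i;\theta)
\;-\;\gamma\,\mathbf{f}_\theta^{\top} L\,\mathbf{f}_\theta,
\qquad
\mathbf{f}_\theta^{\top} L\,\mathbf{f}_\theta
=\tfrac{1}{2}\sum_{i,j=1}^{l+u} w_{ij}\bigl(f_\theta(x_i)-f_\theta(x_j)\bigr)^{2},
```

where l labeled and u unlabeled instances define the graph with weights w_ij, L = D - W is the graph Laplacian, and f_theta stacks the model outputs over all instances. The paper's contribution is to embed such a regularizer in a specific probabilistic discriminant model and derive EM-style closed-form updates for it.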
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 61876091) and the China Postdoctoral Science Foundation (No. 2019M651918).
Abstract: Manifold regularization (MR) provides a powerful framework for semi-supervised classification using both labeled and unlabeled data. It constrains similar instances over the manifold graph to share similar classification outputs according to the manifold assumption. It is easily noted that MR is built on pairwise smoothness over the manifold graph, i.e., the smoothness constraint is implemented over all instance pairs and actually treats each instance pair as a single operand. However, the smoothness can be pointwise in nature; that is, the smoothness shall inherently occur "everywhere" to relate the behavior of each point or instance to that of its close neighbors. Thus, in this paper, we attempt to develop a pointwise MR (PW_MR for short) for semi-supervised learning by constraining individual local instances. In this way, the pointwise nature of smoothness is preserved, and moreover, by considering individual instances rather than instance pairs, the importance or contribution of individual instances can be introduced. Such importance can be described, for example, by the confidence for correct prediction or by the local density. PW_MR provides a different way of implementing manifold smoothness. Finally, empirical results show the competitiveness of PW_MR compared with pairwise MR.
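To make the pairwise-versus-pointwise distinction concrete, the classical pairwise MR penalty and one possible pointwise counterpart can be written as below; the pointwise form is an illustrative reconstruction consistent with the abstract, not necessarily the exact PW_MR penalty.

```latex
\Omega_{\text{pair}}(f)=\tfrac{1}{2}\sum_{i,j} w_{ij}\bigl(f(x_i)-f(x_j)\bigr)^{2}
=\mathbf{f}^{\top}L\,\mathbf{f},
\qquad
\Omega_{\text{point}}(f)=\sum_{i}\lambda_i\Bigl(f(x_i)-\sum_{j\in\mathcal{N}(i)} a_{ij}\,f(x_j)\Bigr)^{2},
```

where L = D - W is the graph Laplacian, N(i) denotes the neighbors of x_i, the a_ij are normalized affinities with sum_j a_ij = 1, and lambda_i is a per-instance importance weight (e.g., prediction confidence or local density). The pairwise form treats each pair as an operand, while the pointwise form ties each instance to its local neighborhood and lets lambda_i weight individual instances.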
Funding: Supported by the National Natural Science Foundation of China (Nos. 61877047, 61877046, 62106186, and 62063031) and the Fundamental Research Funds for the Central Universities (Nos. JB210701 and JB210718).
Abstract: This paper proposes a framework for manifold regularization (MR) based distributed semi-supervised learning (DSSL) using a single layer feed-forward neural network (SLFNN). The proposed framework, denoted DSSL-SLFNN, is based on the SLFNN, the MR framework, and a distributed optimization strategy. A series of algorithms is then derived to solve DSSL problems. In DSSL problems, data consisting of labeled and unlabeled samples are distributed over a communication network, where each node has access only to its own data and can communicate only with its neighbors. In some scenarios, DSSL problems cannot be solved by centralized algorithms. Under the DSSL-SLFNN framework, each node in the communication network exchanges the initial parameters of the SLFNN with the same basis functions for semi-supervised learning (SSL). All nodes calculate the globally optimal coefficients of the SLFNN using the distributed datasets and local updates. During the learning process, each node exchanges only local coefficients with its neighbors rather than raw data. This means that DSSL-SLFNN based algorithms work in a fully distributed fashion and are privacy-preserving methods. Finally, several simulations are presented to show the efficiency of the proposed framework and the derived algorithms.
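The full algorithms are derived in the paper; the sketch below only illustrates the general pattern the abstract describes: a shared random SLFNN basis, a local manifold-regularized least-squares loss, and decentralized updates in which each node mixes coefficients with its neighbors before taking a local gradient step. The function names, the RBF-graph Laplacian, the doubly stochastic mixing matrix A_mix, and all hyperparameters are assumptions for illustration, not the DSSL-SLFNN algorithms themselves.

```python
import numpy as np

def random_hidden_layer(X, n_hidden, seed=0):
    """Shared SLFNN basis: random hidden weights/biases, identical on every node (same seed)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return np.tanh(X @ W + b)

def rbf_laplacian(X):
    """Graph Laplacian L = D - W from an RBF affinity over the local labeled + unlabeled data."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2.0 * np.median(D2) + 1e-12))
    return np.diag(W.sum(axis=1)) - W

def local_gradient(beta, Phi_l, y_l, Phi_all, L, gamma=1e-3, lam=1e-3):
    """Gradient of ||Phi_l b - y_l||^2 + gamma*||b||^2 + lam * b^T Phi_all^T L Phi_all b."""
    return (2.0 * Phi_l.T @ (Phi_l @ beta - y_l)
            + 2.0 * gamma * beta
            + 2.0 * lam * Phi_all.T @ (L @ (Phi_all @ beta)))

def decentralized_train(nodes, A_mix, n_hidden=50, lr=1e-3, iters=300):
    """nodes: list of dicts with keys 'X_l', 'y_l', 'X_u'; A_mix: doubly stochastic mixing matrix
    whose sparsity pattern matches the communication graph (node i averages only with neighbors)."""
    cache, betas = [], np.zeros((len(nodes), n_hidden))
    for nd in nodes:
        X_all = np.vstack([nd['X_l'], nd['X_u']])
        Phi_all = random_hidden_layer(X_all, n_hidden)
        cache.append((Phi_all[:len(nd['X_l'])], nd['y_l'], Phi_all, rbf_laplacian(X_all)))
    for _ in range(iters):
        mixed = A_mix @ betas                                  # exchange coefficients with neighbors only
        for i, (Phi_l, y_l, Phi_all, L) in enumerate(cache):
            betas[i] = mixed[i] - lr * local_gradient(mixed[i], Phi_l, y_l, Phi_all, L)
    return betas                                               # rows move toward a common coefficient vector
```

Under this hypothetical setup, node i would predict on new data with random_hidden_layer(X_new, n_hidden) @ betas[i], and only coefficient vectors, never raw samples, cross the network.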
Funding: This work was supported by the National Natural Science Foundation of China (No. 11771275). The second author acknowledges the partial support of the Dutch Research Council (No. 040.11.724).
Abstract: In general, data contain noise that comes from faulty instruments, flawed measurements, or faulty communication. Learning with data in the context of classification or regression is inevitably affected by noise in the data. In order to remove or greatly reduce the impact of noise, we introduce the ideas of fuzzy membership functions and the Laplacian twin support vector machine (Lap-TSVM). A formulation of the linear intuitionistic fuzzy Laplacian twin support vector machine (IFLap-TSVM) is presented. Moreover, we extend the linear IFLap-TSVM to the nonlinear case via kernel functions. The proposed IFLap-TSVM mitigates the negative impact of noise and outliers by using fuzzy membership functions, and it is a more accurate and reasonable classifier because it exploits the geometric distribution information of labeled and unlabeled data through manifold regularization. Experiments with constructed artificial datasets, several UCI benchmark datasets, and the MNIST dataset show that IFLap-TSVM has better classification accuracy than the state-of-the-art twin support vector machine (TSVM), intuitionistic fuzzy twin support vector machine (IFTSVM), and Lap-TSVM.
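As background for how the pieces fit together, one generic way to combine per-sample fuzzy weights with the Lap-TSVM manifold term in the first of the two twin problems is sketched below. The weight vector s, the penalty parameters c_1, c_2, c_3, and the constraint form are illustrative and need not match the paper's IFLap-TSVM formulation, which additionally specifies how the intuitionistic membership and non-membership degrees produce s.

```latex
\begin{aligned}
\min_{w_1,\,b_1,\,\xi}\quad
& \tfrac{1}{2}\,\|A w_1 + e_1 b_1\|^2
+ c_1\, s^{\top}\xi
+ \tfrac{c_2}{2}\bigl(\|w_1\|^2 + b_1^2\bigr)
+ \tfrac{c_3}{2}\,(M w_1 + e\,b_1)^{\top} L\,(M w_1 + e\,b_1)\\[2pt]
\text{s.t.}\quad
& -(B w_1 + e_2 b_1) + \xi \ge e_2,\qquad \xi \ge 0,
\end{aligned}
```

Here A and B collect the samples of the two classes, M stacks all labeled and unlabeled samples, L is the graph Laplacian built from M (the manifold regularization term), e, e_1, e_2 are all-ones vectors, and s_i is the fuzzy membership weight that down-weights the slack of a likely-noisy sample; the second hyperplane comes from the symmetric problem with the roles of A and B exchanged.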