Journal Articles
4 articles found
1. Active label distribution learning via kernel maximum mean discrepancy
Authors: Xinyue DONG, Tingjin LUO, Ruidong FAN, Wenzhang ZHUGE, Chenping HOU. Frontiers of Computer Science (SCIE, EI, CSCD), 2023, No. 4, pp. 69-81 (13 pages)
Label distribution learning (LDL) is a new learning paradigm for dealing with label ambiguity, and many studies have achieved prominent performance. Compared with traditional supervised learning scenarios, annotation with label distributions is more expensive. Direct use of existing active learning (AL) approaches, which aim to reduce the annotation cost in traditional learning, may lead to degraded performance. To deal with the problem of high annotation cost in LDL, we propose the active label distribution learning via kernel maximum mean discrepancy (ALDL-kMMD) method to tackle this crucial but rarely studied problem. ALDL-kMMD captures the structural information of both data and labels, and extracts the most representative instances from the unlabeled ones by incorporating a nonlinear model and marginal probability distribution matching. It is also able to markedly decrease the number of queried unlabeled instances. Meanwhile, an effective solution to the original optimization problem of ALDL-kMMD is obtained by constructing auxiliary variables. The effectiveness of our method is validated with experiments on real-world datasets.
Keywords: label distribution learning, active learning, maximum mean discrepancy, auxiliary variable
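All four entries above and below rely on the kernel maximum mean discrepancy. As background, a minimal numpy sketch of the standard (biased) empirical MMD estimator with an RBF kernel — the function names and `gamma` choice are illustrative, not taken from any of the listed papers:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def mmd2(X, Y, gamma=1.0):
    # Biased empirical estimate of squared MMD between samples X and Y;
    # it equals the squared distance between kernel mean embeddings, so it is >= 0
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.0, 1.0, size=(200, 2))   # same distribution as X
Z = rng.normal(3.0, 1.0, size=(200, 2))   # shifted distribution
print(mmd2(X, Y) < mmd2(X, Z))  # True: the shifted sample is farther in MMD
```

The estimator needs only kernel evaluations, which is why it appears both as an active-learning selection criterion (entry 1) and as a feature-alignment loss (entries 2-3).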
2. Tool Wear State Recognition with Deep Transfer Learning Based on Spindle Vibration for Milling Process
Authors: Qixin Lan, Binqiang Chen, Bin Yao, Wangpeng He. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 3, pp. 2825-2844 (20 pages)
The wear of metal cutting tools progressively rises as cutting time goes on. Heavy wear on the tool generates significant noise and vibration, negatively impacting the accuracy of the forming and the surface integrity of the workpiece. Hence, during the cutting process, it is imperative to continually monitor the tool wear state and promptly replace any heavily worn tools to guarantee the quality of the cutting. Conventional tool wear monitoring models, which are based on machine learning, are specifically built for the intended cutting conditions. However, these models require retraining whenever the cutting conditions change, so they have little application value if the cutting conditions change frequently. This manuscript proposes a method for monitoring tool wear based on unsupervised deep transfer learning. Owing to the similarity of the tool wear process under varying working conditions, a tool wear recognition model that can adapt to both current and previous working conditions has been developed by utilizing historical cutting monitoring data. To extract and classify cutting vibration signals, the unsupervised deep transfer learning network comprises a one-dimensional (1D) convolutional neural network (CNN) with a multi-layer perceptron (MLP). To achieve distribution alignment of deep features through the maximum mean discrepancy algorithm, a domain-adaptive layer is embedded in the penultimate layer of the network. A platform for monitoring tool wear during end milling has been constructed. The proposed method was verified through a full life test of end milling under multiple working conditions with a Cr12MoV steel workpiece. Our experiments demonstrate that the transfer learning model maintains a classification accuracy of over 80%. In comparison with the most advanced tool wear monitoring methods, the presented model guarantees superior performance in the target domains.
Keywords: multi-working-condition tool wear state recognition, unsupervised transfer learning, domain adaptation, maximum mean discrepancy (MMD)
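The domain-adaptive layer described in this abstract combines a supervised loss on the labeled source domain with an MMD penalty on deep features of the unlabeled target domain. A minimal sketch of that combined objective, assuming a linear-kernel MMD on penultimate-layer features and a hypothetical weight `lam` (the paper's actual network, kernel, and weighting are not given in the abstract):

```python
import numpy as np

def linear_mmd2(f_src, f_tgt):
    # Linear-kernel MMD between feature batches: squared distance of batch means
    delta = f_src.mean(axis=0) - f_tgt.mean(axis=0)
    return float(delta @ delta)

def domain_adaptive_loss(logits_src, labels_src, f_src, f_tgt, lam=0.5):
    # Cross-entropy on the labeled source batch (numerically stable log-softmax) ...
    shifted = logits_src - logits_src.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels_src)), labels_src].mean()
    # ... plus an MMD penalty that pulls source and target features together
    return ce + lam * linear_mmd2(f_src, f_tgt)

rng = np.random.default_rng(1)
logits = rng.normal(size=(8, 3))
labels = rng.integers(0, 3, size=8)
f_src = rng.normal(size=(8, 16))
f_near = f_src + rng.normal(scale=0.01, size=(8, 16))  # nearly aligned target
f_far = f_src + 2.0                                    # shifted target features
aligned = domain_adaptive_loss(logits, labels, f_src, f_near)
misaligned = domain_adaptive_loss(logits, labels, f_src, f_far)
print(aligned < misaligned)
```

Minimizing this loss over both batches is what lets one model serve multiple working conditions: the classifier is trained on the source labels while the penalty forces the target-condition features into the same region.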
3. Knowledge Transfer Learning via Dual Density Sampling for Resource-Limited Domain Adaptation
Authors: Zefeng Zheng, Luyao Teng, Wei Zhang, Naiqi Wu, Shaohua Teng. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, No. 12, pp. 2269-2291 (23 pages)
Most existing domain adaptation (DA) methods aim to achieve favorable performance under complicated environments by sampling. However, three unsolved problems limit their efficiency: (i) they adopt global sampling but neglect to exploit global and local sampling simultaneously; (ii) they transfer knowledge from either a global perspective or a local perspective, while overlooking the transmission of confident knowledge from both; and (iii) they apply repeated sampling during iteration, which takes a lot of time. To address these problems, knowledge transfer learning via dual density sampling (KTL-DDS) is proposed in this study, which consists of three parts: (i) dual density sampling (DDS), which jointly leverages two sampling methods associated with different views — global density sampling, which extracts representative samples with the most common features, and local density sampling, which selects representative samples with critical boundary information; (ii) consistent maximum mean discrepancy (CMMD), which reduces intra- and cross-domain risks and guarantees high consistency of knowledge by shortening the distances between every two of the four subsets collected by DDS; and (iii) knowledge dissemination (KD), which transmits confident and consistent knowledge from the representative target samples with global and local properties to the whole target domain by preserving the neighboring relationships of the target domain. Mathematical analyses show that DDS avoids repeated sampling during iteration. With the above three actions, confident knowledge with both global and local properties is transferred, and the memory and running time are greatly reduced. In addition, a general framework named dual density sampling approximation (DDSA) is developed, which can be easily applied to other DA algorithms. Extensive experiments on five datasets in clean, label corruption (LC), feature missing (FM), and LC&FM environments demonstrate the encouraging performance of KTL-DDS.
Keywords: cross-domain risk, dual density sampling, intra-domain risk, maximum mean discrepancy, knowledge transfer learning, resource-limited domain adaptation
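The abstract does not spell out the scoring rules behind global density sampling, but a hypothetical sketch of the general idea — rank points by a kernel density estimate and keep the densest as the representatives "with the most common features" — looks like this; the function name, density estimator, and parameters are all assumptions for illustration only:

```python
import numpy as np

def global_density_sample(X, n_select=3, gamma=1.0):
    # Hypothetical sketch: score each point by its mean RBF similarity to all
    # points (a crude kernel density estimate) and keep the densest ones
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(X**2, axis=1)[None, :] - 2.0 * X @ X.T
    density = np.exp(-gamma * np.maximum(d2, 0.0)).mean(axis=1)
    return np.argsort(density)[::-1][:n_select]

rng = np.random.default_rng(2)
cluster = rng.normal(0.0, 0.3, size=(20, 2))   # dense cluster of typical samples
outlier = np.array([[10.0, 10.0]])             # isolated point at index 20
X = np.vstack([cluster, outlier])
picked = global_density_sample(X, n_select=3)
print(20 not in picked)  # the outlier is never chosen as a representative
```

Selecting from a fixed density ranking, rather than re-sampling each round, also illustrates how a method can avoid the repeated sampling that the paper identifies as a bottleneck.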
4. Generative Adversarial Networks with Joint Distribution Moment Matching
Authors: Yi-Ying Zhang, Chao-Min Shen, Hao Feng, Preston Thomas Fletcher, Gui-Xu Zhang. Journal of the Operations Research Society of China (EI, CSCD), 2019, No. 4, pp. 579-597 (19 pages)
Generative adversarial networks (GANs) have shown impressive power in the field of machine learning. Traditional GANs have focused on unsupervised learning tasks. In recent years, conditional GANs that can generate labeled data have been proposed for semi-supervised learning and have achieved better image quality than traditional GANs. Conditional GANs, however, generally only minimize the difference between the marginal distributions of real and generated data, neglecting the difference with respect to each class of the data. To address this challenge, we propose the GAN with joint distribution moment matching (JDMM-GAN) for matching the joint distribution based on maximum mean discrepancy, which minimizes the differences of both the marginal and conditional distributions. The learning procedure is conducted iteratively by stochastic gradient descent with back-propagation. We evaluate JDMM-GAN on several benchmark datasets, including MNIST, CIFAR-10 and Extended Yale Face. Compared with state-of-the-art GANs, JDMM-GAN generates more realistic images and achieves the best inception score on the CIFAR-10 dataset.
Keywords: generative adversarial networks, joint distribution moment matching, maximum mean discrepancy
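The key distinction this abstract draws — matching the joint distribution of (data, label) rather than only the marginal over data — can be illustrated with an MMD computed on samples whose features are concatenated with one-hot labels. This is a sketch of the general joint-matching idea under assumed kernel and encoding choices, not the paper's actual JDMM-GAN objective:

```python
import numpy as np

def rbf(P, Q, gamma=0.5):
    # RBF kernel matrix from pairwise squared distances
    d2 = np.sum(P**2, axis=1)[:, None] + np.sum(Q**2, axis=1)[None, :] - 2.0 * P @ Q.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def joint_mmd2(X_a, y_a, X_b, y_b, n_classes, gamma=0.5):
    # MMD on joint samples (x, y): append one-hot labels so the kernel sees
    # both the data and its class, penalizing conditional mismatch too
    def joint(X, y):
        return np.concatenate([X, np.eye(n_classes)[y]], axis=1)
    A, B = joint(X_a, y_a), joint(X_b, y_b)
    return rbf(A, A, gamma).mean() + rbf(B, B, gamma).mean() - 2.0 * rbf(A, B, gamma).mean()

rng = np.random.default_rng(3)
X0 = rng.normal(0.0, 1.0, size=(100, 2))   # class-0 region
X1 = rng.normal(4.0, 1.0, size=(100, 2))   # class-1 region
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)
y_swapped = 1 - y                          # same marginal over x, wrong pairing
matched = joint_mmd2(X, y, X, y, 2)
mismatched = joint_mmd2(X, y, X, y_swapped, 2)
print(matched < mismatched)  # a marginal-only MMD over X alone would miss this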