Journal Articles
4 articles found
1. Slope displacement prediction based on multisource domain transfer learning for insufficient sample data
Authors: Zheng Hai-Qing, Hu Lin-Ni, Sun Xiao-Yun, Zhang Yu, Jin Shen-Yi. Applied Geophysics (SCIE, CSCD), 2024, No. 3, pp. 496-504, 618 (10 pages).
Accurate displacement prediction is critical for the early warning of landslides. The complexity of the coupling relationship between multiple influencing factors and displacement makes accurate prediction of displacement difficult. Moreover, in engineering practice, insufficient monitoring data limit the performance of prediction models. To alleviate this problem, a displacement prediction method based on multisource domain transfer learning is proposed, which helps accurately predict data in the target domain through the knowledge of one or more source domains. First, an optimized variational mode decomposition model based on minimum sample entropy is used to decompose the cumulative displacement into trend, periodic, and stochastic components. The trend component is predicted by an autoregressive model, and the periodic component is predicted by a long short-term memory network. Because the stochastic component is affected by uncertainties, it is predicted by a combination of a Wasserstein generative adversarial network and multisource domain transfer learning for improved prediction accuracy. The proposed prediction method was validated in a case study on a real mine slope. This study therefore provides new insights that can be applied to scenarios lacking sample data.
Keywords: slope displacement; multisource domain transfer learning (MDTL); variational mode decomposition (VMD); generative adversarial network (GAN); Wasserstein-GAN
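A minimal sketch of the decomposition step this abstract describes, selecting the VMD mode count by minimum sample entropy. It assumes the third-party `vmdpy` and `antropy` packages, uses a synthetic displacement series, and omits the paper's WGAN and multisource-transfer stage for the stochastic component; the trend/periodic/stochastic split shown is a common heuristic, not necessarily the authors' exact rule.

```python
import numpy as np
from vmdpy import VMD                 # pip install vmdpy
from antropy import sample_entropy    # pip install antropy

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
# synthetic "cumulative displacement": trend + seasonal term + noise
displacement = 0.5 * t**1.5 + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)

def decompose_min_sampen(signal, k_range=range(2, 8), alpha=2000, tau=0.0, tol=1e-7):
    """Run VMD for each candidate mode count K and keep the decomposition
    whose modes have the lowest mean sample entropy (the abstract's
    'optimized VMD based on minimum sample entropy')."""
    best = None
    for K in k_range:
        modes, _, _ = VMD(signal, alpha, tau, K, DC=0, init=1, tol=tol)
        score = np.mean([sample_entropy(m) for m in modes])
        if best is None or score < best[0]:
            best = (score, K, modes)
    return best  # (mean sample entropy, K, modes)

entropy, K, modes = decompose_min_sampen(displacement)
# Heuristic split: lowest-frequency mode as trend, highest-frequency mode as
# the stochastic component, everything in between as periodic components.
trend, periodic, stochastic = modes[0], modes[1:-1], modes[-1]
print(f"chosen K={K}, mean sample entropy={entropy:.3f}")
```

From here, the trend would feed an autoregressive model and the periodic components an LSTM, as the abstract outlines.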
2. Effective Face-to-face Tutorials of Online English Learning Lie in Interaction of Affective Domain and Higher Levels of Cognitive Domain of Bloom's Taxonomy
Author: Jiali Ding. Sino-US English Teaching, 2004, No. 2, pp. 39-50 (12 pages).
Online English learning, an outcome of the rapid development of the Internet, has gained an ever-wider market in China. However, a variety of problems have also emerged along the way. People never stop searching for better strategies, whether in designing online courseware or tutorials, to help smooth the learning process. My experience as a tutor is that the interaction of the affective domain and the higher levels of the cognitive domain of Bloom's Taxonomy plays an important role in face-to-face tutorials of online English learning.
Keywords: online English learning; face-to-face tutorials; Bloom's Taxonomy of educational objectives; affective domain; higher levels of cognitive domain
3. Incremental Learning Based on Data Translation and Knowledge Distillation
Authors: Tan Cheng, Jielong Wang. International Journal of Intelligence Science, 2023, No. 2, pp. 33-47 (15 pages).
Recently, deep convolutional neural networks (DCNNs) have achieved remarkable results in image classification tasks. Despite convolutional networks' great successes, their training relies on a large amount of data prepared in advance, which is often unavailable in real-world applications such as streaming data and concept drift. For this reason, incremental learning (continual learning) has attracted increasing attention from scholars. However, incremental learning faces the challenge of catastrophic forgetting: performance on previous tasks drastically degrades after learning a new task. In this paper, we propose a new strategy to alleviate catastrophic forgetting when neural networks are trained on continual domains. Specifically, two components are applied: data translation based on transfer learning and knowledge distillation. The former translates a portion of the new data to reconstruct part of the data distribution of the old domain; the latter uses the old model as a teacher to guide the new model. Experimental results on three datasets show that the combination of these two methods effectively alleviates catastrophic forgetting.
Keywords: incremental domain learning; data translation; knowledge distillation; catastrophic forgetting
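The knowledge-distillation component can be illustrated with the standard temperature-scaled formulation. This is a generic PyTorch sketch, not necessarily the exact loss used in the paper, and the data-translation component is not shown.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Weighted sum of a soft-target KL term (the frozen old model acting as
    teacher, preserving old knowledge) and the usual cross-entropy on the
    new task's labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is roughly independent of T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# toy usage with random logits standing in for the two models' outputs
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)           # from the frozen old model
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The temperature T softens both distributions so the student also learns the teacher's relative class similarities, not just its argmax.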
4. Layer-wise domain correction for unsupervised domain adaptation (cited by: 1)
Authors: Shuang LI, Shi-ji SONG, Cheng WU. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2018, No. 1, pp. 91-103 (13 pages).
Deep neural networks have been successfully applied to numerous machine learning tasks because of their impressive feature-abstraction capabilities. However, conventional deep networks assume that the training and test data are sampled from the same distribution, an assumption that is often violated in real-world scenarios. To address such domain shift or data bias, we introduce layer-wise domain correction (LDC), a new unsupervised domain adaptation algorithm that adapts an existing deep network through additive correction layers spaced throughout the network. Through these additive layers, the representations of the source and target domains can be perfectly aligned. The corrections, trained via maximum mean discrepancy, adapt to the target domain while increasing the representational capacity of the network. LDC requires no target labels, achieves state-of-the-art performance across several adaptation benchmarks, and requires significantly less training time than existing adaptation methods.
Keywords: unsupervised domain adaptation; maximum mean discrepancy; residual network; deep learning
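The core idea, an additive (residual-style) correction layer trained with an MMD penalty, can be sketched as follows. The RBF kernel, the single-layer placement, and the choice to correct only the target branch are illustrative assumptions; the paper's exact architecture is not given in this listing.

```python
import torch
import torch.nn as nn

class CorrectionLayer(nn.Module):
    """Additive correction h -> h + g(h): g is a small trainable module
    inserted after a frozen layer of a pretrained network."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.g = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim))

    def forward(self, h):
        return h + self.g(h)

def mmd_rbf(x, y, sigma=1.0):
    """Biased estimate of squared MMD between two feature batches under an
    RBF kernel; zero when the two batches share a distribution."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# toy training step: no target labels are needed, matching the abstract
dim = 32
correct = CorrectionLayer(dim)
opt = torch.optim.Adam(correct.parameters(), lr=1e-3)
src_feat = torch.randn(16, dim)          # features from a frozen backbone
tgt_feat = torch.randn(16, dim) + 0.5    # shifted target-domain features
loss = mmd_rbf(src_feat, correct(tgt_feat))  # align corrected target to source
loss.backward()
opt.step()
```

In the layer-wise scheme the abstract describes, several such corrections would be spaced throughout the network, each shrinking the source/target discrepancy at its own depth.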