Journal Articles
3 articles found
1. Hformer: highly efficient vision transformer for low-dose CT denoising
Authors: Shi-Yu Zhang, Zhao-Xuan Wang, Hai-Bo Yang, Yi-Lun Chen, Yang Li, Quan Pan, Hong-Kai Wang, Cheng-Xin Zhao. Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2023, No. 4, pp. 161-174 (14 pages)
In this paper, we propose Hformer, a novel supervised learning model for low-dose computed tomography (LDCT) denoising. Hformer combines the strengths of convolutional neural networks for local feature extraction and transformer models for global feature capture. The performance of Hformer was verified and evaluated on the AAPM-Mayo Clinic LDCT Grand Challenge Dataset. Compared with former representative state-of-the-art (SOTA) model designs under different architectures, Hformer achieved optimal metrics without requiring a large number of learning parameters: a PSNR of 33.4405, an RMSE of 8.6956, and an SSIM of 0.9163. The experiments demonstrated that the designed Hformer is a SOTA model for noise suppression, structure preservation, and lesion detection.
Keywords: low-dose CT, deep learning, medical image, image denoising, convolutional neural networks, self-attention, residual network, auto-encoder
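The abstract describes a hybrid of convolutional local feature extraction and transformer-style global attention. As a rough illustration of the global-attention half only, here is a minimal single-head self-attention in NumPy; the identity Q/K/V projections and toy dimensions are simplifications for this sketch, not the authors' Hformer code:

```python
import numpy as np

def self_attention(x):
    """Toy single-head self-attention over a (n_tokens, d) feature matrix.

    Identity Q/K/V projections keep the sketch short; a real model
    learns separate projection matrices for queries, keys, and values.
    """
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)                # pairwise similarity, scaled
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ x                           # attention-weighted mixture
```

Each output token is a convex combination of all input tokens, which is what gives attention its global receptive field, in contrast to the fixed local window of a convolution.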
2. Research and application of an improved CRNN model for police-incident text classification (cited: 1)
Authors: 王孟轩, 张胜, 王月, 雷霆, 杜渂. 《应用科学学报》 (Journal of Applied Sciences) (CAS, CSCD, PKU Core), 2020, No. 3, pp. 388-400 (13 pages)
To meet the need of classifying cases from the text descriptions of 110 emergency calls handled by a city's public security bureau, and drawing on applications of existing text classification methods in other industries, a text classification system for police-incident descriptions was built. By examining the applicable scenarios, strengths, and weaknesses of common classification networks, combined with an analysis of the case-description features in police-incident data, a model based on an improved convolutional recurrent neural network (CRNN) is proposed. The model optimizes the key feature extraction process and remedies existing models' insufficient extraction of local features from short texts. Experiments show that the model's accuracy is 2%-3% higher than that of common classification models, that it effectively preserves the relevance of local features in the data, and that it can accurately classify the case type corresponding to a case description, thereby improving the automation efficiency of the police call-handling platform.
Keywords: police-incident text processing, text classification, convolutional neural network, bidirectional long short-term memory, self-attention
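The abstract credits the accuracy gain partly to better local-feature extraction from short texts by the convolutional half of the CRNN. As a toy sketch of that stage in NumPy, with a single hypothetical filter and the BiLSTM half omitted entirely:

```python
import numpy as np

def conv1d_features(emb, kernel):
    """Slide one convolution filter over a sequence of token embeddings.

    emb:    (seq_len, d) embedding matrix for one sentence
    kernel: (width, d) filter; each window of `width` tokens yields
            one local n-gram feature, the building block a CRNN's
            recurrent layer then reads in order.
    """
    width = kernel.shape[0]
    return np.array([float(np.sum(emb[i:i + width] * kernel))
                     for i in range(emb.shape[0] - width + 1)])
```

A real model would apply many such filters of several widths, max-pool or stack their outputs, and feed the result to the bidirectional LSTM; the widths and filter count are the kind of hyperparameters the paper's "improved" extraction step would tune.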
3. Towards better entity linking
Authors: Mingyang LI, Yuqing XING, Fang KONG, Guodong ZHOU. Frontiers of Computer Science (SCIE, EI, CSCD), 2022, No. 2, pp. 55-67 (13 pages)
As one of the most important components in knowledge graph construction, entity linking has been drawing more and more attention in the last decade. In this paper, we propose two improvements towards better entity linking. On one hand, we propose a simple but effective coarse-to-fine unsupervised knowledge base (KB) extraction approach to improve the quality of the KB, through which we can conduct entity linking more efficiently. On the other hand, we propose a highway network framework to bridge key words and sequential information captured with a self-attention mechanism to better represent both local and global information. Detailed experimentation on six public entity linking datasets verifies the great effectiveness of both our approaches.
Keywords: entity linking, knowledge base extraction, self-attention mechanism, highway network
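The second contribution bridges keyword features and self-attention output with a highway network. A minimal NumPy sketch of one highway layer follows; the weight shapes and gate parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

def highway(x, W_h, W_t, b_h, b_t):
    """One highway layer: y = t * H(x) + (1 - t) * x.

    The transform gate t (a sigmoid) learns, per dimension, how much of
    the transformed signal H(x) to pass through and how much of the raw
    input x to carry unchanged -- which is what lets the framework mix
    local keyword features with global self-attention features.
    """
    h = np.tanh(x @ W_h + b_h)                   # candidate transform H(x)
    t = 1.0 / (1.0 + np.exp(-(x @ W_t + b_t)))   # sigmoid gate in (0, 1)
    return t * h + (1.0 - t) * x
```

With the gate biased strongly negative the layer reduces to an identity (carrying x through), and biased strongly positive it reduces to the plain transform, so training can interpolate between the two feature sources.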