Handwritten Character Recognition Method Based on Multi-Branch Lightweight Residual Network
Abstract  Handwritten digits are prone to adhesion (touching strokes), which degrades the accuracy of character segmentation and recognition; in addition, deep learning models usually have high computational complexity, which prevents them from running efficiently on resource-constrained devices. To address these problems, a handwritten character recognition method based on a multi-branch lightweight residual network is proposed. To deal with character adhesion, 90 classes of composite digits are constructed and mixed with MNIST and 7 arithmetic symbols to form the experimental dataset. The network fuses the ResNet residual structure with an attention mechanism and, borrowing the Inception idea, adopts a multi-branch structure to improve its feature learning ability; the lightweight network then learns from the deep neural network ResNet through knowledge distillation. Experiments on the resulting 107-class handwritten character dataset show that the method achieves the high accuracy of deep networks while greatly reducing model complexity, enabling high-accuracy recognition on low-end terminals such as the Raspberry Pi.
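As a rough illustration of the two ideas the abstract combines (an Inception-style multi-branch residual block with channel attention, and knowledge distillation from a larger ResNet teacher), a minimal PyTorch sketch follows. The module names, the SE-style attention, and all hyperparameters (branch widths, temperature T, weight alpha) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the building blocks the abstract describes:
# a multi-branch lightweight residual block with channel attention,
# plus a knowledge-distillation loss for learning from a ResNet teacher.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (one common choice)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):
        w = F.adaptive_avg_pool2d(x, 1).flatten(1)        # global average pool
        w = torch.sigmoid(self.fc2(F.relu(self.fc1(w))))  # per-channel weights
        return x * w.view(x.size(0), -1, 1, 1)            # reweight channels


class MultiBranchResidualBlock(nn.Module):
    """Inception-style parallel branches, fused and added to a skip connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        mid = out_ch // 2
        self.branch1 = nn.Sequential(                     # 1x1 branch
            nn.Conv2d(in_ch, mid, 1, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU())
        self.branch3 = nn.Sequential(                     # 3x3 branch
            nn.Conv2d(in_ch, out_ch - mid, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch - mid), nn.ReLU())
        self.attn = ChannelAttention(out_ch)
        self.skip = (nn.Conv2d(in_ch, out_ch, 1, bias=False)
                     if in_ch != out_ch else nn.Identity())

    def forward(self, x):
        y = torch.cat([self.branch1(x), self.branch3(x)], dim=1)
        return F.relu(self.attn(y) + self.skip(x))        # residual connection


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard KD loss: soft teacher targets plus hard-label cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard


if __name__ == "__main__":
    block = MultiBranchResidualBlock(1, 32)
    print(block(torch.randn(2, 1, 28, 28)).shape)         # torch.Size([2, 32, 28, 28])
```

A student network built from such blocks would be trained with distillation_loss against the logits of a pretrained ResNet teacher, then deployed on a device such as the Raspberry Pi.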
Authors  LI Guangyan (黎光艳); WANG Xiuhui (王修晖) (Key Laboratory of Electromagnetic Wave Information Technology and Metrology of Zhejiang Province, College of Information Engineering, China Jiliang University, Hangzhou 310018, China)
Source  Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core Journal), 2023, Issue 5, pp. 115-121 (7 pages)
Funding  Zhejiang Provincial Natural Science Foundation (LY20F020018); Zhejiang Provincial Key Research and Development Program (2021C03151).
Keywords  handwritten character recognition; residual structure; attention mechanism; ResNet; knowledge distillation