Handwritten Character Recognition Method Based on Multi-Branch Lightweight Residual Network (cited by: 3)

Authors: LI Guangyan; WANG Xiuhui (Key Laboratory of Electromagnetic Wave Information Technology and Metrology of Zhejiang Province, College of Information Engineering, China Jiliang University, Hangzhou 310018, China)

Affiliation: [1] Key Laboratory of Electromagnetic Wave Information Technology and Metrology of Zhejiang Province, College of Information Engineering, China Jiliang University, Hangzhou 310018, China

Source: Computer Engineering and Applications, 2023, No. 5, pp. 115-121 (7 pages)

Funding: Natural Science Foundation of Zhejiang Province (LY20F020018); Key R&D Program of Zhejiang Province (2021C03151).

Abstract: Handwritten digits are prone to adhesion (touching strokes), which degrades the accuracy of character segmentation and recognition. In addition, deep learning models usually have high computational complexity, which prevents them from running efficiently on resource-constrained devices. To address these problems, a handwritten character recognition method based on a multi-branch lightweight residual network is proposed. To handle character adhesion, 90 classes of composite digits are constructed and mixed with MNIST and 7 arithmetic symbols to form the experimental dataset. The method fuses the ResNet residual structure with an attention mechanism and, borrowing the Inception idea, adopts a multi-branch structure to improve the network's feature learning ability; the lightweight network then learns from the deep ResNet through knowledge distillation. Experiments on the resulting 107-class handwritten character dataset show that the method achieves accuracy comparable to that of deep networks while greatly reducing model complexity, enabling high-accuracy recognition on low-end terminals such as the Raspberry Pi.
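The abstract states that the lightweight network learns from a deep ResNet teacher through knowledge distillation. The record gives no hyperparameters, so the following is only a minimal NumPy sketch of the standard temperature-softened distillation loss (soft-target KL term plus hard-label cross-entropy); the temperature `T=4.0`, weight `alpha=0.7`, and function names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax along the last axis (numerically stable)."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style distillation loss (hyperparameters here are assumed, not from the paper).

    alpha * T^2 * KL(teacher_soft || student_soft) + (1 - alpha) * CE(student, labels)
    """
    eps = 1e-12
    # Soft targets: both distributions softened by the same temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher + eps) - np.log(p_student + eps)), axis=-1)
    # Hard-label cross-entropy on the unsoftened student outputs.
    q = softmax(student_logits)
    ce = -np.log(q[np.arange(len(labels)), labels] + eps)
    # T^2 rescales soft-target gradients to stay comparable to the hard term.
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

In this setup the teacher (a deep ResNet in the paper) is run in inference only; its logits supervise the lightweight multi-branch student, which for the paper's setting would output 107 class scores.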

Keywords: handwritten character recognition; residual structure; attention mechanism; ResNet; knowledge distillation

CLC number: TP391 (Automation and Computer Technology: Computer Application Technology)

 
