A Short Text Classification Model Based on Chinese Part-of-Speech Information and Mutual Learning  


Authors: Yihe Deng, Zuxu Dai

Affiliation: [1] School of Mathematics and Physics, Wuhan Institute of Technology, Wuhan 430205, China

Source: Proceedings of the International Conference of Pioneering Computer Scientists, Engineers and Educators (ICPCSEE), 2023, No. 2, pp. 330-343 (14 pages)

Abstract: Short text classification is a common task in natural language processing. Because short texts carry little information, there is still considerable room for improvement in the performance of short text classification models. This paper proposes a new short text classification model, ML-BERT, based on the idea of mutual learning. ML-BERT comprises one BERT that uses only word vector information and another that fuses word information with part-of-speech information, and it introduces a transmission flag to control the information transfer between the two BERTs, simulating a mutual learning process between the two models. Experimental results show that the ML-BERT model obtains a MAF1 score of 93.79% on the THUCNews dataset. Compared with the representative models Text-CNN, Text-RNN, and BERT, the MAF1 score improves by 8.11%, 6.69%, and 1.69%, respectively.
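The abstract describes two BERT classifiers training each other through mutual learning. The paper's exact objective is not given here; a common formulation of mutual learning trains each model on its own cross-entropy loss plus a KL-divergence term that pulls its predictions toward its peer's. The sketch below is a minimal illustration of those per-model losses under that assumption, with hypothetical function names and plain-Python probability math standing in for the two BERTs' output logits:

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class."""
    return -math.log(probs[label])

def kl_div(p, q):
    """KL(p || q): how far q's predictions are from p's."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def mutual_learning_losses(logits_a, logits_b, label):
    """Per-model losses in a two-model mutual learning setup:
    each model fits the label (cross-entropy) while being nudged
    toward its peer's predicted distribution (KL term)."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    loss_a = cross_entropy(pa, label) + kl_div(pb, pa)
    loss_b = cross_entropy(pb, label) + kl_div(pa, pb)
    return loss_a, loss_b
```

In ML-BERT itself, the abstract says this exchange is additionally gated by a transmission flag, so peer information flows only when the flag permits; that gating is omitted here.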

Keywords: Natural language processing; Neural network; Chinese short text classification; BERT; Mutual deep learning

Classification: TP3 [Automation and Computer Technology - Computer Science and Technology]

 
