Language-Independent Text Tokenization Using Unsupervised Deep Learning  


Authors: Hanan A. Hosni Mahmoud, Alaaeldin M. Hafez, Eatedal Alabdulkreem

Affiliations: [1] Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh, 11671, Saudi Arabia; [2] Department of Information Systems, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia

Source: Intelligent Automation & Soft Computing, 2023, No. 1, pp. 321-334 (14 pages)

Funding: Funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R113), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Abstract: Language-independent text tokenization can aid in the classification of low-resource languages. There is a global research effort to generate text classification for any language. Human text classification is a slow procedure; consequently, text summary generation across different languages using machine text classification has been considered in recent years. There is no research on machine text classification for many languages, such as Czech, Rome, and Urdu. This research proposes a cross-language text tokenization model using a Transformer technique. The proposed Transformer employs an encoder with ten layers, each with a self-attention encoding sublayer and a feedforward sublayer. This model improves the efficiency of text classification by providing a draft text classification for a number of documents. We also propose a novel sub-word tokenization model based on frequent vocabulary usage in the documents. The Sub-Word Byte-Pair Tokenization technique (SBPT) utilizes the sharing of the vocabulary of one sentence with other sentences. The proposed sub-word tokenization model improves on other sub-word tokenization models, such as the byte-pair encoding model, by +10% in precision.
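The abstract does not specify how SBPT differs from standard byte-pair encoding, the baseline it is compared against. As context, here is a minimal sketch of standard BPE merge learning on a toy word-frequency vocabulary; all names and the corpus are illustrative, not taken from the paper:

```python
import re
from collections import Counter

def get_pair_counts(vocab):
    # Count adjacent symbol pairs, weighted by word frequency.
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    # Join the chosen pair wherever it occurs as two adjacent symbols;
    # the lookarounds keep the match aligned to symbol boundaries.
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

def learn_bpe(vocab, num_merges):
    # Greedily merge the most frequent symbol pair, num_merges times.
    for _ in range(num_merges):
        pairs = get_pair_counts(vocab)
        if not pairs:
            break
        vocab = merge_pair(max(pairs, key=pairs.get), vocab)
    return vocab

# Toy vocabulary: words pre-split into characters, with corpus frequencies.
corpus = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}
merged = learn_bpe(corpus, 10)
```

After ten merges, frequent words such as "newest" and "widest" collapse into single sub-word units, while rarer words remain partially split — the frequency-driven vocabulary sharing that SBPT builds on.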

Keywords: text classification; language-independent tokenization; sub-word tokenization

Classification Code: TP391.1 (Automation and Computer Technology — Computer Application Technology)

 
