Accelerating BERT inference with GPU-efficient exit prediction  


Authors: Lei LI, Chengyu WANG, Minghui QIU, Cen CHEN, Ming GAO, Aoying ZHOU

Affiliations: [1] Shanghai Engineering Research Center of Big Data Management, School of Data Science and Engineering, East China Normal University, Shanghai 200062, China; [2] Alibaba Group, Hangzhou 311121, China; [3] KLATASDS-MOE, School of Statistics, East China Normal University, Shanghai 200062, China

Published in: Frontiers of Computer Science, 2024, Issue 3, pp. 31-42 (12 pages)

Funding: Supported by the National Natural Science Foundation of China (Grant Nos. U1911203, 61877018, 61977025, 62202170) and by Alibaba Group through the Alibaba Innovation Research Program.

Abstract: BERT is a representative pre-trained language model that has drawn extensive attention for its significant improvements on downstream Natural Language Processing (NLP) tasks. Its complex architecture and massive parameters give BERT competitive performance but also make model inference slow. To speed up BERT inference, FastBERT realizes adaptive inference with an acceptable drop in accuracy, based on knowledge distillation and the early-exit technique. However, several factors may limit the performance of FastBERT, such as a teacher classifier that is not knowledgeable enough, batch size shrinkage, and redundant computation in the student classifiers. To overcome these limitations, we propose a new BERT inference method with GPU-Efficient Exit Prediction (GEEP). GEEP leverages a shared exit loss to simplify the training process of FastBERT from two steps into one, and makes the teacher classifier more knowledgeable by feeding diverse Transformer outputs to it. In addition, an exit layer prediction technique is proposed that uses a GPU hash table to maintain the token-level exit layer distribution and sorts test samples by their predicted exit layers. In this way, GEEP avoids batch size shrinkage and redundant computation in the student classifiers. Experimental results on twelve public English and Chinese NLP datasets demonstrate the effectiveness of the proposed approach. The source code of GEEP will be released to the public upon paper acceptance.
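The batching idea summarized in the abstract can be illustrated with a minimal sketch: if each test sample has a predicted exit layer (e.g., looked up from a token-level table), grouping samples by that prediction lets every batch run exactly to its exit layer, avoiding batch size shrinkage. The function name and inputs below are hypothetical, not the authors' released implementation.

```python
from collections import defaultdict

def sort_by_predicted_exit(samples, predicted_exit_layers):
    """Group samples by predicted exit layer so each batch exits together.

    `predicted_exit_layers[i]` is the layer at which sample i is expected
    to exit. All samples in one bucket run the same number of Transformer
    layers, so no sample drops out of the batch mid-inference.
    """
    buckets = defaultdict(list)
    for sample, layer in zip(samples, predicted_exit_layers):
        buckets[layer].append(sample)
    # Return batches ordered from shallowest exit to deepest.
    return [(layer, buckets[layer]) for layer in sorted(buckets)]

# Example: six samples predicted to exit at layers 2, 2, 4, 2, 6, 4.
groups = sort_by_predicted_exit(list("abcdef"), [2, 2, 4, 2, 6, 4])
# groups -> [(2, ['a', 'b', 'd']), (4, ['c', 'f']), (6, ['e'])]
```

Each returned group can then be fed to the model as a full-sized batch that stops at its shared exit layer.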

Keywords: BERT; FastBERT; inference acceleration; model distillation; early exit; text classification

Classification: TP181 [Automation and Computer Technology / Control Theory and Control Engineering]

 
