Neural Network Compression Method Combining Half-Wave Gaussian Quantization and Alternate Update (Cited by: 4)


Authors: ZHANG Hongmei [1]; YAN Haibing; ZHANG Xiangli [1]

Affiliation: [1] Guangxi Colleges and Universities Key Laboratory of Cloud Computing and Complex Systems, Guilin University of Electronic Technology, Guilin, Guangxi 541004, China

Source: Computer Engineering, 2021, No. 5, pp. 80-87 (8 pages)

Funding: National Natural Science Foundation of China (61461010); Key Laboratory of Cognitive Radio and Information Processing, Ministry of Education (CRKL170103, CRKL170104); Guangxi Key Laboratory of Cryptography and Information Security (GCIS201626).

Abstract: To enable the deployment of neural network models on edge devices with limited memory and strict real-time requirements, this paper proposes a hybrid compression method combining Half-Wave Gaussian Quantization (HWGQ) and alternate update. A 2-bit uniform HWGQ is applied to the inputs of the neural network model, and the quantized values are fed into a binary network with a scaling factor, which is trained to obtain an initial binary model. The trained binary model is then fine-tuned layer by layer using the alternate update method to improve test accuracy. Experimental results on the CIFAR-10 and ImageNet datasets show that the proposed method effectively reduces the memory and time overhead caused by parameter and structural redundancy. At a model compression ratio of about 30, test accuracy improves by 0.8 and 2.0 percentage points over the HWGQ-Net method, and training is accelerated by a factor of 10.
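The two quantization steps named in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the step size `delta` is a hypothetical value (in HWGQ-Net it is fitted to a half-Gaussian activation distribution), and the scaled binarization follows the common XNOR-Net-style formulation assumed for "a binary network with a scaling factor".

```python
import numpy as np

def uniform_hwgq(x, bits=2, delta=0.538):
    """Uniform half-wave Gaussian quantization (sketch).

    Negative pre-activations are clipped to 0 (half-wave), then
    positive values are rounded to the nearest of 2**bits - 1 uniform
    levels k * delta, with overflow clipped to the top level.
    `delta` here is an assumed step size, not the paper's value.
    """
    x = np.maximum(x, 0.0)            # half-wave rectification
    max_level = 2 ** bits - 1         # 3 nonzero levels for 2 bits
    q = np.round(x / delta)           # nearest uniform level index
    q = np.clip(q, 0, max_level)      # clip overflow to the top level
    return q * delta

def binarize_with_scale(w):
    """Binary weights with a per-layer scaling factor (assumed
    XNOR-Net-style): alpha * sign(W), where alpha = mean(|W|)."""
    alpha = np.mean(np.abs(w))
    return alpha * np.sign(w)
```

For example, with the default `delta`, an input of 5.0 saturates at the top level 3 * 0.538, while negative inputs map to 0; the alternate update described in the abstract would then refine such a binarized model layer by layer.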

Keywords: convolutional neural network; quantization; model compression; half-wave Gaussian quantization; alternate update

CLC number: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]
