Authors: Biao ZHANG [1,2]; Pengbo YANG [1,2]; Jitao SANG; Jian YU
Affiliations: [1] School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China; [2] Beijing Key Laboratory of Traffic Data Analysis and Mining (Beijing Jiaotong University), Beijing 100044, China
Source: Scientia Sinica Informationis (《中国科学:信息科学》), 2021, Issue 1, pp. 13-26 (14 pages)
Funding: National Key R&D Program of China (Grant No. 2017YFC1703506); Key Projects of the National Natural Science Foundation of China (Grant Nos. 61632004, 61832002, 61672518).
Abstract: Although deep learning models have recently achieved remarkable performance in many tasks, they require a massive memory footprint and computing power to make accurate predictions. Researchers have proposed a number of compression methods to reduce the capacity and computation of such models so that deep learning can be deployed on resource-constrained terminal devices. Based on the pruning framework, this paper proposes two pruning methods from the perspective of filter importance evaluation. (1) Since every filter can learn unique features, we propose an attribution mechanism to evaluate the correlation between the features learned by a filter and the causal features. Filters with low correlation are pruned, which compresses the model while retaining the attribution characteristics of the original model; this method is called attribution pruning. (2) The second method, built on an iterative optimization pruning framework, uses the positively correlated components of a channel's activations and gradients to evaluate filter importance, improving the accuracy with which redundant filters are identified; it is called Taylor-guided pruning. We evaluate both methods on the VGGNet and ResNet architectures. Extensive experiments demonstrate that attribution pruning largely preserves the attribution characteristics of the original model, and that both methods achieve better compression than current mainstream pruning methods.
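For intuition about the kind of filter-importance scoring the abstract describes, below is a minimal sketch, not the authors' exact algorithm, of a first-order Taylor-style criterion for ranking convolutional filters in PyTorch. The score per output channel is the mean of |activation × gradient| over one labeled mini-batch; the model, layer choice, and sizes are illustrative assumptions only.

```python
import torch
import torch.nn as nn

def taylor_filter_importance(model, conv_layer, inputs, targets, criterion):
    """Return one importance score per output channel of `conv_layer`
    using the first-order Taylor criterion |activation * gradient|."""
    cache = {}

    def hook(module, inp, out):
        out.retain_grad()          # keep the gradient of this feature map
        cache["feat"] = out

    handle = conv_layer.register_forward_hook(hook)
    model.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    handle.remove()

    feat = cache["feat"]                          # shape: (N, C, H, W)
    # Average |activation * gradient| over batch and spatial dims
    # to obtain one score per channel; low scores mark pruning candidates.
    return (feat * feat.grad).abs().mean(dim=(0, 2, 3))

if __name__ == "__main__":
    # Toy model and data, for illustration only.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
    )
    x = torch.randn(8, 3, 32, 32)
    y = torch.randint(0, 10, (8,))
    scores = taylor_filter_importance(model, model[0], x, y, nn.CrossEntropyLoss())
    print(scores.argsort()[:4])   # indices of the least important filters
```

In an iterative pruning loop of the kind the abstract mentions, such scores would be recomputed after each round of pruning and brief fine-tuning, so that importance estimates track the current state of the network.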