Authors: Zhang Yu (张玉), Wu Hai (武海), Lin Fanchao (林凡超), Huang Fuyu (黄福玉), Liu Yizhi (刘毅志)[4]
Affiliations: [1] College of Information Science and Technology, Zhengzhou Normal University, Zhengzhou 450044, China; [2] Department of Information Science and Technology, University of Science and Technology of China, Hefei 230026, China; [3] Beijing Research Institute, University of Science and Technology of China, Beijing 100193, China; [4] Department of Computer Science and Engineering, Hunan University of Science and Technology, Xiangtan 411201, China
Source: Journal of Nanjing University of Science and Technology (《南京理工大学学报》), 2023, No. 5, pp. 699-707 (9 pages)
Funding: Key Program of the National Natural Science Foundation of China (U19B2023); Training Plan for Young Backbone Teachers in Undergraduate Universities of Henan Province (2021GGJS170); Key Scientific Research Project of the Education Department of Hunan Province (19A172).
Abstract: To explore how to guide model pruning with a small amount of image data, while reducing the time needed to determine which convolution kernels to prune, this paper proposes a model pruning strategy based on the expected value of the sum of each convolution kernel's output feature map. A small number of images are fed into the unpruned deep learning model, the feature maps output by the convolution kernels of the same layer are sorted by the expected value of their sums, and the kernels corresponding to the smaller expected values are pruned at a given pruning rate. The proposed strategy is evaluated on three public benchmark datasets, CIFAR-10, CIFAR-100, and ILSVRC-2012, and compared with several mainstream network pruning algorithms. Experiments show that the strategy compresses VGG-16-BN by 87.3% in parameters and 78.6% in floating-point operations (FLOPs), while the pruned model still reaches 93.19% classification accuracy on CIFAR-10. On CIFAR-100, ResNet-56 pruned with 67% FLOPs compression still achieves 67.96% classification accuracy. (A minimal code sketch of this filter-ranking criterion is given after the record below.)
Classification code: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]
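The following is a minimal, hypothetical PyTorch sketch (not the authors' released code) of the ranking criterion described in the abstract: for one convolutional layer, each output feature map computed from a small image batch is summed, the per-filter sums are averaged over the batch as an estimate of their expected value, and the filters with the smallest expected sums are marked for pruning at a given pruning rate. The function name filters_to_prune, the tensor shapes, and the example layer sizes are all assumptions for illustration.

# Sketch of the pruning criterion from the abstract (PyTorch assumed;
# not the authors' implementation).
import torch
import torch.nn as nn

def filters_to_prune(conv: nn.Conv2d, images: torch.Tensor, prune_rate: float):
    """Return indices of the filters whose output feature maps have the
    smallest expected sum over the given small batch of images.
    `images` is assumed to already be the input tensor seen by this layer."""
    with torch.no_grad():
        fmaps = conv(images)              # (N, C_out, H, W)
        sums = fmaps.sum(dim=(2, 3))      # sum of each feature map: (N, C_out)
        expected = sums.mean(dim=0)       # batch average as the expected value: (C_out,)
    n_prune = int(prune_rate * conv.out_channels)
    # Filters with the smallest expected summed activation are pruned first.
    return torch.argsort(expected)[:n_prune].tolist()

# Illustrative usage on a CIFAR-sized batch (all sizes are assumptions).
if __name__ == "__main__":
    conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    batch = torch.randn(8, 3, 32, 32)
    print(filters_to_prune(conv, batch, prune_rate=0.5))

In a full pipeline this ranking would be applied layer by layer to decide which output channels of a network such as VGG-16-BN or ResNet-56 to remove before fine-tuning; that removal and fine-tuning step is not shown here.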