Author: 闫雅茹 (Yaru Yan), School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai
Source: Modeling and Simulation (《建模与仿真》), 2025, Issue 1, pp. 353-365 (13 pages)
Abstract: Deep neural networks (DNNs) are widely used in computer vision tasks, but the high computational complexity of their large-scale parameters hinders deployment in resource-constrained environments. This paper proposes Balancing Magnitude and Similarity for Filter Pruning (MASFIP). In each pruning iteration, filters in each layer are selected for temporary pruning via a scaling factor α, and permanent pruning is then performed according to the network loss. Once the designated floating-point operations (FLOPs) budget is reached, a small number of retraining steps are taken to mitigate the sharp drop in model accuracy. Pruning experiments with VGGNet-16 and ResNet models on the CIFAR-10 and CIFAR-100 datasets show that, on CIFAR-10, MASFIP removes 60.6% and 52.9% of FLOPs from VGGNet-16 and ResNet-56 respectively, with accuracy improvements of 0.16% and 0.14%; on CIFAR-100, it removes 39.1% of FLOPs from ResNet-56 with only a 0.05% accuracy drop.
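The abstract describes the selection criterion only at a high level. Below is a minimal sketch of how a per-layer filter score that balances magnitude against similarity might look; the scoring formula, the use of maximum pairwise cosine similarity as a redundancy measure, the function name filter_scores, and the exact role of α are all illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical magnitude-similarity filter score in the spirit of MASFIP.
# All details below are assumptions for illustration, not the authors' method.
import torch

def filter_scores(weight: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Score each output filter of a conv layer by balancing its magnitude
    against its redundancy (similarity to other filters in the same layer).

    weight: conv weight of shape (out_channels, in_channels, kH, kW)
    alpha:  assumed trade-off between the magnitude and similarity terms
    Returns a 1-D tensor of scores; lower scores mark pruning candidates.
    """
    flat = weight.flatten(1)                        # (out_channels, -1)
    magnitude = flat.norm(p=2, dim=1)               # L2 norm per filter
    mag_n = magnitude / (magnitude.max() + 1e-12)   # normalize to [0, 1]
    unit = flat / (magnitude.unsqueeze(1) + 1e-12)  # unit vectors for cosine
    cos = unit @ unit.t()                           # pairwise cosine similarity
    cos.fill_diagonal_(0.0)
    # A filter that strongly resembles another one is redundant; take its
    # maximum similarity to any other filter as the redundancy measure.
    redundancy = cos.abs().max(dim=1).values
    # Small magnitude and high redundancy -> low score -> prune first.
    return alpha * mag_n - (1.0 - alpha) * redundancy

# Usage: rank the filters of one layer and pick temporary-pruning candidates.
conv_w = torch.randn(64, 32, 3, 3)
scores = filter_scores(conv_w, alpha=0.5)
candidates = scores.argsort()[:4]   # e.g. the 4 lowest-scoring filters
```

In the method as described, such temporarily pruned filters would then be kept or permanently removed based on the resulting change in network loss, with brief retraining once the FLOPs target is reached.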
Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]