Authors: Wujie SUN, Defang CHEN, Can WANG, Deshi YE, Yan FENG, Chun CHEN
Affiliation: [1] College of Computer Science and Technology, Zhejiang University, Hangzhou 310000, China
Source: Frontiers of Information Technology & Electronic Engineering, 2024, No. 4, pp. 585-599 (15 pages)
Funding: supported by the National Natural Science Foundation of China (No. U1866602) and the Starry Night Science Fund of Zhejiang University Shanghai Institute for Advanced Study, China (No. SN-ZJU-SIAS-001).
Abstract: The multi-exit architecture allows early-stop inference to reduce computational cost, which makes it suitable for resource-constrained settings. Recent works combine the multi-exit architecture with self-distillation to simultaneously achieve high efficiency and decent performance at different network depths. However, existing methods mainly transfer knowledge from deep exits or a single ensemble to guide all exits, without considering that inappropriate learning gaps between students and teachers may degrade model performance, especially at shallow exits. To address this issue, we propose Multi-exit self-distillation with Appropriate TEachers (MATE) to provide diverse and appropriate teacher knowledge for each exit. In MATE, multiple ensemble teachers are obtained from all exits with different trainable weights. Each exit subsequently receives knowledge from all teachers, while focusing mainly on its primary teacher to keep an appropriate gap for efficient knowledge transfer. In this way, MATE achieves diversity in knowledge distillation while ensuring learning efficiency. Experimental results on CIFAR-100, TinyImageNet, and three fine-grained datasets demonstrate that MATE consistently outperforms state-of-the-art multi-exit self-distillation methods across various network architectures.
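The abstract's core mechanism (per-exit ensemble teachers built from all exits, with each exit distilling mainly from its primary teacher) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the ensemble weights are fixed here rather than trainable, and the primary-teacher emphasis `alpha` and temperature `T` are hypothetical values chosen for the toy example.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q) per sample, summed over classes.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

# Toy setup: K = 3 exits, C = 5 classes, batch of B = 2.
rng = np.random.default_rng(0)
K, C, B = 3, 5, 2
exit_logits = rng.normal(size=(K, B, C))  # logits produced by each exit

# One ensemble teacher per exit, combining ALL exits' logits.
# In MATE these combination weights are trainable; fixed here.
teacher_weights = softmax(rng.normal(size=(K, K)))      # each row sums to 1
teachers = np.einsum('ke,ebc->kbc', teacher_weights, exit_logits)

alpha, T = 0.7, 4.0  # hypothetical primary-teacher weight and temperature
losses = []
for k in range(K):
    student = softmax(exit_logits[k], T=T)
    # Distill mainly from the primary (k-th) teacher...
    primary = kl(softmax(teachers[k], T=T), student).mean()
    # ...but still receive knowledge from the other teachers.
    others = np.mean([kl(softmax(teachers[j], T=T), student).mean()
                      for j in range(K) if j != k])
    losses.append(alpha * primary + (1 - alpha) * others)

total_distill_loss = float(np.sum(losses))
```

Each exit's loss blends a strongly weighted term from its own ensemble teacher with a lightly weighted term from the remaining teachers, which is one way to keep the student-teacher gap moderate while preserving teacher diversity.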
Keywords: Multi-exit architecture; Knowledge distillation; Learning gap
CLC number: TP181 [Automation and Computer Technology - Control Theory and Control Engineering]