DISTILLATION

Works: 339    Citations: 442    H-index: 10
Related authors: 何洪巨, 武晓颖, 金幼菊, 李学成, 刘晓媛
Related institutions: 国家蔬菜工程技术研究中心, Tianjin University, Beijing Institute of Technology, Beijing Forestry University
Related funding: National Natural Science Foundation of China, National Key Basic Research and Development Program of China (973 Program), National High Technology Research and Development Program of China (863 Program), China Postdoctoral Science Foundation

Search results (showing records 1-10)
Ensemble Knowledge Distillation for Federated Semi-Supervised Image Classification
Tsinghua Science and Technology, 2025, Issue 1, pp. 112-123 (12 pages). Authors: Ertong Shang, Hui Liu, Jingyang Zhang, Runqi Zhao, Junzhao Du
Supported by the National Natural Science Foundation of China (Nos. 62032017 and 62272368); Key Talent Project of Xidian University (No. QTZX24004); Innovation Capability Support Program of Shaanxi (No. 2023-CX-TD-08); Shaanxi Qinchuangyuan “Scientists+Engineers” Team (No. 2023KXJ-040); Science and Technology Program of Xi’an (No. 23KGDW0005-2022).
Federated learning is an emerging privacy-preserving distributed learning paradigm, in which many clients collaboratively train a shared global model under the orchestration of a remote server. Most current works on fed...
Keywords: federated learning; semi-supervised learning; federated semi-supervised learning; knowledge distillation
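The keywords above center on knowledge distillation. As general background only (not code from this paper), the classical formulation trains a student against a teacher's temperature-softened outputs alongside the usual hard-label loss; a minimal PyTorch-style sketch follows, where the temperature T and weighting alpha are illustrative choices. In a federated semi-supervised setting, the teacher signal would typically come from an ensemble of client models rather than a single network.

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard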
KD-Crowd: a knowledge distillation framework for learning from crowds
Frontiers of Computer Science, 2025, Issue 1, pp. 119-130 (12 pages). Authors: Shaoyuan LI, Yuxiang ZHENG, Ye SHI, Shengjun HUANG, Songcan CHEN
Supported by the National Key R&D Program of China (2022ZD0114801); the National Natural Science Foundation of China (Grant No. 61906089); the Jiangsu Province Basic Research Program (BK20190408).
Recently, crowdsourcing has established itself as an efficient labeling solution by distributing tasks to crowd workers. As the workers can make mistakes with diverse expertise, one core learning task is to estimate e...
Keywords: crowdsourcing; label noise; worker expertise; knowledge distillation; robust learning
KD-SegNet: Efficient Semantic Segmentation Network with Knowledge Distillation Based on Monocular Camera
Computers, Materials & Continua, 2025, Issue 2, pp. 2001-2026 (26 pages). Authors: Thai-Viet Dang, Nhu-Nghia Bui, Phan Xuan Tan
Funded by Hanoi University of Science and Technology (HUST) under project number T2023-PC-008.
Due to the necessity for lightweight and efficient network models, deploying semantic segmentation models on mobile robots (MRs) is a formidable task. The fundamental limitation of the problem lies in the training per...
Keywords: Mobile robot navigation; semantic segmentation; knowledge distillation; pyramid scene parsing; fully convolutional networks
Unsupervised Low-Light Image Enhancement Based on Explicit Denoising and Knowledge Distillation
Computers, Materials & Continua, 2025, Issue 2, pp. 2537-2554 (18 pages). Authors: Wenkai Zhang, Hao Zhang, Xianming Liu, Xiaoyu Guo, Xinzhe Wang, Shuiwang Li
Supported by the Guangxi Natural Science Foundation (Grant No. 2024GXNSFAA010484) and the National Natural Science Foundation of China (No. 62466013).
Under low-illumination conditions, the quality of image signals deteriorates significantly, typically characterized by a peak signal-to-noise ratio (PSNR) below 10 dB, which severely limits the usability of the images...
Keywords: Deep learning; low-light image enhancement; real-time processing; knowledge distillation
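The abstract above quantifies low-light degradation as a PSNR below 10 dB. For reference only (this is the standard definition, not code from the paper), PSNR = 10·log10(MAX² / MSE); a small NumPy sketch for 8-bit images follows.

import numpy as np

def psnr(reference, test, max_val=255.0):
    # Mean squared error between reference and test images (cast to float first).
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: PSNR is unbounded
    return 10.0 * np.log10((max_val ** 2) / mse)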
Optimizing BERT for Bengali Emotion Classification: Evaluating Knowledge Distillation, Pruning, and Quantization
Computer Modeling in Engineering & Sciences, 2025, Issue 2, pp. 1637-1666 (30 pages). Authors: Md Hasibur Rahman, Mohammed Arif Uddin, Zinnat Fowzia Ria, Rashedur M. Rahman
The rapid growth of digital data necessitates advanced natural language processing (NLP) models like BERT (Bidirectional Encoder Representations from Transformers), known for its superior performance in text classificati...
Keywords: Bengali NLP; black-box distillation; emotion classification; model compression; post-training quantization; unstructured pruning
Success of DeepSeek and potential benefits of free access to AI for global-scale use
International Journal of Agricultural and Biological Engineering, 2025, Issue 1, pp. 304-306 (3 pages). Authors: Samuel Ariyo Okaiyeto, Junwen Bai, Jun Wang, Arun S Mujumdar, Hongwei Xiao
The introduction of DeepSeek R1, an AI language model developed by the Chinese AI lab DeepSeek, has made a significant impact in the tech world [1]. Within a week of its release, the app surged to the top of download chart...
Keywords: AI; DeepSeek; reinforcement learning; model distillation; free access; global-scale utilization
State surveillance and fault diagnosis of distillation columns using residual network-based passive acoustic monitoring
Chinese Journal of Chemical Engineering, 2025, Issue 1, pp. 248-258 (11 pages). Authors: Haotian Zheng, Zhixi Zhang, Guangyan Wang, Yatao Wang, Jun Liang, Weiyi Su, Yuqi Hu, Xiong Yu, Chunli Li, Honghai Wang
Supported by the National Natural Science Foundation of China (22308079); the Natural Science Foundation of Hebei Province, China (B2022202008, B2023202025); the Science and Technology Project of Hebei Education Department, China (BJK2022037).
The operational state of distillation columns significantly impacts product quality and production efficiency. However, due to the complex operation and diverse influencing factors, ensuring the safety and efficient oper...
Keywords: distillation column; acoustic signal; neural network
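The entry above diagnoses column faults by feeding acoustic signals to a residual network. As architectural background only (the layer sizes are illustrative assumptions, not taken from the paper), a basic residual block adds an identity shortcut around two convolutions, which is what keeps such networks trainable at depth; a PyTorch sketch follows.

import torch.nn as nn

class BasicResidualBlock(nn.Module):
    # Generic ResNet-style block: two 3x3 convolutions with an identity shortcut.
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # residual (shortcut) connection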
Experiments on thermal miscible rules of different gas media and crude oil
Petroleum Exploration and Development, 2024, Issue 6, pp. 1556-1563 (8 pages). Authors: XI Changfeng, ZHAO Fang, WANG Bojun, LIU Tong, QI Zongyao, LIU Peng
Supported by the PetroChina Science and Technology Project (2023ZG18).
High temperature and high pressure visualization pressure-volume-temperature (PVT) experiments on different gas media and crude oil were carried out using the interface disappearance method. There are two miscible temperatur...
Keywords: thermal miscible flooding; miscible rules; miscible zone; PVT experiment; distillation; phase transition; minimum miscible pressure; minimum miscible temperature
Big2Small: Learning from masked image modelling with heterogeneous self-supervised knowledge distillation
IET Cyber-Systems and Robotics, 2024, Issue 4, pp. 62-72 (11 pages). Authors: Ziming Wang, Shumin Han, Xiaodi Wang, Jing Hao, Xianbin Cao, Baochang Zhang
Small convolutional neural network (CNN)-based models usually require transferring knowledge from a large model before they are deployed in computationally resource-limited edge devices. Masked image modelling (MIM) method...
Keywords: artificial intelligence; deep neural network; machine intelligence; machine learning; vision
MMDistill: Multi-Modal BEV Distillation Framework for Multi-View 3D Object Detection
Computers, Materials & Continua, 2024, Issue 12, pp. 4307-4325 (19 pages). Authors: Tianzhe Jiao, Yuming Chen, Zhe Zhang, Chaopeng Guo, Jie Song
Supported by the National Natural Science Foundation of China (Grant No. 62302086); the Natural Science Foundation of Liaoning Province (Grant No. 2023-MSBA-070); the Fundamental Research Funds for the Central Universities (Grant No. N2317005).
Multi-modal 3D object detection has achieved remarkable progress, but it is often limited in practical industrial production because of its high cost and low efficiency. The multi-view camera-based method provides a fea...
Keywords: 3D object detection; multi-modal; knowledge distillation; deep learning; remote sensing
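Distillation frameworks of the kind named above typically pair an output-level loss with feature imitation: intermediate student features (for example, from a camera-only BEV branch) are projected to the teacher's channel width and pulled toward the frozen teacher features. The sketch below shows only that generic feature-imitation loss; the module name and tensor shapes are assumptions for illustration, not the paper's implementation.

import torch.nn as nn
import torch.nn.functional as F

class FeatureImitationLoss(nn.Module):
    # Aligns student feature maps to the teacher's channel width, then penalizes the L2 gap.
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        aligned = self.proj(student_feat)                   # (B, C_teacher, H, W)
        return F.mse_loss(aligned, teacher_feat.detach())   # teacher side is not updated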