High-order Takagi-Sugeno-Kang fuzzy knowledge distillation classifier and its application to EEG signal classification

English title: TSK fuzzy distillation classifier with negative Euclidean probability and high-order fuzzy dark knowledge transfer and its application to EEG signal classification


Authors: JIANG Yunliang[1,2,3]; YIN Zezong; ZHANG Xiongtao[1,2]; SHEN Qing; LI Hua

Affiliations: [1] School of Information Engineering, Huzhou University, Huzhou 313000, Zhejiang, China; [2] Zhejiang Province Key Laboratory of Smart Management and Application of Modern Agricultural Resources, Huzhou 313000, Zhejiang, China; [3] School of Computer Science and Technology, Zhejiang Normal University, Jinhua 321004, Zhejiang, China; [4] College of Mathematical Medicine, Zhejiang Normal University, Jinhua 321004, Zhejiang, China

Source: CAAI Transactions on Intelligent Systems, 2024, Issue 6, pp. 1419-1427 (9 pages)

Funding: National Natural Science Foundation of China (U22A20102, 62376094); Zhejiang Province "Pioneer" and "Leading Goose" R&D Program (2023C01150).

Abstract: In the classification and detection of electroencephalogram (EEG) signals, low-order Takagi-Sugeno-Kang (TSK) fuzzy classifiers train quickly but perform poorly, while high-order TSK fuzzy classifiers achieve strong predictive performance at the cost of extremely complex fuzzy-rule consequents that severely slow the model down. To address this trade-off, this paper proposes a novel TSK fuzzy distillation classifier, STSK-LLM-KD (solved TSK-least learning machine-knowledge distillation classifier), based on negative Euclidean probability and high-order fuzzy dark knowledge transfer. First, the proposed least learning machine based on knowledge distillation (LLM-KD) quickly solves the consequent parameters of the teacher model and converts its outputs into negative Euclidean probabilities, which serve as soft labels. Then, the high-order fuzzy dark knowledge of the teacher model is extracted by computing the Kullback-Leibler divergence between soft labels and is transferred to the low-order student model, so that the student outperforms the high-order TSK fuzzy classifier while training faster. Experimental results on a motor imagery EEG dataset and the Hauz Khas (New Delhi) epilepsy EEG dataset fully verify the advantages of STSK-LLM-KD: it outperforms other fuzzy classifiers, and compared with deep knowledge distillation models it improves the performance of the student model more effectively.
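The abstract describes a two-step pipeline: soft labels derived from negative Euclidean probabilities of the teacher, then a Kullback-Leibler distillation loss against the student. The following is a minimal NumPy sketch of that idea, assuming soft labels come from a temperature softmax over negative squared Euclidean distances to per-class target vectors; the function names negative_euclidean_soft_labels and kl_distillation_loss, the one-hot target vectors, and the exact form of the probability are illustrative assumptions, not the paper's published equations.

    import numpy as np

    def negative_euclidean_soft_labels(outputs, class_targets, temperature=2.0):
        """Soft labels from a temperature softmax over negative squared
        Euclidean distances to per-class target vectors (assumed form).

        outputs       : (n_samples, d) raw teacher/student output vectors
        class_targets : (n_classes, d) one target vector per class
        """
        # Squared Euclidean distance from every sample to every class target.
        dists = ((outputs[:, None, :] - class_targets[None, :, :]) ** 2).sum(axis=-1)
        logits = -dists / temperature                # "negative Euclidean" logits
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        exp = np.exp(logits)
        return exp / exp.sum(axis=1, keepdims=True)

    def kl_distillation_loss(teacher_soft, student_soft, eps=1e-12):
        """Mean KL(teacher || student) between the two sets of soft labels."""
        t = np.clip(teacher_soft, eps, 1.0)
        s = np.clip(student_soft, eps, 1.0)
        return float(np.mean(np.sum(t * (np.log(t) - np.log(s)), axis=1)))

    # Toy usage: 3 samples, 2 classes, one-hot class target vectors.
    teacher_out = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
    student_out = np.array([[0.7, 0.3], [0.3, 0.7], [0.5, 0.5]])
    targets = np.eye(2)
    loss = kl_distillation_loss(
        negative_euclidean_soft_labels(teacher_out, targets),
        negative_euclidean_soft_labels(student_out, targets),
    )

In the paper this distillation term would presumably be combined with an ordinary supervised loss when training the low-order student; the sketch above shows only the knowledge-transfer term.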

Keywords: TSK fuzzy classifier; knowledge distillation; high-order fuzzy dark knowledge; EEG signals; least learning machine; epilepsy; motor imagery; fuzzy systems

CLC number: TP181 (Automation and Computer Technology: Control Theory and Control Engineering)

 
