Super-resolution reconstruction of retinal OCT image using multi-teacher knowledge distillation network


Authors: Chen Minghui [1]; Lu Yanqi; Yang Wenyi; Wang Yuanzhu; Shao Yi [3]

Affiliations: [1] Shanghai Engineering Research Center of Interventional Medical Devices, Medical Optical Engineering Center of the Ministry of Education, School of Health Sciences and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China; [2] Shanghai Raykeen Laser Technology Co., Ltd., Shanghai 200120, China; [3] Shanghai General Hospital, Shanghai 200080, China

Source: Opto-Electronic Engineering, 2024, No. 7, pp. 95-106 (12 pages)

Funding: Shanghai Science and Technology Commission Industry-University-Research-Medicine Cooperation Project (15DZ1940400)

Abstract: Optical coherence tomography (OCT) is widely used in ophthalmic diagnosis and adjuvant therapy, but its imaging quality is inevitably degraded by speckle noise and motion artifacts. This paper proposes MK-OCT, a multi-teacher knowledge distillation network for OCT super-resolution, which uses teacher networks with complementary strengths to train a balanced, lightweight, and efficient student network. The efficient channel distillation (ECD) method used in MK-OCT also enables the model to better preserve the texture information of retinal images, meeting clinical needs. Experimental results show that, compared with classical super-resolution networks, the proposed model performs well in both reconstruction accuracy and perceptual quality, with a smaller model size and lower computational cost.
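The abstract describes the distillation scheme only at a high level. As a rough illustration of how a lightweight student super-resolution network can be trained against several teachers with different strengths, the PyTorch-style sketch below combines a supervised reconstruction term with weighted output-matching terms, one per teacher. The function and parameter names (multi_teacher_distillation_loss, teacher_weights, alpha) are hypothetical and not taken from the paper, and the paper's ECD channel distillation is not reproduced here.

```python
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_sr, teacher_srs, hr_target,
                                    teacher_weights=None, alpha=0.5):
    """Illustrative multi-teacher distillation loss for SR (not the paper's exact formulation).

    student_sr  : super-resolved output of the lightweight student network
    teacher_srs : list of outputs from teacher networks with different strengths
                  (e.g., one tuned for fidelity, one for perceptual quality)
    hr_target   : ground-truth high-resolution OCT image
    """
    if teacher_weights is None:
        teacher_weights = [1.0 / len(teacher_srs)] * len(teacher_srs)

    # Supervised reconstruction term against the ground-truth image.
    recon_loss = F.l1_loss(student_sr, hr_target)

    # Distillation term: pull the student output toward each teacher's output.
    distill_loss = sum(w * F.mse_loss(student_sr, t.detach())
                       for w, t in zip(teacher_weights, teacher_srs))

    return alpha * recon_loss + (1.0 - alpha) * distill_loss
```

In a training loop, the teacher outputs would typically be computed under torch.no_grad() and the relative weights chosen to balance fidelity-oriented and perception-oriented teachers; the published method additionally distills channel-wise feature information rather than only matching outputs.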

Keywords: medical image; optical coherence tomography; super-resolution; knowledge distillation; contrastive learning

CLC number: TP391 [Automation and Computer Technology / Computer Application Technology]
