Authors: CHEN Yu [1]; XU Shibao (College of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China)
Affiliation: [1] College of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China
Source: Journal of Harbin University of Science and Technology, 2024, No. 1, pp. 87-95 (9 pages)
Funding: National Natural Science Foundation of China (62172087); Fundamental Research Funds for the Central Universities (2572021BH01).
Abstract: To address the loss of key information during downsampling and the poor robustness of diabetic retinopathy (DR) detection models, a PM-Net (Parallel Multi-scale Network) model is constructed. Multi-scale max-pooling and multi-scale convolution modules are designed with an information-enhancement strategy and used to improve ResNet-50 during downsampling. To further improve robustness, the model is extended with a two-branch architecture. The proposed multi-scale modules allow the model to capture richer retinal fundus image features during downsampling, improving DR detection performance, while the proposed two-branch model uses local information to assist global information during DR detection, ensuring robustness. The model was experimentally validated on the EyePACS, DDR, and a private dataset. Experimental results show that, compared with mainstream models, its accuracy and quadratic weighted kappa score on the EyePACS dataset improve by 2.58% and 1.31%, respectively. (An illustrative architecture sketch follows this record.)
Keywords: diabetic retinopathy; multi-scale; parallel network; max pooling; ResNet-50
Classification code: TP391.4 [Automation and Computer Technology: Computer Application Technology]
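
The abstract only names PM-Net's building blocks (multi-scale max pooling, multi-scale convolution, an improved ResNet-50 backbone, and a two-branch global/local design); the paper's exact layer configuration is not reproduced in this record. As a rough, non-authoritative illustration of those ideas, the PyTorch sketch below shows a hypothetical multi-scale downsampling block and a hypothetical two-branch wrapper around ResNet-50. The module names, kernel sizes, crop strategy, and fusion step are assumptions made for illustration, not PM-Net's actual configuration.

```python
# Hypothetical sketch of the ideas named in the abstract: multi-scale
# max pooling / multi-scale convolution for downsampling, and a
# two-branch (global + local) network built on ResNet-50. Kernel sizes,
# the crop strategy, and the fusion step are illustrative assumptions,
# not the paper's actual PM-Net configuration.
import torch
import torch.nn as nn
from torchvision.models import resnet50


class MultiScaleDownsample(nn.Module):
    """Halve the spatial resolution with parallel max-pooling and
    convolution branches at several kernel sizes, then fuse them."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Parallel strided max-pooling branches (stride 2 downsamples).
        self.pools = nn.ModuleList(
            nn.MaxPool2d(kernel_size=k, stride=2, padding=k // 2)
            for k in (3, 5, 7)
        )
        # Parallel strided convolutions with different receptive fields.
        self.convs = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=k, stride=2, padding=k // 2)
            for k in (1, 3, 5)
        )
        # 1x1 convolution fuses all branches back to out_ch channels.
        fused_ch = in_ch * 3 + out_ch * 3
        self.fuse = nn.Sequential(
            nn.Conv2d(fused_ch, out_ch, kernel_size=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [p(x) for p in self.pools] + [c(x) for c in self.convs]
        # With the paddings above, all branches share the same output
        # size for even input resolutions; crop defensively otherwise.
        h = min(f.shape[-2] for f in feats)
        w = min(f.shape[-1] for f in feats)
        feats = [f[..., :h, :w] for f in feats]
        return self.fuse(torch.cat(feats, dim=1))


class TwoBranchDRNet(nn.Module):
    """Global branch sees the whole fundus image; local branch sees an
    upsampled center crop; features are concatenated for DR grading."""

    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.global_branch = resnet50(weights=None)
        self.local_branch = resnet50(weights=None)
        feat_dim = self.global_branch.fc.in_features  # 2048 for ResNet-50
        self.global_branch.fc = nn.Identity()
        self.local_branch.fc = nn.Identity()
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, _, h, w = x.shape
        # Local view: central crop resized back to the input resolution.
        crop = x[..., h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
        crop = nn.functional.interpolate(
            crop, size=(h, w), mode="bilinear", align_corners=False
        )
        g = self.global_branch(x)
        loc = self.local_branch(crop)
        return self.classifier(torch.cat([g, loc], dim=1))


if __name__ == "__main__":
    block = MultiScaleDownsample(in_ch=64, out_ch=128)
    print(block(torch.randn(2, 64, 56, 56)).shape)  # torch.Size([2, 128, 28, 28])
    model = TwoBranchDRNet(num_classes=5)
    print(model(torch.randn(2, 3, 224, 224)).shape)  # torch.Size([2, 5])
```

In a design like this, concatenating pooling and convolution branches preserves responses at several receptive-field sizes before the resolution is halved, which is one plausible way to realize the "information enhancement during downsampling" described in the abstract; likewise, feeding an enlarged central crop to a second backbone is one plausible way for local information to assist the global branch.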