Authors: JIN Qianqian; LUO Jian; ZHANG Xiaoqian; YANG Mei; LI Yang (School of Electronic Information Engineering, Xihua Normal University, Nanchong 637009, China)
Affiliation: [1] School of Electronic Information Engineering, Xihua Normal University, Nanchong 637009, Sichuan, China
Source: Journal of Chengdu University of Information Technology, 2023, No. 6, pp. 673-680 (8 pages)
Abstract: Owing to complex background information and imbalanced target categories, small and medium-sized targets in remote sensing images are prone to false and missed detections during segmentation. To address this, an improved segmentation method for small and medium-sized targets in remote sensing images based on DeepLabV3p is proposed. ResNet101 is adopted as the backbone of DeepLabV3p, and an ASPP module with multi-level receptive-field fusion is proposed to capture a wider range of receptive fields. An SE attention mechanism is added in the decoding stage so that the model obtains more accurate channel information. A weighted combination of CrossEntropyLoss and LovaszSoftmaxLoss is used for training to overcome target-category imbalance in the dataset. Finally, a fully connected conditional random field (CRF) is applied as image post-processing to refine the model output. Experimental results show that on the DLRSD dataset the method reaches an mIoU of 73.22%, 3.78 percentage points higher than the baseline network, effectively improving the segmentation precision and accuracy for small and medium-sized targets in remote sensing images.
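The abstract combines several standard components. As an illustration only (not the authors' code), a minimal NumPy sketch of two of them might look like this: the squeeze-and-excitation (SE) channel-attention step the method adds to the decoder, and the mIoU metric used to report the 73.22% result. The function names, layer shapes, and reduction structure here are assumptions chosen for clarity.

```python
import numpy as np

def se_attention(features, w_reduce, w_expand):
    """Squeeze-and-Excitation channel attention (illustrative NumPy sketch).

    features : (C, H, W) feature map
    w_reduce : (C//r, C) first FC layer (channel reduction)
    w_expand : (C, C//r) second FC layer (channel expansion)
    """
    # Squeeze: global average pooling over spatial dimensions -> (C,)
    squeezed = features.mean(axis=(1, 2))
    # Excitation: FC -> ReLU -> FC -> sigmoid, giving per-channel weights
    hidden = np.maximum(w_reduce @ squeezed, 0.0)
    scale = 1.0 / (1.0 + np.exp(-(w_expand @ hidden)))  # (C,) in (0, 1)
    # Scale: reweight each channel of the input feature map
    return features * scale[:, None, None]

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes present in pred or target."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```

In the paper's setting, the SE block would operate on multi-channel decoder features and the trainable FC weights would be learned jointly with the network; the sketch only shows the data flow of squeeze, excitation, and channel rescaling.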
Keywords: DeepLabV3p; remote sensing image; SE attention mechanism; ASPP; fully connected conditional random field (CRF); hybrid loss function
Classification code: TP75 [Automation and Computer Technology — Detection Technology and Automatic Equipment]