Authors: WANG Xiliang; WANG Runqi; QU Zunhao (School of Transportation, Shijiazhuang Tiedao University, Shijiazhuang 050043, China; School of Civil Engineering, Shijiazhuang Tiedao University, Shijiazhuang 050043, China)
Affiliations: [1] School of Transportation, Shijiazhuang Tiedao University, Shijiazhuang 050043, Hebei, China; [2] School of Civil Engineering, Shijiazhuang Tiedao University, Shijiazhuang 050043, Hebei, China
Source: Journal of Shijiazhuang Tiedao University (Natural Science Edition), 2024, No. 3, pp. 69-74, 81 (7 pages)
Abstract: Crack-type pavement damage is a common defect on urban roads, and efficiently extracting crack information from large volumes of pavement images is a key problem in current research. To achieve higher-accuracy image recognition and analysis, this paper draws on deep learning and performs pavement crack image segmentation based on the Res2Unet-CBAM network model. Experimental results on public datasets were analyzed and compared against several semantic segmentation models, including U-Net and Res2Net; on this basis, a Res2-Unet multi-scale pavement crack segmentation network was designed. After analyzing common channel attention modules, the CBAM channel attention module was introduced into the Res2-Unet segmentation model, yielding a new Res2Unet-CBAM network for the pavement crack segmentation task. Comparing the experimental results of the Res2Unet-CBAM model with those of other deep learning models shows that the proposed model achieves better image segmentation performance.
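The abstract's core modification is inserting a CBAM channel attention module into the Res2-Unet segmentation model. As an illustration only (the paper's actual implementation and framework are not given here), the channel-attention half of CBAM can be sketched in NumPy: average- and max-pooled channel descriptors pass through a shared two-layer MLP, and the sigmoid of their sum rescales each channel. All names, shapes, and the reduction ratio below are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam_channel_attention(feat, w1, w2):
    """Sketch of CBAM channel attention (illustrative, not the paper's code).

    feat: (C, H, W) feature map
    w1:   (C//r, C) first layer of the shared MLP (reduction ratio r)
    w2:   (C, C//r) second layer of the shared MLP
    """
    avg_desc = feat.mean(axis=(1, 2))            # (C,) global average pooling
    max_desc = feat.max(axis=(1, 2))             # (C,) global max pooling
    mlp = lambda d: w2 @ np.maximum(w1 @ d, 0.0) # shared MLP with ReLU
    scale = sigmoid(mlp(avg_desc) + mlp(max_desc))  # (C,) weights in (0, 1)
    return feat * scale[:, None, None]           # rescale each channel

# toy usage: 8 channels, reduction ratio r = 4, random weights
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 16, 16))
w1 = rng.standard_normal((2, 8)) * 0.1
w2 = rng.standard_normal((8, 2)) * 0.1
out = cbam_channel_attention(feat, w1, w2)
print(out.shape)  # same shape as the input feature map
```

In the full CBAM module this channel attention is followed by a spatial attention step; the abstract refers specifically to the channel attention component introduced into Res2-Unet.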
Classification code: U41 [Transportation Engineering: Road and Railway Engineering]