Authors: TAN Rui-jun, ZHAO Zhi-cheng [1,2], XIE Xin-lin, ZHANG Da-heng
Affiliations: [1] School of Electronic Information Engineering, Taiyuan University of Science and Technology, Taiyuan 030024, China; [2] Shanxi Key Laboratory of Advanced Control and Equipment Intelligence, Taiyuan 030024, China
Source: Journal of Taiyuan University of Science and Technology, 2024, No. 1, pp. 26-31 (6 pages)
Funding: Natural Science Foundation of Shanxi Province (201901D211304).
Abstract: Aiming at the low accuracy and incomplete segmentation edges that fully convolutional networks produce for small targets, a semantic segmentation algorithm for vehicle scene images based on a separable convolutional residual network is proposed. First, a series of separable residual network blocks is used to extract the edge features of small targets more thoroughly. Then, skip connections and 2x deconvolution are used to up-sample the feature maps of the five Layer modules to obtain the segmentation result. During training, the contours of the targets in the image are learned first and their detailed features afterwards, which raises the overall segmentation accuracy. Experiments are conducted on the CamVid dataset. The results show that, compared with the original fully convolutional network, the mean intersection over union of the algorithm increases from 76.85% to 83.30%, the segmentation boundaries of small targets are more complete, and the segmentation accuracy is effectively improved.
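To make the two components named in the abstract concrete, the following is a minimal PyTorch sketch of a depthwise-separable residual block and of 2x transposed-convolution upsampling fused with an encoder skip connection. It is an assumed illustration, not the authors' released code; all class names, channel sizes, and layer choices are illustrative.

```python
import torch
import torch.nn as nn


class SeparableResidualBlock(nn.Module):
    """Residual block built from depthwise-separable convolutions (assumed layout)."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            # depthwise 3x3 followed by pointwise 1x1 = one separable convolution
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels, bias=False),
            nn.Conv2d(channels, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels, bias=False),
            nn.Conv2d(channels, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # identity shortcut preserves fine edge detail of small objects
        return self.act(x + self.body(x))


class UpsampleWithSkip(nn.Module):
    """2x deconvolution of a deep feature map fused with a shallower skip feature."""

    def __init__(self, in_ch: int, skip_ch: int, out_ch: int):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        self.proj = nn.Conv2d(skip_ch, out_ch, kernel_size=1)

    def forward(self, x, skip):
        # FCN-style fusion: upsample by 2, then add the projected skip feature
        return self.up(x) + self.proj(skip)


if __name__ == "__main__":
    block = SeparableResidualBlock(64)
    up = UpsampleWithSkip(in_ch=64, skip_ch=32, out_ch=32)
    deep = torch.randn(1, 64, 45, 60)      # e.g. a downsampled CamVid feature map
    shallow = torch.randn(1, 32, 90, 120)  # encoder feature at twice the resolution
    print(block(deep).shape)               # torch.Size([1, 64, 45, 60])
    print(up(block(deep), shallow).shape)  # torch.Size([1, 32, 90, 120])
```

In the paper's pipeline this upsampling step would be repeated across the five Layer modules until the prediction returns to the input resolution; the exact channel widths and number of blocks are not given in the abstract and are left as assumptions here.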
Keywords: separable residual network; skip connection; vehicle semantic segmentation; CamVid
Classification: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]