Authors: GENG Lei (耿磊)[1]; SHI Rui-zi (史瑞资); LIU Yan-bei (刘彦北)[1,3]; XIAO Zhi-tao (肖志涛)[1]; WU Jun (吴骏)[2]; ZHANG Fang (张芳)[1]
Affiliations: [1] School of Life Sciences, Tiangong University, Tianjin 300387, China; [2] School of Electronics and Information Engineering, Tiangong University, Tianjin 300387, China; [3] Tianjin Key Laboratory of Optoelectronic Detection Technology and Systems, Tiangong University, Tianjin 300387, China
Source: Computer Engineering and Design (《计算机工程与设计》), 2022, No. 3, pp. 771-777 (7 pages)
Fund: Natural Science Foundation of Tianjin (18JCYBJC15300).
Abstract: To solve the problem that existing deep-learning image segmentation algorithms cannot effectively segment small, dense targets in pointer instrument images, an instrument image segmentation method based on a UNet with multiple receptive fields was proposed. An autoencoder (encoder-decoder) structure was combined with atrous (dilated) convolutions so that multi-scale shallow features were fused with deep semantic information. The model was trained on a pointer instrument dataset collected under various light intensities to fully improve the generalization ability of the neural network, and the atrous convolution parameters were adjusted in parallel so that the network learned the optimal model. Experimental results show that the proposed algorithm significantly improves the segmentation of dense small targets in pointer instrument images and generalizes effectively to the same type of pointer instrument images collected under different light intensities, verifying the effectiveness of the model.
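The abstract's core idea, running atrous convolutions with several dilation rates in parallel and fusing their outputs inside a UNet-style encoder-decoder, can be illustrated with a short sketch. The PyTorch block below is a hypothetical rendering under assumed dilation rates (1, 2, 4) and channel sizes; it is not the paper's published architecture or parameter setting.

```python
# A minimal sketch of a multi-receptive-field block: parallel atrous (dilated)
# 3x3 convolutions fused by a 1x1 convolution, suitable for insertion into a
# UNet-style encoder-decoder. Dilation rates and channel widths are illustrative
# assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn


class MultiReceptiveFieldBlock(nn.Module):
    """Parallel atrous convolutions with different dilation rates, fused by a 1x1 conv."""

    def __init__(self, in_channels: int, out_channels: int, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                # padding=rate keeps the spatial size unchanged for a 3x3 kernel
                nn.Conv2d(in_channels, out_channels, kernel_size=3,
                          padding=rate, dilation=rate, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for rate in rates
        ])
        # 1x1 convolution fuses the concatenated multi-scale responses
        self.fuse = nn.Conv2d(out_channels * len(rates), out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [branch(x) for branch in self.branches]
        return self.fuse(torch.cat(features, dim=1))


if __name__ == "__main__":
    block = MultiReceptiveFieldBlock(in_channels=64, out_channels=64)
    dummy = torch.randn(1, 64, 128, 128)   # e.g. an encoder feature map
    print(block(dummy).shape)              # torch.Size([1, 64, 128, 128])
```

In a UNet, such a block would typically replace or follow an encoder stage, and its output would be passed through the skip connection so the decoder receives multi-scale shallow features alongside deeper semantic ones.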