Authors: Ma Wei (马伟), Gong Chaofan (龚超凡), Xu Shibiao (徐士彪)[2], Zhang Xiaopeng (张晓鹏)[2]
Affiliations: [1] Faculty of Information Technology, Beijing University of Technology, Beijing 100124; [2] National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100190
Source: Journal of Computer-Aided Design & Computer Graphics, 2021, No. 6, pp. 855-863 (9 pages)
Funding: National Natural Science Foundation of China (61771026, 61971418, 61671451).
Abstract: Object contours detected by existing methods are generally inaccurately located, thick, and contaminated by inner edges. A top-down guided fusion network (TDGF-Net) is proposed for object contour detection. First, TDGF-Net extracts multi-scale features using a common convolutional neural network as its backbone. Then, considering that edges in lower-level features are more precisely located but contain non-contour noise, while higher-level features are more helpful for discriminating contours from non-contours, TDGF-Net gradually fuses features at adjacent scales in a top-down manner, using the higher-level features to enhance contour edges and suppress non-contour noise in the lower levels. Finally, an improved binary cross-entropy loss is presented to train the network to generate object contours. The network is evaluated on the public SBD dataset in a PyTorch environment. Quantitative and qualitative results show that, compared with state-of-the-art methods, the proposed network produces more accurately located, thinner, and cleaner object contours.
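A minimal PyTorch sketch of the two ideas summarized in the abstract: fusing a higher-level (coarser) feature map into a lower-level one in a top-down manner, and a class-balanced binary cross-entropy loss for the heavily imbalanced contour/non-contour classification. The module name `TopDownFusion`, the sigmoid-gating design, and the HED-style class weighting are illustrative assumptions, not the paper's exact TDGF-Net components.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopDownFusion(nn.Module):
    """Fuse a higher-level feature map into an adjacent lower-level one.

    The high-level map is upsampled to the low-level resolution, mapped by a
    1x1 conv and a sigmoid into a soft gate, and the gated low-level features
    are refined by a 3x3 conv. This is one plausible reading of "using
    higher-level features to enhance contours and suppress non-contour
    noise", not the paper's exact module.
    """

    def __init__(self, high_ch, low_ch):
        super().__init__()
        self.gate = nn.Conv2d(high_ch, low_ch, kernel_size=1)
        self.refine = nn.Conv2d(low_ch, low_ch, kernel_size=3, padding=1)

    def forward(self, high, low):
        # Upsample the gate to the lower level's spatial resolution.
        g = F.interpolate(self.gate(high), size=low.shape[2:],
                          mode="bilinear", align_corners=False)
        # Suppress low-level responses where the high level sees no contour.
        return self.refine(low * torch.sigmoid(g))


def balanced_bce(pred, target, eps=1e-6):
    """Class-balanced binary cross entropy, a common edge-detection variant
    (HED-style); the paper's improved loss may differ in its weighting."""
    pos = target.sum()
    neg = target.numel() - pos
    # Weight each class by the other's frequency to counter the imbalance
    # between sparse contour pixels and abundant background pixels.
    weight = torch.where(target > 0.5,
                         neg / (pos + neg + eps),
                         pos / (pos + neg + eps))
    return F.binary_cross_entropy(pred, target, weight=weight)
```

In a full network this fusion block would be applied repeatedly from the deepest backbone stage down to the shallowest, with the final refined map supervised by the balanced loss.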
Classification: TP391.41 (Automation and Computer Technology: Computer Application Technology)