Authors: NONG Yuan-jun; WANG Jun-jie; CHEN Hong; SUN Wen-han; GENG Hui; LI Shu-yue (School of Engineering, Ocean University of China, Qingdao 266100, China)
Source: Journal of Zhejiang University (Engineering Science), 2022, No. 2, pp. 236-244 (9 pages)
Fund: Key Research and Development Program of Shandong Province (2019GHY112081).
Abstract: A construction scene image captioning method based on an attention mechanism and an encoder-decoder architecture was proposed to generate image descriptions in complex construction scenes such as poor lighting, night construction, and distant dense small targets. A convolutional neural network was used as the encoder to extract rich visual features from construction images. A long short-term memory network was used as the decoder to capture the semantic features of words within sentences and to learn the mapping between image features and word semantics. An attention mechanism was introduced to focus on salient features, suppress non-salient features, and reduce interference from noise. To verify the effectiveness of the proposed method, an image captioning dataset covering ten common construction scenes was constructed. Experimental results show that the proposed method achieves high accuracy, delivers good captioning performance in complex construction scenes such as poor lighting, night construction, and distant dense small targets, and exhibits strong generalization and adaptability.
Classification number: TP391 [Automation and Computer Technology - Computer Application Technology]
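The abstract describes a CNN encoder, an LSTM decoder, and an attention module that weights salient image regions at each decoding step, but this record does not give the paper's exact configuration. Below is a minimal, illustrative PyTorch sketch of that general encoder-attention-decoder pattern; the backbone choice (ResNet-50), all dimensions, the additive soft-attention formulation, and the class names are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch only: generic CNN encoder + soft attention + LSTM decoder
# for image captioning, in the spirit of the architecture the abstract describes.
# Backbone, dimensions, and attention formulation are assumptions, not the paper's.
import torch
import torch.nn as nn
import torchvision.models as models

class Encoder(nn.Module):
    """CNN encoder: extracts a grid of visual features from an image."""
    def __init__(self):
        super().__init__()
        resnet = models.resnet50(weights=None)                 # backbone is an assumption
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])

    def forward(self, images):                                 # (B, 3, H, W)
        feats = self.backbone(images)                          # (B, 2048, h, w)
        return feats.flatten(2).permute(0, 2, 1)               # (B, h*w, 2048)

class Attention(nn.Module):
    """Additive (soft) attention over spatial feature locations."""
    def __init__(self, feat_dim, hid_dim, att_dim):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, att_dim)
        self.hid_proj = nn.Linear(hid_dim, att_dim)
        self.score = nn.Linear(att_dim, 1)

    def forward(self, feats, hidden):
        e = self.score(torch.tanh(self.feat_proj(feats) +
                                  self.hid_proj(hidden).unsqueeze(1)))  # (B, L, 1)
        alpha = torch.softmax(e, dim=1)                        # weights on salient regions
        context = (alpha * feats).sum(dim=1)                   # (B, feat_dim)
        return context, alpha.squeeze(-1)

class Decoder(nn.Module):
    """LSTM decoder: generates the caption one word at a time."""
    def __init__(self, vocab_size, feat_dim=2048, emb_dim=256, hid_dim=512):
        super().__init__()
        self.hid_dim = hid_dim
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attend = Attention(feat_dim, hid_dim, att_dim=256)
        self.lstm = nn.LSTMCell(emb_dim + feat_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, feats, captions):                        # captions: (B, T)
        B, T = captions.shape
        h = feats.new_zeros(B, self.hid_dim)
        c = feats.new_zeros(B, self.hid_dim)
        logits = []
        for t in range(T):
            context, _ = self.attend(feats, h)                 # attend to image features
            x = torch.cat([self.embed(captions[:, t]), context], dim=1)
            h, c = self.lstm(x, (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)                      # (B, T, vocab)

# Usage sketch: caption logits for two images and 5-token (teacher-forced) captions.
enc, dec = Encoder(), Decoder(vocab_size=1000)
imgs = torch.randn(2, 3, 224, 224)
caps = torch.randint(0, 1000, (2, 5))
print(dec(enc(imgs), caps).shape)                              # torch.Size([2, 5, 1000])
```

Such a model is typically trained with cross-entropy loss against ground-truth captions; the dataset of ten construction scenes mentioned in the abstract would supply the image-caption pairs, though its format is not described in this record.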