Authors: WANG Jun-jie [1]; NONG Yuan-jun; ZHANG Li-te; ZHAI Pei-chen (School of Engineering, Ocean University of China, Qingdao 266100, China)
Source: Journal of Jilin University (Engineering and Technology Edition), 2023, No. 1, pp. 226-233 (8 pages)
Fund: Shandong Provincial Key Research and Development Program (2019GHY112081).
摘 要:鉴于施工现场中工人与施工机械及施工用具之间不合规的交互关系是引发安全事故的重要原因,提出了一种基于施工场景的视觉关系检测方法。首先,采用卷积神经网络搭建实体检测和关系检测分支,以提取出施工场景中的实体特征和关系特征;其次,构建视觉模块、语义模块和空间模块对提取出的特征进行学习,使网络充分感知和理解视觉信息、语义信息与空间信息;最后,设计了一种图形对比损失函数,以提高模型的视觉关系检测性能。在自制的施工场景关系检测数据集上的实验结果表明,本文方法实现了75.89%、77.64%、78.93%的R@20、R@50、R@100召回率,具有良好的视觉关系检测性能,能精准地检测出施工场景中的目标及其交互关系。The non-compliant interaction between workers,construction machinery and construction appliances in the construction site is an important cause of safety accidents.Therefore,a visual relationship detection method based on construction scene is proposed.Firstly,convolution neural network is used to build entity detection and relationship detection branches to extract entity features and relationship features in construction scene.Secondly,visual module,semantic module and space module are constructed to learn the extracted features,so that the network can fully perceive and understand visual information,semantic information and spatial information.Finally,a graphical contrastive loss function is designed to improve the visual relationship detection performance.The experimental results on the self-made construction relationship detection data set show that the proposed method achieves the R@20,R@50,R@100 recall rate of 75.89%,77.64%and 78.93%.The proposed method has good visual relationship detection performance,and can accurately detect the objects and their interactions in the construction scene.
Keywords: computer application technology; visual relationship detection; construction scene; convolutional neural network; scene graph; image understanding
Classification Number: TP319.4 [Automation and Computer Technology - Computer Software and Theory]