Authors: 沈嘉禾 SHEN Jiahe; 袁冬莉 YUAN Dongli[1]; 杨征帆 YANG Zhengfan; 闫建国 YAN Jianguo[1]; 肖冰 XIAO Bing; 邢小军 XING Xiaojun[1] (School of Automation, Northwestern Polytechnical University, Xi'an 710072, China)
Affiliation: [1] School of Automation, Northwestern Polytechnical University, Xi'an 710072, Shaanxi, China
Source: Journal of Northwestern Polytechnical University (《西北工业大学学报》), 2022, No. 4, pp. 787-795 (9 pages)
Funding: Supported by the Natural Science Basic Research Program of Shaanxi Province (2020JM-123).
Abstract: With the development of aerial refueling technology, autonomous aerial refueling (AAR) has become an important technology on the future battlefield and is a prospective, challenging research topic. Because the relative position between the receiver aircraft and the drogue is essential to accomplishing the AAR task, a neural-network-based drogue image recognition method is proposed. First, to meet the hardware constraints, a C-language-based YOLO network is used as the initial network; it satisfies the requirements of the on-board VxWorks operating system and can run directly on the embedded hardware. Second, considering the physical characteristics of the drogue, multi-dimensional anchor boxes are designed and the network structure is optimized to handle the drogue's varying apparent sizes. Finally, to address drift in the recognition results, feature maps of several sizes, whose design follows the pyramid structure, are used together with an optimized loss function. Test results show that the optimized convolutional neural network model recognizes the required target on the drogue image dataset more accurately and more quickly.
Classification: TP181 [Automation and Computer Technology - Control Theory and Control Engineering]