Authors: Rui HUANG, Bokun DUAN, Yuxiang ZHANG, Wei FAN
Affiliation: [1] School of Computer Science and Technology, Civil Aviation University of China, Tianjin 300300, China
Source: Chinese Journal of Aeronautics, 2022, Issue 10, pp. 222-232 (11 pages)
Funding: Natural Science Foundation of Tianjin, China (No. 20JCQNJC00720).
Abstract: Deep learning-based methods have achieved remarkable success in object detection, but this success requires a large number of training images. Collecting sufficient training images is difficult for airplane engine damage detection. Directly augmenting images by rotation, flipping, and random cropping cannot further improve the generalization ability of existing deep models. We propose an interactive augmentation method for airplane engine damage images that uses a prior-guided GAN to augment training images. Our method can generate many types of damages on arbitrary image regions according to user strokes. The proposed model consists of a prior network and a GAN. The prior network generates a shape prior vector that encodes the information of the user strokes. The GAN takes the shape prior vector and random noise vectors to generate candidate damages. Final damages are pasted at the given positions on background images with an improved Poisson fusion. We compare the proposed method with traditional data augmentation methods by training airplane engine damage detectors with state-of-the-art object detectors, namely, Mask R-CNN, SSD, and YOLO v5. Experimental results show that training with images generated by our data augmentation method achieves better detection performance than training with traditional data augmentation methods.
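The fusion step described in the abstract can be illustrated with standard Poisson image editing, which OpenCV exposes as cv2.seamlessClone. The sketch below is a minimal, hypothetical example of pasting a generated damage patch onto an engine background at a user-chosen position; it uses plain Poisson blending rather than the paper's improved fusion, and the file names, mask construction, and paste position are assumptions for illustration only.

```python
import cv2
import numpy as np

def paste_damage(background, damage_patch, center):
    """Blend a generated damage patch onto a background image at `center`
    using OpenCV's Poisson (seamless) cloning. This is the standard Poisson
    fusion, not the improved variant described in the paper."""
    # Mask covering the whole patch; a tighter mask derived from the user's
    # stroke shape would restrict blending to the damage region only.
    mask = 255 * np.ones(damage_patch.shape[:2], dtype=np.uint8)
    # NORMAL_CLONE transfers the patch gradients into the background so the
    # pasted damage matches the local illumination and texture.
    return cv2.seamlessClone(damage_patch, background, mask, center, cv2.NORMAL_CLONE)

# Hypothetical usage: "engine.jpg" is a background engine image and
# "damage.png" is one candidate damage image produced by the generator.
bg = cv2.imread("engine.jpg")
patch = cv2.imread("damage.png")
augmented = paste_damage(bg, patch, center=(256, 256))
cv2.imwrite("augmented.jpg", augmented)
```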
Keywords: Airplane engine; Damage detection; Data augmentation; GAN; Interactive
Classification: V263.6 [Aeronautical and Astronautical Science and Technology - Aerospace Manufacturing Engineering]