Authors: LU Peng; ZHAO Yaqin[1]; CHEN Yue; SUN Yichao; XU Yuan
Affiliation: [1] College of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing 210031, China
Source: Fire Safety Science, 2020, No. 3, pp. 142-149 (8 pages)
Fund: National Natural Science Foundation of China, Youth Science Fund Project (31200496).
Abstract: Existing fire flame image recognition methods produce false and missed detections in complex environments containing flame-like interference such as lighting and red flowers. To address this, a flame region marking method for complex environments based on SSD_MobileNet is proposed. First, the VGG16 base network of the SSD300 deep convolutional neural network is replaced with MobileNet, whose depthwise separable convolutions reduce the number of network parameters, yielding an SSD_MobileNet model for flame image detection. Then, all convolutional layer parameters of the initially trained model are transferred to initialize the new model to be trained. Finally, new data samples are added to weaken the influence of interfering objects such as lighting and red flowers. Experiments comparing the model with SSD300 and with the deep learning object detectors Faster R-CNN and YOLOv3-tiny show that the proposed SSD_MobileNet flame detection and flame region marking model achieves better overall performance than Faster R-CNN and YOLOv3-tiny and is better suited to real-time flame detection.
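The parameter reduction described in the abstract comes from the depthwise separable convolution that MobileNet uses in place of standard convolutions. The following is a minimal illustrative sketch, assuming PyTorch and hypothetical layer sizes; it is not the authors' implementation, only a demonstration of the building block and the scale of the parameter saving.

import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """One MobileNet-style block: a per-channel 3x3 depthwise convolution
    followed by a 1x1 pointwise convolution that mixes channels."""
    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        # Depthwise: groups=in_channels applies one 3x3 filter per channel.
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                                   stride=stride, padding=1,
                                   groups=in_channels, bias=False)
        self.bn1 = nn.BatchNorm2d(in_channels)
        # Pointwise: a 1x1 convolution recombines the channels.
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                   bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.depthwise(x)))
        return self.relu(self.bn2(self.pointwise(x)))

# Parameter comparison against a standard 3x3 convolution of the same shape
# (128 -> 256 channels is a hypothetical size, not taken from the paper).
standard = nn.Conv2d(128, 256, kernel_size=3, padding=1, bias=False)
separable = DepthwiseSeparableConv(128, 256)
print(sum(p.numel() for p in standard.parameters()))   # 294912
print(sum(p.numel() for p in separable.parameters()))  # 34688, roughly 8.5x fewer

# Sanity check on an SSD300-sized 38x38 feature map.
x = torch.randn(1, 128, 38, 38)
print(separable(x).shape)  # torch.Size([1, 256, 38, 38])

Replacing the standard convolutions of the backbone with blocks of this kind is what makes the SSD_MobileNet detector light enough for the real-time use case targeted by the paper.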
Keywords: flame recognition; complex environment; deep learning; SSD_MobileNet
Classification code: X915.5 [Environmental Science and Engineering - Safety Science]