Author: Guo Zifan
Affiliation: [1] School of Computer Science and Engineering, Shenyang Jianzhu University, Shenyang, Liaoning
Source: Software Engineering and Applications, 2023, No. 2, pp. 336-344 (9 pages)
Abstract: Wild butterflies are diverse in species and widely distributed, and they are sensitive to changes in their living environment; monitoring the survival of butterfly populations within a region therefore helps to track and assess changes in the quality of the local ecological environment. Most existing butterfly species datasets are small, and recognition accuracy on them is low. To address this problem, this paper proposes a transfer learning and fine-tuning method based on the VGG16 model for butterfly species recognition, in order to improve recognition accuracy. First, the butterfly species dataset is expanded by data augmentation. The VGG16 model is then pre-trained on a large image dataset and its parameters are transferred: the convolutional and pooling layers are "frozen", the fully connected and classification layers are modified, and part of the convolutional layers are then "unfrozen" so that their parameters can be fine-tuned, yielding the recognition results. Experiments show that transfer learning combined with fine-tuning effectively improves the network's accuracy on butterfly species recognition, raising it from the initial 76.67% to 83.95%.
CLC Number: TP3 [Automation and Computer Technology / Computer Science and Technology]