Authors: XIONG Honglin [1]; FAN Chongjun [1]; ZHAO Shan; YU Ying (Business School, University of Shanghai for Science and Technology, Shanghai 200093, China; IBM China Shanghai Branch, Shanghai 200002, China)
Affiliations: [1] Business School, University of Shanghai for Science and Technology, Shanghai 200093, China; [2] IBM China Shanghai Branch, Shanghai 200002, China
Source: Computer Integrated Manufacturing Systems (《计算机集成制造系统》), 2020, No. 4, pp. 900-909 (10 pages)
Funding: National Natural Science Foundation of China (71774111); Shanghai Municipal Education Commission Key Fund for Scientific Research and Innovation (14ZZ131).
Abstract: Convolutional neural networks are increasingly used in image processing. To effectively inspect glass surface defects in production, the principles of machine learning based on convolutional neural networks were analyzed, and an image recognition model based on a Multiscale Convolutional Neural Network (MCNN) was proposed. The MCNN model was applied to glass surface defect recognition, and comparison experiments were carried out with different algorithm models and classifiers, using confusion matrices and F1 scores to evaluate classifier performance. Experimental results show that the designed MCNN is more accurate than the traditional Convolutional Neural Network (CNN) recognition method, with particularly large gains in recognition accuracy on scratch-defect and impurity-defect images, where F1 scores improved by more than 5.0%; its overall recognition accuracy in glass defect detection is also superior.
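The abstract evaluates classifiers with a confusion matrix and per-class F1 scores. A minimal sketch of that evaluation step is given below; the defect class names, labels, and predictions are illustrative assumptions, not data from the paper.

```python
# Sketch: per-class F1 from a confusion matrix, as used to compare classifiers.
# Class names and the label/prediction lists are hypothetical examples.

CLASSES = ["scratch", "impurity", "bubble"]  # assumed defect categories

def confusion_matrix(y_true, y_pred, classes):
    """Rows = true class, columns = predicted class."""
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for t, p in zip(y_true, y_pred):
        m[idx[t]][idx[p]] += 1
    return m

def f1_per_class(m, classes):
    """F1 = 2 * precision * recall / (precision + recall) for each class."""
    scores = {}
    for i, c in enumerate(classes):
        tp = m[i][i]
        fp = sum(m[r][i] for r in range(len(classes))) - tp  # column minus TP
        fn = sum(m[i]) - tp                                  # row minus TP
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return scores

# Toy example: one scratch sample misclassified as impurity.
y_true = ["scratch", "scratch", "impurity", "bubble", "impurity", "scratch"]
y_pred = ["scratch", "impurity", "impurity", "bubble", "impurity", "scratch"]
m = confusion_matrix(y_true, y_pred, CLASSES)
print(f1_per_class(m, CLASSES))
```

In practice the same computation is available via `sklearn.metrics.confusion_matrix` and `sklearn.metrics.f1_score`; the manual version above just makes the definitions explicit.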
Keywords: convolutional neural network; machine learning; Softmax regression; support vector machine; glass defect detection
Classification: TP398.1 [Automation and Computer Technology — Computer Application Technology]