Authors: 欧文娟[1], 孟耀勇[1], 张小燕[1], 孔猛[1]
Affiliation: [1] Institute of Biophotonics, South China Normal University, Guangzhou 510631, China
Source: Chinese Journal of Analytical Chemistry (《分析化学》), 2011, No. 7, pp. 1104-1108 (5 pages)
Fund: National Natural Science Foundation of China (No. 60411130595); project of the Guangdong Provincial Administration of Traditional Chinese Medicine (No. 2008233)
Abstract: UV-visible (UV-vis) absorption spectroscopy combined with chemometrics was used to discriminate authentic honey from adulterated honey. An adulterant solution of D-fructose and D-glucose, prepared at the mass ratio typical of honey composition (1.2:1.0) so as to closely resemble real honey, was added to individual honey samples at levels of 5%, 10%, 15% and 20%. Absorption spectra of the authentic and adulterated honeys were acquired over 220-750 nm, and the absorbance values in the most sensitive band (250-400 nm) were selected to build the models. The optimal identification model combined principal component analysis (PCA) with a back-propagation artificial neural network (BP-ANN), with the scores of the selected principal components used as the input vectors of the network. The correct identification rates were 100% for both the calibration and prediction sets, with corresponding root-mean-square errors of 8.523×10⁻³ (RMSEC) and 8.961×10⁻³ (RMSEP). The study demonstrates that UV-vis absorption spectroscopy based on PCA and BP-ANN provides a convenient, rapid and accurate technique for identifying authentic and adulterated honey, offering a reliable reference for rapid food-quality testing.
Keywords: honey; adulteration; UV-visible absorption spectroscopy; back-propagation artificial neural network; principal component analysis-back-propagation artificial neural network (PCA-BP-ANN)
CLC Number: S896.1 [Agricultural Sciences: Special Economic Animal Breeding]
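The abstract above describes a two-stage chemometric model: PCA compresses the absorbance values of the 250-400 nm band into a few score vectors, and those scores are fed to a back-propagation neural network that classifies each sample as authentic or adulterated. The following Python sketch illustrates that general PCA + BP-ANN workflow on synthetic data; the wavelength grid, sample counts, number of principal components, network architecture, and the scikit-learn tooling are assumptions made for illustration only, not the authors' implementation.

# Minimal sketch of a PCA + back-propagation ANN identification pipeline.
# All data below are synthetic placeholders; settings are assumptions, not the paper's values.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, mean_squared_error

rng = np.random.default_rng(0)

# Synthetic "spectra": absorbance sampled over 250-400 nm at 1 nm steps.
wavelengths = np.arange(250, 401)          # 151 wavelength points
n_samples = 120                            # hypothetical number of honey samples
labels = rng.integers(0, 2, n_samples)     # 0 = authentic, 1 = adulterated
# Fake spectra: a broad absorption band plus a label-dependent shift and noise.
base = np.exp(-((wavelengths - 280) / 40.0) ** 2)
spectra = (base[None, :] * (1.0 + 0.2 * labels[:, None])
           + 0.02 * rng.standard_normal((n_samples, wavelengths.size)))

# Split into calibration and prediction sets.
X_cal, X_pred, y_cal, y_pred = train_test_split(
    spectra, labels, test_size=0.3, random_state=0, stratify=labels)

# PCA on the sensitive band; the score vectors become the network inputs.
pca = PCA(n_components=5)                  # number of PCs is an assumption
T_cal = pca.fit_transform(X_cal)
T_pred = pca.transform(X_pred)

# Back-propagation ANN (multilayer perceptron) trained on the PC scores.
bpann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
bpann.fit(T_cal, y_cal)

# Report accuracy and an RMSE-style error for both sets, analogous to RMSEC/RMSEP.
for name, T, y in [("calibration", T_cal, y_cal), ("prediction", T_pred, y_pred)]:
    pred = bpann.predict(T)
    prob = bpann.predict_proba(T)[:, 1]    # continuous output for the error measure
    rmse = np.sqrt(mean_squared_error(y, prob))
    print(f"{name}: accuracy = {accuracy_score(y, pred):.3f}, RMSE = {rmse:.4f}")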