Authors: ZHU Qiu-sheng; LIU Ying [1] (Key Laboratory of Optoelectronic Information Technology Science, School of Science, Tianjin University, Tianjin 300072)
Affiliation: [1] Key Laboratory of Optoelectronic Information Technology Science (Ministry of Education), School of Science, Tianjin University, Tianjin 300072, China
Source: Acta Photonica Sinica (《光子学报》), 2020, No. 8, pp. 89-96 (8 pages)
Funding: National Natural Science Foundation of China (No. 60278004).
Abstract: An artificial neural network method is proposed for estimating the reduced scattering coefficient μs' and the phase function parameter γ of biological tissues from spatially resolved diffuse reflectance in the sub-diffusive regime. The Monte Carlo method is used to generate samples of diffuse reflectance from biological tissue, and these samples are used to train back-propagation (BP) neural networks that predict γ from sub-diffusively scattered light. Because predicting μs' and γ simultaneously with a single network produces large errors, the data are segmented to train two BP networks that identify μs' and γ in turn. It is found that 3.64 lth (where lth denotes the mean transport free path) is a point insensitive to γ: data samples near this point are used to train the network that predicts μs', while samples within 2 lth are used to train the network that predicts γ. Monte Carlo simulations show that, within the range 1.3 ≤ γ ≤ 1.9, the relative root-mean-square error between the predicted and true values is within 1%. Compared with existing measurement methods, the proposed artificial neural network method is simpler and improves prediction accuracy.
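The two-network scheme in the abstract (one BP regressor per optical parameter, each trained on its own segment of reflectance data) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the "reflectance" samples below are synthetic exponential-decay placeholders, not Monte Carlo profiles, and the network size, learning rate, and feature distances are arbitrary assumptions.

```python
import math
import random

random.seed(0)

class TinyBP:
    """One-hidden-layer back-propagation regressor with tanh hidden units."""

    def __init__(self, n_in, n_hidden, lr=0.05):
        self.lr = lr
        self.w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
             for ws, b in zip(self.w1, self.b1)]
        y = sum(w * hi for w, hi in zip(self.w2, h)) + self.b2
        return h, y

    def train_step(self, x, target):
        h, y = self.forward(x)
        err = y - target  # dLoss/dy for loss = 0.5 * (y - target)^2
        for j, hj in enumerate(h):
            # Back-propagate through the tanh unit BEFORE updating w2[j].
            grad_h = err * self.w2[j] * (1.0 - hj * hj)
            self.w2[j] -= self.lr * err * hj
            for i, xi in enumerate(x):
                self.w1[j][i] -= self.lr * grad_h * xi
            self.b1[j] -= self.lr * grad_h
        self.b2 -= self.lr * err
        return 0.5 * err * err

def make_sample():
    """Toy stand-in for one reflectance sample: fake decay values at three
    source-detector distances, with a normalized mu_s'-like target."""
    mu_s = random.uniform(5.0, 15.0)          # pretend mu_s' in 1/cm
    x = [math.exp(-mu_s * r) for r in (0.1, 0.2, 0.4)]
    return x, (mu_s - 10.0) / 5.0             # target scaled to [-1, 1]

# In the paper's scheme there would be a second, identically structured
# network for gamma, trained on a different data segment (within 2 lth).
net_mu = TinyBP(n_in=3, n_hidden=8)
data = [make_sample() for _ in range(200)]

first = sum(net_mu.train_step(x, t) for x, t in data) / len(data)
for _ in range(300):
    for x, t in data:
        net_mu.train_step(x, t)
last = sum(0.5 * (net_mu.forward(x)[1] - t) ** 2 for x, t in data) / len(data)

print(f"mean squared error: {first:.4f} -> {last:.4f}")
assert last < first  # training reduces the fit error
```

The key point mirrored from the abstract is the separation of concerns: rather than asking one network for both μs' and γ, each parameter gets its own regressor fed from the data segment where that parameter dominates the reflectance signal.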