Authors: XU You-wen (许又文); YAN Xin-e (严心娥); GUO Liang (郭亮); JI Ri-chen (季日臣) [4]
Affiliations: [1] Shaanxi University Innovation Research Institute of the Future Industry in Rail Transit, Xi'an 710300, Shaanxi, China; [2] Xi'an Traffic Engineering Institute, Xi'an 710300, Shaanxi, China; [3] Machinery Industry Survey and Design Institute Co., Ltd., Xi'an 710021, Shaanxi, China; [4] College of Civil Engineering, Lanzhou Jiaotong University, Lanzhou 730070, Gansu, China
Source: Computer Simulation (《计算机仿真》), 2024, No. 11, pp. 189-193, 224 (6 pages)
Funding: Scientific Research Program of the Shaanxi Provincial Department of Education (23JP087)
Abstract: Latent Dirichlet allocation (LDA) is widely used in text data mining, image processing, and bioinformatics, but it has seen little application in subway engineering. To address the low accuracy and efficiency of subway construction safety accident classification, a clustering algorithm based on the LDA topic model is proposed. Python 3.6 is used to implement preprocessing of safety accident case data, modeling, visualization, model optimization, and cluster analysis, and a classification dictionary tailored to subway construction safety accidents is incorporated to classify engineering safety accidents accurately. By extracting latent topics oriented toward requirements such as function, performance, and interface, the method effectively improves the accuracy of subway construction safety accident classification.
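The pipeline described in the abstract (vectorize accident-case text, fit an LDA topic model, then assign each case to its dominant topic) can be sketched in a few lines. This is a minimal illustration using scikit-learn, not the authors' implementation; the toy English corpus, the choice of two topics, and the absence of the domain-specific classification dictionary are all simplifying assumptions.

```python
# Minimal LDA topic-model clustering sketch (illustrative, not the paper's code).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical toy accident descriptions; the paper works on real
# Chinese subway-construction safety accident cases after word
# segmentation and dictionary-based preprocessing.
docs = [
    "worker fell from scaffold during tunnel excavation",
    "scaffold collapse injured worker at station excavation site",
    "gas leak caused fire in tunnel ventilation shaft",
    "fire broke out after gas pipeline leak near the shaft",
]

# Bag-of-words representation of the corpus.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit LDA; n_components is the assumed number of latent topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)  # per-document topic distribution

# "Cluster" each accident case by its dominant topic.
labels = doc_topic.argmax(axis=1)
print(labels)
```

In the paper, the number of topics is a tuning target of the model-optimization step, and the dominant-topic labels feed the subsequent cluster analysis.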
Classification code: X951 [Environmental Science and Engineering / Safety Science]