Authors: CHEN Ke; ZHANG Wenjian; CAI Lingxi (School of Economics and Management, Chengdu Technological University, Chengdu 611730, China)
Affiliation: [1] School of Economics and Management, Chengdu Technological University, Chengdu 611730, China
Source: Journal of Chengdu Technological University, 2022, No. 2, pp. 37-41 (5 pages)
Fund: General project of the Sichuan UAV Industry Development Research Center (SCUAV20-B002)
Abstract: An artificial-intelligence-based UAV pre-alarm and alarm-receiving system platform for campus security was developed to address the limited patrol range, poor timeliness, and delayed rescue response of traditional manpower-based patrols. With a UAV as the hardware carrier, the system was developed in Python with a MySQL database, integrating a geographical information system (GIS), an image-recognition and landmark-extraction algorithm, an audio-extraction and speech-to-text algorithm, a convolutional neural network, and naive Bayes classification. The platform implements five modules: blind-spot-free patrol, active warning for fixed scenes, passive voice alarm reception, manual alarm reception, and data management. Using the UAV, the system can patrol continuously without blind spots, reach the scene of an incident quickly, judge the type of danger in advance, provide accurate information for subsequent rescue, and deliver preliminary self-rescue supplies such as medicine and rope, improving the rescue success rate.
CLC number: V279 [Aerospace Science and Technology — Aircraft Design]; TP18 [Automation and Computer Technology — Control Theory and Control Engineering]
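The abstract names naive Bayes classification as one of the techniques used to judge the type of danger, presumably from text produced by the audio-to-text step. Below is a minimal, self-contained sketch of that idea; the training phrases, labels, and function names are illustrative assumptions, not the paper's actual data or implementation:

```python
from collections import Counter, defaultdict
import math

# Hypothetical labeled snippets, standing in for text extracted from
# alarm audio. These examples are invented for illustration only.
TRAINING = [
    ("fire smoke flames", "fire"),
    ("smoke burning smell", "fire"),
    ("person fell injured", "injury"),
    ("help injured bleeding", "injury"),
    ("fight shouting crowd", "disturbance"),
    ("crowd shouting argument", "disturbance"),
]

def train(samples):
    """Collect per-label word counts for multinomial naive Bayes."""
    word_counts = defaultdict(Counter)   # label -> word frequencies
    label_counts = Counter()             # label -> number of samples
    vocab = set()
    for text, label in samples:
        words = text.split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def classify(text, model):
    """Pick the label maximizing log prior + smoothed log likelihoods."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        # Laplace (add-one) smoothing so unseen words don't zero out a class
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train(TRAINING)
print(classify("smoke and flames reported", model))  # prints "fire"
```

With larger training data, the same structure would let the platform map a transcribed distress call to an incident category before the UAV arrives on scene.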