Authors: LI Shan-chao; CHE Guo-lin [1]; ZHANG Guo [1]; YANG Xiao-hong [1]
Affiliation: [1] Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China
Source: Journal of Chinese Computer Systems, 2021, No. 2, pp. 381-386 (6 pages)
Funding: National Key R&D Program of China (2017YFB0306405); National Natural Science Foundation of China (61364008).
Abstract: To address the ViBe algorithm's long ghost-elimination time, poor adaptability, and noisy foreground detection in dynamic backgrounds, this paper proposes an improved algorithm built on the ViBe framework. The algorithm uses a ghost-detection method to mark ghost regions in the first frame and forces background samples into the background models of those regions, thereby suppressing ghosts quickly. In pixel classification, an adaptive classification threshold is introduced to overcome the susceptibility of a global threshold to dynamic noise. In background-model updating, the update factor is determined dynamically from the match count of pixel classification, improving the algorithm's ability to adapt to scene changes. Qualitative and quantitative comparison experiments show that, compared with the original ViBe algorithm, the proposed algorithm effectively detects moving targets in dynamic backgrounds and also performs well when applied to river floating-object detection.
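The abstract's classification and update steps follow the standard ViBe scheme: each pixel keeps a set of background samples, a pixel is labeled background if enough samples fall within a radius of its current value, and background pixels update their sample set with probability 1/phi. The sketch below illustrates this per-pixel logic together with a per-pixel adaptive radius and a match-count-driven adaptation rule; the sample count, thresholds, and the specific adaptation formula are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

N_SAMPLES = 20      # background samples kept per pixel (typical ViBe default)
MIN_MATCHES = 2     # matches required to label a pixel as background

def classify_and_update(pixel, samples, radius, phi, rng):
    """Classify one grayscale pixel against its ViBe sample set.

    pixel   : current intensity (float)
    samples : array of N_SAMPLES past background intensities
    radius  : per-pixel classification radius (adapted over time)
    phi     : update factor; samples are replaced with probability 1/phi
    Returns (is_background, samples, radius).
    """
    # Count samples close to the current value (the ViBe match test).
    matches = int(np.sum(np.abs(samples - pixel) < radius))
    is_background = matches >= MIN_MATCHES

    if is_background:
        # Conservative update: replace a random sample with probability 1/phi.
        if rng.random() < 1.0 / phi:
            samples[rng.integers(N_SAMPLES)] = pixel
        # Illustrative adaptation rule (an assumption, not the paper's formula):
        # many matches -> stable pixel -> shrink radius; few -> grow it.
        radius = max(10.0, radius * (0.95 if matches > MIN_MATCHES else 1.05))
    return is_background, samples, radius

rng = np.random.default_rng(0)
samples = rng.uniform(90, 110, N_SAMPLES)   # synthetic background model
bg, samples, radius = classify_and_update(100.0, samples, 20.0, 16, rng)
fg, samples, radius = classify_and_update(200.0, samples, radius, 16, rng)
print(bg, fg)   # a value near the model is background; a far one is not
```

In the full method described by the abstract, phi itself would also be modulated by the match count (higher confidence allowing faster absorption of scene changes), and ghost regions marked in the first frame would have background samples force-inserted rather than waiting for the probabilistic update to erode them.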
Keywords: ViBe; dynamic background; moving object detection; adaptive method; river floating-object detection
Classification: TP391 [Automation and Computer Technology — Computer Application Technology]