An Efficient Optical Flow Based Motion Detection Method for Non-stationary Scenes
Huang, Junjie (1,2); Zou, Wei (1,2); Zhu, Zheng (1,2); Zhu, Jiagang (1,2)
2019
Conference: The 31st Chinese Control and Decision Conference (CCDC 2019)
Conference dates: June 3-5, 2019
Conference location: Nanchang, China
Abstract

Real-time motion detection in non-stationary scenes is a difficult task due to the dynamic background, changing foreground appearance, and limited computational resources. These challenges degrade the performance of existing methods in practical applications. In this paper, an optical flow based framework is proposed to address this problem. By applying a novel strategy for utilizing optical flow, our method is free of model construction, training, and updating, and can be performed efficiently. In addition, a dual judgment mechanism with adaptive intervals and adaptive thresholds is designed to heighten the system's adaptation to different situations. In the experimental section, we quantitatively and qualitatively validate the effectiveness and feasibility of our method on videos covering various scene conditions. The experimental results show that our method adapts itself to different situations and outperforms state-of-the-art real-time methods, indicating the advantages of our optical flow based approach.
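The general idea behind flow-based detection with an adaptive threshold can be sketched as follows. This is only an illustrative approximation of the abstract's description, not the paper's actual dual-judgment mechanism: the function name and the mean-plus-k-sigma threshold rule are assumptions introduced here for clarity.

```python
import math

def adaptive_motion_mask(flow, k=2.0):
    """Turn an optical-flow field into a foreground motion mask.

    flow: H x W grid (list of lists) of per-pixel (dx, dy) displacements.
    The threshold adapts to the scene: mean + k * std of the flow
    magnitude over the frame, so a globally moving (non-stationary)
    background raises the bar instead of flooding the mask.
    """
    mags = [math.hypot(dx, dy) for row in flow for (dx, dy) in row]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    thresh = mean + k * std
    return [[math.hypot(dx, dy) > thresh for (dx, dy) in row] for row in flow]

# Toy example: an 8x8 static frame with one fast-moving pixel.
flow = [[(0.0, 0.0)] * 8 for _ in range(8)]
flow[3][4] = (4.0, 3.0)  # flow magnitude 5.0 at this pixel
mask = adaptive_motion_mask(flow)
```

In this toy frame, only the pixel with large flow magnitude exceeds the adaptive threshold, so the mask marks it alone as foreground.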

Indexed by: EI
Language: English
Research area (sub-direction): Image and Video Processing and Analysis
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/23601
Collection: CAS Engineering Laboratory for Intelligent Industrial Vision Equipment - Precision Sensing and Control
Corresponding author: Huang, Junjie
Affiliations: 1. Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
First author's affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding author's affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Huang, Junjie; Zou, Wei; Zhu, Zheng; et al. An Efficient Optical Flow Based Motion Detection Method for Non-stationary Scenes[C], 2019.
Files in this item:
PID5823885.pdf (855 KB) - Conference paper, Open Access, CC BY-NC-SA
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.