A joint cascaded framework for simultaneous eye detection and eye state estimation
Gou, Chao1,3,4; Wu, Yue2; Wang, Kang2; Wang, Kunfeng1; Wang, Fei-Yue1,3; Ji, Qiang2
Source Publication: PATTERN RECOGNITION
Publication Date: 2017-07-01
Volume: 67, Issue: 1, Pages: 23-31
Subtype: Article
Abstract: Eye detection and eye state (open/closed) estimation are important for a wide range of applications, including iris recognition, visual interaction, and driver fatigue detection. Existing work typically performs eye detection first, followed by eye state estimation with a separate classifier; such an approach fails to capture the interaction between eye location and eye state. In this paper, we propose a method for simultaneous eye detection and eye state estimation. Built on a cascade regression framework, our method iteratively estimates the location of the eye and the probability of the eye being occluded by the eyelid. At each iteration of the cascaded regression, image features from the eye center as well as contextual image features from the eyelid and eye corners are jointly used to estimate the eye position and openness probability. From the openness probability, the most likely eye state is derived. Because training requires a large number of facial images with labeled eye-related landmarks, we propose to combine real and synthetic images for training; this learning-by-synthesis strategy further improves performance. Evaluations on benchmark databases such as BioID and Gi4E, as well as on real-world driving videos, demonstrate superior performance compared to state-of-the-art methods for both eye detection and eye state estimation. (C) 2017 Elsevier Ltd. All rights reserved.
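To make the iterative scheme concrete, below is a minimal Python sketch of one plausible reading of the joint cascade: each stage maps features sampled around the current eye estimate to an update of both the eye location and the openness probability. The names StageRegressor, extract_features, and the linear per-stage update are illustrative assumptions for exposition, not the authors' implementation.

import numpy as np

class StageRegressor:
    # Toy stand-in for one learned cascade stage; the paper trains such
    # regressors from combined real and synthetic labeled images.
    def __init__(self, W, b):
        self.W, self.b = W, b  # maps a feature vector to (dx, dy, dp)

    def predict(self, phi):
        return self.W @ phi + self.b

def extract_features(image, x):
    # Placeholder for shape-indexed features sampled at the current
    # eye-center estimate and its context (eyelid, eye corners).
    h, w = image.shape
    cx = int(np.clip(x[0], 1, w - 2))
    cy = int(np.clip(x[1], 1, h - 2))
    return image[cy - 1:cy + 2, cx - 1:cx + 2].ravel()

def joint_cascade(image, stages, x0, p0=0.5):
    # Iteratively refine eye location x and openness probability p,
    # then threshold p to obtain the most likely eye state.
    x, p = np.asarray(x0, dtype=float), p0
    for stage in stages:
        update = stage.predict(extract_features(image, x))
        x = x + update[:2]                           # location update
        p = float(np.clip(p + update[2], 0.0, 1.0))  # openness update
    return x, p, ("open" if p >= 0.5 else "closed")

# Toy usage: a random image and five random stages exercise only the
# control flow; real stages would be trained regressors.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
stages = [StageRegressor(rng.normal(0.0, 0.01, (3, 9)), np.zeros(3))
          for _ in range(5)]
print(joint_cascade(image, stages, x0=(32.0, 32.0)))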
Keywords: Eye Detection; Eye State Estimation; Learning-by-synthesis; Cascade Regression Framework
WOS Headings: Science & Technology; Technology
DOI: 10.1016/j.patcog.2017.01.023
WOS Keywords: PUPIL LOCALIZATION; FEATURES; ROBUST
Indexed By: SCI
Language: English
Funding Organization: University of Chinese Academy of Sciences (UCAS); RPI; National Science Foundation (1145152); National Natural Science Foundation of China (61304200, 61533019)
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS ID: WOS:000399520700003
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/14484
Collection: State Key Laboratory of Management and Control for Complex Systems_Advanced Control and Automation
Corresponding Author: Gou, Chao
Affiliation:
1.Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
2.Rensselaer Polytech Inst, Dept Elect Comp & Syst Engn, Troy, NY 12180 USA
3.Qingdao Acad Intelligent Ind, Qingdao 266109, Peoples R China
4.Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Recommended Citation
GB/T 7714: Gou, Chao, Wu, Yue, Wang, Kang, et al. A joint cascaded framework for simultaneous eye detection and eye state estimation[J]. PATTERN RECOGNITION, 2017, 67(1): 23-31.
APA: Gou, Chao, Wu, Yue, Wang, Kang, Wang, Kunfeng, Wang, Fei-Yue, & Ji, Qiang. (2017). A joint cascaded framework for simultaneous eye detection and eye state estimation. PATTERN RECOGNITION, 67(1), 23-31.
MLA: Gou, Chao, et al. "A joint cascaded framework for simultaneous eye detection and eye state estimation". PATTERN RECOGNITION 67.1 (2017): 23-31.
Files in This Item:
File Name: gouc_PR_2017_A joint cascaded framework for simultaneous eye detection and eye state estimation.pdf (1337KB)
Format: Adobe PDF
DocType: Journal article
Version: Author accepted manuscript
Access: Open access
License: CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.