Vision-based hand gesture recognition, which enables computers to see and understand hand gestures as humans do, can be widely applied in areas such as Virtual Reality, Human-Computer Interaction, Sign Language Recognition, and Tele-robotics. In the context of an intelligent wheelchair, we conducted research on vision-based hand gesture recognition for human-computer interaction. Hand representation, hand tracking, static hand posture recognition, and dynamic hand gesture recognition are discussed here. The main contributions of this thesis are summarized as follows:

① We proposed a novel tracking algorithm, the Mean Shift Embedded Particle Filter (MSEPF), to improve the efficiency of conventional particle filters. By embedding mean shift iterations in the particle filter, the MSEPF achieves more efficient sampling that concentrates particles in regions of large weight; the degeneracy and sample impoverishment problems of particle filters are thus circumvented by increasing the number of effective samples. At the same time, the MSEPF does not need a large number of particles to maintain the multiple modes of the posterior density, and therefore saves considerable computation.

② Real-time hand tracking in the intelligent wheelchair environment was achieved using the MSEPF. Because particles are shifted toward local modes, a simple but effective dynamic model suffices. We first adopted skin color to represent the hand, adapting the skin color model frame by frame since skin color can change under varying illumination. To handle skin-colored distractors in the background, we proposed an observation model that fuses color and motion cues; mean shift iterations are likewise performed on the fused color and motion cues.

③ Based on the idea of the orientation histogram, we proposed the orientation histogram of the hand contour to represent static hand postures. After the hand is localized in the image, its contour is obtained by skin-color-based segmentation.
The orientation histogram of the hand contour is then matched against models learned from a training set for final posture recognition.

④ We presented Temporal Template Based Trajectories (TTBT) by introducing spatio-temporal trajectories into temporal templates. A TTBT collapses the tracked hand motion trajectory into a static image. A two-layer classifier, based on statistical shape and motion-orientation analysis of the TTBT, is designed to recognize seven predefined dynamic gestures. TTBT offer better discriminative ability than temporal templates for dynamic gestures, and the recognition method is easy to implement and does not require complex training.

⑤ By applying the above algorithms on the intelligent wheelchair, we designed and implemented a real-time hand control interface, which is part of the wheelchair's multi-modal perceptual interface. The human-robot interface based on hand gesture recognition developed in this thesis works well in real-world conditions.
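The MSEPF update described in contribution ① can be sketched as follows. This is a minimal illustration only: a synthetic 2D likelihood map stands in for the thesis's color/motion observation model, and the window radius, motion-noise level, and systematic resampling scheme are assumptions for the sketch, not the thesis's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_shift(pos, likelihood, radius=5, iters=3):
    """Shift a 2D position (x, y) toward the local mode of a likelihood map."""
    h, w = likelihood.shape
    for _ in range(iters):
        x0, y0 = int(round(pos[0])), int(round(pos[1]))
        xs = np.arange(max(0, x0 - radius), min(w, x0 + radius + 1))
        ys = np.arange(max(0, y0 - radius), min(h, y0 + radius + 1))
        X, Y = np.meshgrid(xs, ys)
        weights = likelihood[Y, X]
        total = weights.sum()
        if total <= 1e-12:          # window empty or flat: stop shifting
            break
        pos = np.array([(weights * X).sum() / total,
                        (weights * Y).sum() / total])
    return pos

def msepf_step(particles, likelihood, motion_std=4.0):
    """One MSEPF update: random-walk prediction, mean-shift refinement of
    each particle, reweighting by the likelihood, systematic resampling."""
    n = len(particles)
    # simple dynamic model (random walk), as mean shift does the fine work
    particles = particles + rng.normal(0, motion_std, particles.shape)
    # embed mean shift: move each particle toward a nearby mode
    particles = np.array([mean_shift(p, likelihood) for p in particles])
    # weight by observation likelihood at each (clipped) particle position
    ix = np.clip(particles.round().astype(int), 0,
                 np.array(likelihood.shape)[::-1] - 1)
    w = likelihood[ix[:, 1], ix[:, 0]] + 1e-12
    w /= w.sum()
    # systematic resampling
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return particles[idx]
```

Because each particle is shifted toward a local mode before reweighting, fewer particles carry negligible weight, which is the mechanism by which MSEPF raises the number of effective samples.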