|Place of Conferral||Beijing|
|Keyword||Industrial Robot; Industrial Parts; Automatic Assembly; 3D Pose Detection and Tracking|
|Other Abstract||Introducing robotic technology into industrial production can greatly reduce workers' labor intensity and improve production efficiency and product quality. Robotic technology is increasingly used in the automotive, electromechanical, general machinery, casting, and other industrial fields. As industrial automation advances, higher functional requirements are placed on robots, among which the level of robot intelligence is an important aspect. Introducing a vision system into a robotic system can greatly improve the robot's ability to perceive its environment and raise its level of intelligence, with important application prospects in the sorting, assembly, and transportation of industrial parts. Acquiring the pose and position information of industrial parts, i.e., industrial part detection and localization, is a core technology and one of the key difficulties in robot vision applications, and it remains a bottleneck restricting the adoption of robots for customized and personalized industrial production.|
New challenges arise for pose measurement and localization of industrial parts because of the noise present in industrial environments, the diversity of part types, and the occlusion and projective deformation of work pieces when they are stacked together. Existing technologies for work-piece pose and position estimation share several common problems: low recognition rates in cluttered environments, the inability to estimate and track pose in real time, and high pose-estimation error rates in certain special pose configurations. This dissertation addresses these practical problems through analytical methods and application research. Its main work and contributions are summarized as follows:
(1) In industrial applications, most work pieces are textureless and of a single color, so their feature information is very limited and detection errors occur easily. To solve this problem, a work-piece pose-estimation algorithm based on offline model construction and online hierarchical search is proposed, and grasping and manipulation of 3D work pieces is realized by introducing a vision system into the robot grasping system. The algorithm takes the 3D CAD model of the work piece as input and, combined with the camera's intrinsic parameters, generates a hierarchical template library offline. Online, a hierarchical search strategy is applied to the images acquired by the camera, and the precise pose of the recognized object is obtained through iterative optimization.
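The offline-hierarchy / online coarse-to-fine idea can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: poses are reduced to 1-D viewpoint indices, and `score` is a toy stand-in for matching a rendered CAD-model template against the camera image.

```python
def build_hierarchy(poses, levels=3):
    """Offline: group sampled viewpoints into coarser levels.

    A 'pose' here is just an index into a sampled viewpoint set; real
    templates would be edge images rendered from the CAD model."""
    hierarchy = [list(poses)]
    for _ in range(levels - 1):
        hierarchy.append(hierarchy[-1][::2])  # keep every 2nd view as a parent
    return hierarchy[::-1]                    # coarsest level first

def score(pose, target):
    """Toy similarity: higher when the candidate view is near the target."""
    return -abs(pose - target)

def hierarchical_search(hierarchy, target, beam=2):
    """Online: keep only the `beam` best candidates at each level and
    expand them to nearby views in the next finer level."""
    candidates = hierarchy[0]
    for level in hierarchy[1:]:
        best = sorted(candidates, key=lambda p: score(p, target),
                      reverse=True)[:beam]
        candidates = [p for p in level if any(abs(p - b) <= 2 for b in best)]
    return max(candidates, key=lambda p: score(p, target))

views = list(range(64))                       # sampled viewpoint indices
hier = build_hierarchy(views)
print(hierarchical_search(hier, target=37))   # → 37
```

The point of the hierarchy is that only a small beam of candidates survives each level, so the fine (expensive) level is evaluated on a handful of views rather than the whole library.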
(2) When the pose of a work piece is measured using only edge features against a cluttered background, the detection error rate increases. To address this, a method that combines multiple features is proposed to significantly improve the accuracy and efficiency of pose measurement. The method takes as input the CAD model of the work piece, the camera's intrinsic parameters, and a series of work-scene images captured by the camera; from these it generates a 2D view library together with a salient-feature library containing the color information of the scene and the texture information of the work piece. In the online pose-measurement phase, the salient-feature library is used to improve the accuracy and efficiency of image matching.
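One way such a combination can work is to use the cheap salient features (color, texture) to prune the 2D view library before running the more expensive edge matching. The sketch below assumes this gating structure; all feature extractors are toy stand-ins, not the dissertation's actual descriptors.

```python
def color_consistent(candidate, scene_color, tol=30):
    """Cheap salient-feature check: candidate's color must be near the scene's."""
    return abs(candidate["color"] - scene_color) <= tol

def edge_score(candidate, scene_edges):
    """Toy edge matching: fraction of model edges found in the scene."""
    matched = len(set(candidate["edges"]) & set(scene_edges))
    return matched / max(len(candidate["edges"]), 1)

def match(view_library, scene):
    # 1) salient-feature filter prunes the 2D view library cheaply
    survivors = [v for v in view_library if color_consistent(v, scene["color"])]
    # 2) expensive edge matching runs only on the survivors
    return max(survivors, key=lambda v: edge_score(v, scene["edges"]))

library = [
    {"id": 0, "color": 200, "edges": [1, 2, 3]},
    {"id": 1, "color": 90,  "edges": [2, 3, 4]},
    {"id": 2, "color": 100, "edges": [4, 5, 6]},
]
scene = {"color": 95, "edges": [2, 3, 4, 9]}
print(match(library, scene)["id"])            # → 1
```

Because clutter mostly produces spurious edges rather than consistent color and texture, gating on the salient features both rejects false edge matches and shrinks the candidate set, which is where the accuracy and speed gains come from.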
(3) View-based pose-estimation methods are slow, which makes real-time pose estimation and tracking of work pieces difficult. To address this, a work-piece pose-estimation and tracking method based on dynamic model libraries is proposed. The method consists of three processes: offline generation of a static global library, online updating and selection of a dynamic local library, and 3D work-piece localization. The dynamic local library is updated and selected according to the estimated pose of the work piece, which greatly reduces the search space and thus increases pose-measurement speed. Real-time, monocular-vision-based pose measurement of 3D work pieces is thereby realized.
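The static-global / dynamic-local split can be illustrated with a small sketch. The structure is an assumption based on the description above: after each estimate, the local library is rebuilt as a neighborhood of the last pose (here a 1-D view index), so tracking searches a handful of views instead of the whole global library.

```python
class DynamicLibrary:
    def __init__(self, global_views, radius=3):
        self.global_views = global_views      # offline static global library
        self.radius = radius
        self.local = list(global_views)       # first frame: full global search

    def update(self, pose):
        """Rebuild the dynamic local library around the last pose estimate."""
        self.local = [v for v in self.global_views
                      if abs(v - pose) <= self.radius]

    def estimate(self, observation):
        """Search only the (small) local library, not the global one."""
        return min(self.local, key=lambda v: abs(v - observation))

lib = DynamicLibrary(list(range(100)))
pose = lib.estimate(42)       # first frame searches all 100 views
lib.update(pose)
print(len(lib.local))         # subsequent frames search only a few views
print(lib.estimate(44))       # fast local search on the next frame
```

A real tracker would also fall back to the static global library when the local search fails (e.g., when the work piece moves faster than the local radius covers), restoring the full search space for re-detection.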
(4) A crankshaft-bearing assembly system based on vision guidance and an environmental attraction field is designed and constructed to solve the problem of closed-loop control over the localization-grasping-guidance-insertion sequence during assembly. The mechanical constraint information of the objects to be assembled is introduced into the assembly process so that it complements the pose and position information obtained from the vision sensor, and with the proposed strategy high-precision assembly is realized using vision sensing alone. In the vision-guidance process, a pose-measurement method based on pre-analysis and an accuracy-estimation model is proposed to solve the detection errors caused by inappropriate observation poses. An accuracy-estimation model for pose measurement is constructed through pre-analysis and then used to evaluate the accuracy of each pose-measurement result. When the predicted accuracy is low, the pose of the camera is changed by moving the industrial robot, thereby raising the measurement accuracy. Using the proposed crankshaft-bearing assembly strategy based on the environmental attraction field, high-precision assembly of the crankshaft and bearing is realized.
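The accuracy-driven viewpoint loop described above can be sketched as follows. Everything here is a hypothetical stand-in: `accuracy_model` plays the role of the pre-analysed accuracy-estimation model, and the candidate viewpoints plus the simulated measurement are illustrative only.

```python
def accuracy_model(viewpoint):
    """Stand-in for the offline pre-analysis result: predicted measurement
    accuracy per observation pose (degenerate views score poorly)."""
    return {0: 0.3, 1: 0.5, 2: 0.9}[viewpoint]

def measure_pose(viewpoint):
    """Stand-in for one vision-based pose measurement from this viewpoint."""
    return {"pose": (10.0, 20.0, viewpoint * 5.0), "viewpoint": viewpoint}

def measure_with_accuracy_check(viewpoints, threshold=0.8):
    """Measure, score the result with the accuracy model, and move the
    robot to the next candidate viewpoint while accuracy is too low."""
    for vp in viewpoints:                     # candidate robot observation poses
        result = measure_pose(vp)
        if accuracy_model(vp) >= threshold:   # accept only high-accuracy views
            return result
    return result                             # best effort if none qualified

result = measure_with_accuracy_check([0, 1, 2])
print(result["viewpoint"])                    # → 2
```

The design choice is that the accuracy model is evaluated before a low-quality measurement is ever handed to the guidance loop, so poor observation poses trigger a robot motion instead of a faulty insertion attempt.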
An industrial-robot platform with a vision system is built. Experiments on pose measurement and tracking of several work pieces are conducted, and the crankshaft and bearing are successfully assembled under the guidance of the vision system.
|Zhu Wenjun. Research on 3D Pose Detection, Tracking and Assembly Methods for Industrial Parts [D]. Beijing: Graduate University of Chinese Academy of Sciences, 2017.|
|Similar articles in Google Scholar|
|Similar articles in Baidu academic|
|Similar articles in Bing Scholar|
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.