CASIA OpenIR > Graduates > Master's Theses
视觉基元提取与主动视觉自标定技术 (Visual Primitive Extraction and Self-Calibration Techniques in Active Vision)
Author: 汪威
Subtype: Master of Engineering (工学硕士)
Thesis Advisor: 马颂德
Date: 1997-06-01
Degree Grantor: Institute of Automation, Chinese Academy of Sciences
Place of Conferral: Institute of Automation, Chinese Academy of Sciences
Degree Discipline: Pattern Recognition and Intelligent Systems
Abstract: Image primitive extraction, fixation-point trajectory control in active vision, and camera self-calibration are three very important problems in computer vision. In this thesis the author studies several aspects of these three problems in depth. The main contributions of this work can be summarized as follows:

1. The application of the Hough transform (represented by the Randomized Hough Transform) and of optimization methods (represented by Tabu search) to primitive extraction is compared and studied. Theoretical models of the Randomized Hough Transform and of randomized optimization methods are established, confirming that for primitive extraction the evidence-accumulation strategy of the Randomized Hough Transform is more efficient than the repeated cost-function evaluation strategy of optimization methods. Based on this conclusion, effective improvements are made to the strategies of both the Randomized Hough Transform and the optimization methods when applied to primitive extraction.

2. The Extended Hough Transform (EHT) [1] is studied systematically, extended from pure line extraction to the extraction of circles and ellipses, and to some extent generalized to arbitrary curves. In addition, the EHT is applied for the first time to the fixation-point trajectory control problem in active vision, successfully simulating the spatially varying visual acuity of the human visual system, so that the extraction precision of visual primitives in different regions of the visual field can be controlled as needed.

3. Building on the work in [2], a camera self-calibration method based on active vision is proposed. The method no longer requires the camera platform to perform arbitrary straight-line motions in 3D space; the platform need only perform pan, tilt, and planar translation motions, thereby avoiding the very stringent platform requirements of the method in [1].
Other Abstract: The work can be summarized in the following parts:

1. A comparison between the Randomized Hough Transform (RHT) and optimization-based methods (OBM) in primitive extraction. The Hough transform has long been a widely used tool for primitive extraction. Recently, however, a number of papers have claimed that optimization-based methods achieve superior results to the Hough transform in primitive extraction, for example the Tabu search approach [1] and genetic algorithms [2]. In this work, we show that the claimed superiority of the optimization-based methods is generally founded on a comparison with the Standard Hough Transform (SHT), and that the SHT is in essence incomparable with optimization-based methods. The members of the Hough family that are genuinely comparable with optimization-based methods are the Randomized Hough Transform, the Dynamic Hough Transform (DHT), and the Probabilistic Hough Transform (PHT). Since the Probabilistic Hough Transform [3], the Dynamic Hough Transform [4], and the Randomized Hough Transform [5][6] are conceptually similar, we use only the RHT in this work to compare the two families. The basic difference between the RHT and the OBM is that the RHT relies on an evidence-accumulation process in the parameter space, in general implemented by an accumulator array. Feature-point sets are sampled at random; once a peak in the accumulator array exceeds a predefined threshold, a verification process checks whether a primitive is really present in the image. The OBM, by contrast, is based mainly on a process of cost-function calculation: after each random sampling of a feature-point set, a cost-function evaluation determines whether a primitive parameterized by that feature-point set really exists. (Note: the cost-function calculation in the OBM is equivalent to the verification step in the RHT.)

Thus, to compare the RHT with the OBM, we must determine which process is more efficient: evidence accumulation or repeated cost-function calculation. To compare the two processes, a mathematical model was established for each, and a reasonable criterion, namely the average number of feature-point sets sampled per successful primitive extraction, was used to evaluate them. The analysis shows that the RHT is much more efficient than the OBM; in addition, extensive simulations validated this theoretical conclusion.
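The sample-accumulate-verify loop described above can be illustrated with a minimal sketch of RHT line extraction (the function name, thresholds, quantization steps, and point data below are illustrative choices, not taken from the thesis): pairs of feature points are sampled at random, each pair votes for one (θ, ρ) cell of the line's normal form x·cos θ + y·sin θ = ρ, and a verification step runs only when a cell's evidence count exceeds a threshold.

```python
import math
import random
from collections import defaultdict

def rht_lines(points, acc_threshold=3, min_inliers=20,
              dist_tol=1.0, max_samples=10000, seed=0):
    """Randomized Hough Transform for straight lines (illustrative sketch).

    Repeatedly samples two feature points, maps the line they define to a
    quantized (theta, rho) cell, and verifies a candidate line only once
    that cell's evidence count reaches acc_threshold.
    """
    rng = random.Random(seed)
    acc = defaultdict(int)        # quantized (theta, rho) -> evidence count
    found = []
    pts = list(points)
    for _ in range(max_samples):
        if len(pts) < 2:
            break
        (x1, y1), (x2, y2) = rng.sample(pts, 2)
        if (x1, y1) == (x2, y2):
            continue
        # Normal form x*cos(theta) + y*sin(theta) = rho; the normal of the
        # segment (dx, dy) is (-dy, dx), wrapped into [0, pi).
        theta = math.atan2(x2 - x1, -(y2 - y1)) % math.pi
        rho = x1 * math.cos(theta) + y1 * math.sin(theta)
        cell = (round(theta, 2), round(rho, 1))   # coarse quantization
        acc[cell] += 1
        if acc[cell] >= acc_threshold:
            # Verification: count feature points close to the candidate line
            c, s = math.cos(cell[0]), math.sin(cell[0])
            inliers = [p for p in pts
                       if abs(p[0] * c + p[1] * s - cell[1]) <= dist_tol]
            if len(inliers) >= min_inliers:
                found.append(cell)
                # Remove supporting points and restart evidence collection
                pts = [p for p in pts if p not in inliers]
                acc.clear()
            else:
                acc[cell] = 0
    return found
```

Note that the expensive inlier count runs only for cells that have already gathered evidence, whereas an optimization-based method would evaluate a cost function of comparable expense after every random sample; this is exactly the efficiency gap the abstract's criterion measures.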
Shelf Number: XWLW440
Other Identifier: 440
Language: Chinese
Document Type: Thesis
Identifier: http://ir.ia.ac.cn/handle/173211/7202
Collection: Graduates / Master's Theses (毕业生_硕士学位论文)
Recommended Citation (GB/T 7714):
汪威. 视觉基元提取与主动视觉自标定技术[D]. 中国科学院自动化研究所. 中国科学院自动化研究所,1997.
Files in This Item: There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.