Alternative Title: Unsupervised Hyperspectral Unmixing Methods
Thesis Advisor: 潘春洪 (Pan Chunhong); 王颖 (Wang Ying)
Degree Grantor: 中国科学院大学 (University of Chinese Academy of Sciences)
Place of Conferral: 中国科学院自动化研究所 (Institute of Automation, Chinese Academy of Sciences)
Degree Discipline: Pattern Recognition and Intelligent Systems (模式识别与智能系统)
Keywords: Unsupervised Hyperspectral Unmixing (UHU); Structured Sparse Learning; Data-Guided Sparse Learning; Robust Learning; Large-Scale Subset Selection; Augmented Lagrangian Method (ALM)
Abstract: Unsupervised hyperspectral unmixing is one of the most active research topics in hyperspectral remote sensing and an important tool for its quantitative development. It also underpins many hyperspectral applications, such as hyperspectral image understanding and visualization, hyperspectral image compression and reconstruction, accurate object detection and recognition, hyperspectral image enhancement, sub-pixel mapping, and high-spatial-resolution hyperspectral imaging. The problem remains challenging, mainly for three reasons: (1) since both the endmembers and the abundances are unknown, the solution space of unmixing models is large and algorithms easily fall into suboptimal solutions; (2) hyperspectral images are easily corrupted by noise, which produces many outlier channels that degrade unmixing accuracy; (3) when no prior knowledge of the ground objects is available, an endmember is a subjective notion without a clear definition. This thesis addresses these problems along the following lines. First, by introducing various reasonable priors (domain knowledge), we build more accurate unmixing models that suppress unreasonable regions of the solution space and thereby obtain better unmixing results. Second, by introducing robust loss measures, we prevent large errors from dominating the objective function and reduce the influence of outlier channels as much as possible. Third, we propose a fast subset selection method that offers an entirely new way to interpret (and solve for) hyperspectral endmembers; the subset selection method also removes the influence of outlier channels. Through this work we establish more accurate models, improve model robustness, and ultimately obtain more accurate unmixing results, while providing a novel way to understand the meaning of endmembers.
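The three difficulties above all refer to the linear mixing model that underlies unsupervised unmixing: each pixel spectrum is a non-negative, sum-to-one combination of unknown endmember spectra. A minimal sketch of that model and of an l21-style robust channel loss follows; all names (E, A, Y) and dimensions are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

bands, pixels, p = 50, 100, 3            # spectral bands, pixels, endmembers
E = rng.random((bands, p))               # unknown endmember signatures (columns)
A = rng.dirichlet(np.ones(p), pixels).T  # abundances: non-negative, sum-to-one
Y = E @ A + 0.01 * rng.standard_normal((bands, pixels))  # observed image

# Both physical constraints hold for the true abundances:
assert np.all(A >= 0)
assert np.allclose(A.sum(axis=0), 1.0)

# An l21 (row-wise) loss down-weights outlier channels compared with the
# squared Frobenius loss: each band contributes its unsquared residual norm,
# so a few badly corrupted bands cannot dominate the objective.
residual = Y - E @ A
l21_loss = np.linalg.norm(residual, axis=1).sum()
```

Because both E and A are unknown, any factorization Y ≈ (E M)(M⁻¹ A) with a suitable M is a candidate solution, which is exactly why the solution space is so large and why priors are needed to restrict it.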
Other Abstract: Unsupervised hyperspectral unmixing (UHU) is one of the most active topics in remote sensing image processing and one of the most important tools for quantitatively analyzing hyperspectral remote sensing images. It plays a fundamental role in a wide range of applications, such as hyperspectral visualization and understanding, hyperspectral compression and reconstruction, detection and identification of substances in the scene, hyperspectral enhancement, and high-resolution hyperspectral imaging. The task remains highly challenging due to three issues: (1) since both the endmembers and the abundances are unknown, the solution space of UHU models is very large; (2) hyperspectral images are easily degraded by various kinds of noise, resulting in many outlier channels; (3) when there is no prior knowledge about the hyperspectral images, the notion of a pure material (i.e., an endmember) is subjective and problem dependent. To address these three issues, reasonable prior knowledge is introduced to restrict the solution space, or even to bias the solution toward good stationary points. Various robust measures are then employed for the representation loss, preventing large errors from dominating the objective. In addition, we propose an accelerated robust subset selection method and elaborate its applications to the UHU task. The main contributions of this thesis are summarized as follows: 1. We propose a structured sparse regularized method for the UHU problem, built from two components. First, we incorporate a graph Laplacian to encode the manifold structure embedded in the hyperspectral data space, so that highly similar neighboring pixels are grouped together. Second, a lasso penalty is employed because pixels in the same manifold structure are sparsely mixed from a common set of relevant bases. Together these two factors act as a new structured sparse constraint, with which our method learns a compact space where highly similar pixels are grouped to share correlated sparse representations. 2. We propose a UHU method via data-guided sparsity. To reduce the solution space, many methods exploit various priors; in practice, however, these priors can easily lead to unsuitable local minima, because they apply an identical strength of constraint to all factors, which does not hold in practice. To overcome this limitation, we propose a ...
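The first contribution combines a graph-Laplacian smoothness term with a lasso penalty into one structured sparse objective. A simplified proximal-gradient sketch of abundance estimation under such an objective is given below; this is an assumed illustration of the general technique, not the thesis algorithm, and the function name and parameters are hypothetical.

```python
import numpy as np

def unmix_abundances(Y, E, W, lam=0.1, mu=0.1, iters=200):
    """Estimate non-negative sparse abundances A from Y ≈ E A.

    Illustrative proximal-gradient sketch of the structured sparse objective
        0.5*||Y - E A||_F^2 + lam*||A||_1 + mu*tr(A L A^T),
    where L = D - W is the graph Laplacian of a pixel-similarity graph W.
    The Laplacian term encourages similar neighboring pixels to share
    correlated abundances; the lasso term keeps each pixel's mixture sparse.
    """
    L = np.diag(W.sum(axis=1)) - W            # graph Laplacian of W
    A = np.full((E.shape[1], Y.shape[1]), 1.0 / E.shape[1])  # uniform init
    # Step size from the Lipschitz constant of the smooth part.
    step = 1.0 / (np.linalg.norm(E, 2) ** 2
                  + 2 * mu * np.linalg.norm(L, 2) + 1e-12)
    for _ in range(iters):
        grad = E.T @ (E @ A - Y) + 2 * mu * A @ L
        # Proximal step for lam*||A||_1 under A >= 0: shifted soft-threshold.
        A = np.maximum(A - step * (grad + lam), 0.0)
    return A
```

In a full unmixing method the endmembers E would be updated alternately with A; here E is held fixed purely to keep the sketch of the structured sparse constraint short.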
Other Identifier: 201218014628087
Document Type: Dissertation (学位论文)
Recommended Citation (GB/T 7714):
朱飞云. 无监督高光谱图像解混方法研究[D]. 中国科学院自动化研究所. 中国科学院大学,2015.
Files in This Item:
File Name/Size: CASIA_20121801462808 (9311 KB) | DocType: Full Text | Access: 暂不开放 (not yet open) | License: CC BY-NC-SA

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.