Institutional Repository of Chinese Acad Sci, Inst Automat, Res Ctr Precis Sensing & Control, Beijing 100190, Peoples R China
Radar and Rain Gauge Merging-Based Precipitation Estimation via Geographical-Temporal Attention Continuous Conditional Random Field
Tang, Yongqiang1,2; Yang, Xuebing1,2; Zhang, Wensheng1,2; Zhang, Guoping3
Journal | IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
Publication Date | 2018-09-01
Volume | 56; Issue: 9; Pages: 5558-5571
Article Type | Article
Abstract | An accurate, high-resolution precipitation estimation based on rain gauge and radar observations is essential in various meteorological applications. Although numerous studies have demonstrated the effectiveness of merging the two information sources rather than using either alone, approaches that simultaneously consider the local radar reflectivity, the neighborhood rain gauge observations, and the temporal information are much less common. In this paper, we present a new framework for real-time quantitative precipitation estimation (QPE). By formulating the QPE as a continuous conditional random field (CCRF) learning problem, the spatiotemporal correlations of precipitation can be explored more thoroughly. Building on the CCRF, we further improve the accuracy of the precipitation estimation by introducing geographical and temporal attention. Specifically, we first present a data-driven weighting scheme that incorporates the first law of geography into the proposed framework, so that neighborhood samples closer to the estimated grid receive more attention. Second, the temporal attention penalizes the similarity between two adjacent timestamps via the discrepancy of two-view estimates, which can model the local temporal consistency while tolerating some drastic changes. A comprehensive evaluation is conducted on 11 rainfall processes that occurred in 2015, and the results confirm the advantage of our proposal for real-time precipitation estimation.
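The geographical-attention idea in the abstract (neighboring gauges closer to the estimated grid receive more weight, per the first law of geography) can be illustrated with a minimal gauge-radar merging sketch. This is a hypothetical inverse-distance-weighted bias correction for illustration only, not the paper's CCRF model; the function name and toy data are assumptions.

```python
import numpy as np

def idw_gauge_correction(grid_xy, radar_est, gauge_xy, gauge_obs,
                         gauge_radar_est, power=2.0):
    """Adjust a radar-only rainfall estimate at one grid point using
    nearby gauge observations, weighting nearer gauges more heavily
    (inverse-distance weighting). Illustrative sketch, not the CCRF."""
    # Distance from the estimated grid point to each gauge
    d = np.linalg.norm(gauge_xy - grid_xy, axis=1)
    # Closer gauges get larger weights; guard against zero distance
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum()
    # Spread the gauge-minus-radar residuals to the grid point
    residual = np.dot(w, gauge_obs - gauge_radar_est)
    return radar_est + residual

# Toy usage: three gauges around a grid point at the origin
grid = np.array([0.0, 0.0])
gxy = np.array([[1.0, 0.0], [0.0, 2.0], [5.0, 5.0]])
obs = np.array([3.0, 2.5, 4.0])   # gauge rainfall (mm)
rad = np.array([2.0, 2.0, 2.0])   # radar estimate at each gauge site
print(idw_gauge_correction(grid, 2.0, gxy, obs, rad))
```

Here the nearest gauge (1 km away, residual +1.0 mm) dominates the correction, so the merged estimate moves well above the raw radar value of 2.0 mm, while the distant gauge barely contributes.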
Keywords | Continuous conditional random field (CCRF); merging method; precipitation estimation; spatiotemporal correlation
WOS Headings | Science & Technology ; Physical Sciences ; Technology
DOI | 10.1109/TGRS.2018.2819802 |
WOS Keywords | INTERPOLATION ; PREDICTION ; ALGORITHM ; MODEL ; RECOGNITION ; EVENT
Indexed By | SCI
Language | English
Funders | National Natural Science Foundation of China (U1636220, 61432008, 61602482, 61772524); Beijing Natural Science Foundation (4182067)
WOS Research Areas | Geochemistry & Geophysics ; Engineering ; Remote Sensing ; Imaging Science & Photographic Technology
WOS Categories | Geochemistry & Geophysics ; Engineering, Electrical & Electronic ; Remote Sensing ; Imaging Science & Photographic Technology
WOS Accession Number | WOS:000443147600047
Document Type | Journal article
Identifier | http://ir.ia.ac.cn/handle/173211/21826
Collection | Research Center for Precision Sensing and Control_Artificial Intelligence and Machine Learning; Research Center for Precision Sensing and Control
Corresponding Author | Zhang, Wensheng
Affiliations | 1. Chinese Acad Sci, Inst Automat, Res Ctr Precis Sensing & Control, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Beijing 101408, Peoples R China; 3. China Meteorol Adm, Publ Meteorol Serv Ctr, Joint Lab Meteorol Data & Machine Learning, Beijing 100081, Peoples R China
First Author's Affiliation | Research Center for Precision Sensing and Control
Corresponding Author's Affiliation | Research Center for Precision Sensing and Control
Recommended Citation (GB/T 7714) | Tang, Yongqiang, Yang, Xuebing, Zhang, Wensheng, et al. Radar and Rain Gauge Merging-Based Precipitation Estimation via Geographical-Temporal Attention Continuous Conditional Random Field[J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2018, 56(9): 5558-5571.
APA | Tang, Yongqiang, Yang, Xuebing, Zhang, Wensheng, & Zhang, Guoping. (2018). Radar and Rain Gauge Merging-Based Precipitation Estimation via Geographical-Temporal Attention Continuous Conditional Random Field. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 56(9), 5558-5571.
MLA | Tang, Yongqiang, et al. "Radar and Rain Gauge Merging-Based Precipitation Estimation via Geographical-Temporal Attention Continuous Conditional Random Field." IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING 56.9 (2018): 5558-5571.
Files in This Item
File Name/Size | Document Type | Version | Access | License
2018--TGRS--Radar an(3564KB) | Journal article | Published version | Open access | CC BY-NC-SA
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.