SeNet: Structured Edge Network for Sea–Land Segmentation
Authors: Cheng, Dongcai; Meng, Gaofeng; Cheng, Guangliang; Pan, Chunhong
Source Publication: IEEE Geoscience and Remote Sensing Letters
Abstract: Separating an optical remote sensing image into sea and land areas is very challenging yet of great importance for coastline extraction and subsequent object detection. Traditional methods based on handcrafted feature extraction and image processing often face a dilemma when confronted with high-resolution remote sensing images because of their complicated texture and intensity distributions. In this paper, we apply the prevalent deep convolutional neural networks (CNNs) to the sea–land segmentation problem and make two innovations on top of the traditional structure: first, we propose a local smooth regularization to achieve more spatially consistent results, which frees us from the complicated morphological operations commonly used in traditional methods; second, we use a multi-task loss to obtain the segmentation and edge detection results simultaneously. The attached structured edge detection branch further refines the segmentation result and dramatically improves edge accuracy. Experiments on a set of natural-color images from Google Earth demonstrate the effectiveness of our approach in terms of quantitative and visual performance compared with state-of-the-art methods.
Keywords: Deconvolution Network (DeconvNet); Local Smooth Regularization; Structured Edge Network (SeNet)
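The two techniques named above, the local smooth regularization and the multi-task segmentation-plus-edge loss, can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch version: the total-variation-style form of the smoothness term, the weights lambda_edge and lambda_smooth, and all function names are assumptions for illustration, not the paper's exact formulation.

```python
# Hypothetical sketch of the losses described in the abstract (PyTorch).
# The exact smoothness term and loss weights in the paper may differ.
import torch
import torch.nn.functional as F

def local_smooth_loss(prob):
    """Penalize differences between vertically and horizontally adjacent
    pixel probabilities: a total-variation-style surrogate for the
    paper's local smooth regularization (assumed form)."""
    dh = (prob[:, :, 1:, :] - prob[:, :, :-1, :]).abs().mean()
    dw = (prob[:, :, :, 1:] - prob[:, :, :, :-1]).abs().mean()
    return dh + dw

def multi_task_loss(seg_logits, edge_logits, seg_target, edge_target,
                    lambda_edge=1.0, lambda_smooth=0.1):
    """Joint objective: segmentation loss + structured edge branch loss
    + smoothness penalty. The weights are illustrative assumptions."""
    # seg_logits: (N, 2, H, W); seg_target: (N, H, W) with class ids {0, 1}
    seg_loss = F.cross_entropy(seg_logits, seg_target)
    # edge_logits / edge_target: (N, 1, H, W), edge_target in [0, 1]
    edge_loss = F.binary_cross_entropy_with_logits(edge_logits, edge_target)
    # Smooth the per-class probability map, not the raw logits
    smooth = local_smooth_loss(F.softmax(seg_logits, dim=1))
    return seg_loss + lambda_edge * edge_loss + lambda_smooth * smooth
```

In this sketch the smoothness term discourages disagreement between neighboring pixels of the softmax probability map, which is one common way to encourage the spatially consistent outputs the abstract describes without morphological post-processing.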
GB/T 7714: Cheng, Dongcai, Meng, Gaofeng, Cheng, Guangliang, et al. SeNet: Structured Edge Network for Sea–Land Segmentation[J]. IEEE Geoscience and Remote Sensing Letters, 2017(2): 247-251.
APA: Cheng, Dongcai, Meng, Gaofeng, Cheng, Guangliang, & Pan, Chunhong. (2017). SeNet: Structured Edge Network for Sea–Land Segmentation. IEEE Geoscience and Remote Sensing Letters, (2), 247-251.
MLA: Cheng, Dongcai, et al. "SeNet: Structured Edge Network for Sea–Land Segmentation." IEEE Geoscience and Remote Sensing Letters 2 (2017): 247-251.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.