Figure Closed Access
Rasti, Behnood; Ghamisi, Pedram
<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Rasti, Behnood</dc:creator>
  <dc:creator>Ghamisi, Pedram</dc:creator>
  <dc:date>2020-12-01</dc:date>
  <dc:description>The amount of remote sensing and ancillary data captured by diverse airborne and spaceborne sensors has increased tremendously, opening up the possibility of utilizing multimodal datasets to improve the performance of processing approaches with respect to the application at hand. However, developing a generic framework with high generalization capability that can effectively fuse diverse datasets is a challenging task, since current approaches are usually only applicable to two specific sensors. In this paper, we propose an accurate fusion-based technique called SubFus with the capability to integrate diverse remote sensing data for land-cover classification. Here, we assume that a high-dimensional multisensor dataset can be represented by fused features that live in a lower-dimensional space. The proposed classification methodology includes three main stages. First, spatial information is extracted using spatial filters (i.e., morphological filters). Second, a novel low-rank minimization problem is proposed to represent the multisensor datasets in subspaces using fused features; the fused features in the lower-dimensional subspace are estimated using a novel iterative algorithm based on the alternating direction method of multipliers (ADMM). Third, the final classification map is produced by applying a supervised spectral classifier (i.e., random forest) to the fused features.
In the experiments, the proposed method is applied to a three-sensor dataset (RGB, multispectral LiDAR, and hyperspectral images) captured over the area of the University of Houston, USA, and a two-sensor dataset (hyperspectral and LiDAR) captured over the city of Trento, Italy. The land-cover maps generated using SubFus are evaluated based on classification accuracies. Experimental results obtained by SubFus confirm considerable improvements in classification accuracy compared with the other methods used in the experiments. The proposed fusion approach obtains 85.32% and 99.25% overall classification accuracy on the Houston (the training portion of the dataset distributed for the 2018 data fusion contest) and Trento datasets, respectively.</dc:description>
  <dc:identifier>https://rodare.hzdr.de/record/689</dc:identifier>
  <dc:identifier>10.14278/rodare.689</dc:identifier>
  <dc:identifier>oai:rodare.hzdr.de:689</dc:identifier>
  <dc:relation>url:https://www.hzdr.de/publications/Publ-31961</dc:relation>
  <dc:relation>url:https://www.hzdr.de/publications/Publ-32032</dc:relation>
  <dc:relation>doi:10.14278/rodare.688</dc:relation>
  <dc:relation>url:https://rodare.hzdr.de/communities/rodare</dc:relation>
  <dc:rights>info:eu-repo/semantics/closedAccess</dc:rights>
  <dc:title>Data for: Remote Sensing Image Classification Using Subspace Sensor Fusion</dc:title>
  <dc:type>info:eu-repo/semantics/other</dc:type>
  <dc:type>image-figure</dc:type>
</oai_dc:dc>
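The abstract's three-stage pipeline (spatial feature extraction, subspace fusion of the stacked multisensor features, then supervised classification) can be sketched as follows. This is a minimal stand-in, not the SubFus method itself: the paper's fused features come from a low-rank ADMM minimization, which we approximate here with a plain truncated SVD on synthetic data; the array shapes and the subspace dimension `r` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels = 200
hsi = rng.normal(size=(n_pixels, 50))   # hypothetical hyperspectral features
lidar = rng.normal(size=(n_pixels, 5))  # hypothetical LiDAR-derived features

# Stage 1 (stand-in): the paper extracts spatial information with
# morphological filters; here we simply stack the per-pixel modalities.
stacked = np.hstack([hsi, lidar])       # shape (n_pixels, 55)

# Stage 2 (stand-in): represent the high-dimensional stack by fused
# features living in a lower-dimensional subspace. SubFus solves a
# low-rank minimization via ADMM; a truncated SVD of the centered
# stack gives the same kind of low-dimensional representation.
r = 10                                  # assumed subspace dimension
U, s, Vt = np.linalg.svd(stacked - stacked.mean(axis=0),
                         full_matrices=False)
fused = U[:, :r] * s[:r]                # (n_pixels, r) fused features

# Stage 3 would feed `fused` to a supervised classifier
# (random forest in the paper) to produce the land-cover map.
print(fused.shape)
```

The key point of the sketch is stage 2: after fusion, each pixel is described by `r` fused features instead of the 55 raw multisensor values, and the classifier operates only on that compact representation.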
| | All versions | This version |
|---|---|---|
| Views | 369 | 369 |
| Downloads | 0 | 0 |
| Data volume | 0 Bytes | 0 Bytes |
| Unique views | 306 | 306 |
| Unique downloads | 0 | 0 |