Figure (Closed Access)

Data for: Remote Sensing Image Classification Using Subspace Sensor Fusion

Rasti, Behnood; Ghamisi, Pedram


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="DOI">10.14278/rodare.689</identifier>
  <creators>
    <creator>
      <creatorName>Rasti, Behnood</creatorName>
      <givenName>Behnood</givenName>
      <familyName>Rasti</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0002-1091-9841</nameIdentifier>
    </creator>
    <creator>
      <creatorName>Ghamisi, Pedram</creatorName>
      <givenName>Pedram</givenName>
      <familyName>Ghamisi</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0003-1203-741X</nameIdentifier>
    </creator>
  </creators>
  <titles>
    <title>Data for: Remote Sensing Image Classification Using Subspace Sensor Fusion</title>
  </titles>
  <publisher>Rodare</publisher>
  <publicationYear>2020</publicationYear>
  <dates>
    <date dateType="Issued">2020-12-01</date>
  </dates>
  <resourceType resourceTypeGeneral="Image">Figure</resourceType>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://rodare.hzdr.de/record/689</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsReferencedBy">https://www.hzdr.de/publications/Publ-31961</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsIdenticalTo">https://www.hzdr.de/publications/Publ-32032</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.14278/rodare.688</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://rodare.hzdr.de/communities/rodare</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="info:eu-repo/semantics/closedAccess">Closed Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;The amount of remote sensing and ancillary data captured by diverse airborne and spaceborne sensors has increased tremendously, which opens up the possibility of utilizing multimodal datasets to improve the performance of processing approaches with respect to the application at hand. However, developing a generic framework with high generalization capability that can effectively fuse diverse datasets is a challenging task, since current approaches are usually applicable only to the fusion of two specific sensors. In this paper, we propose an accurate fusion-based technique called SubFus, which is capable of integrating diverse remote sensing data for land-cover classification. Here, we assume that a high-dimensional multisensor dataset can be represented by fused features that live in a lower-dimensional subspace. The proposed classification methodology includes three main stages. First, spatial information is extracted using spatial filters (i.e., morphological filters). Second, a novel low-rank minimization problem is proposed to represent the multisensor datasets in subspaces using fused features; the fused features in the lower-dimensional subspace are estimated with a novel iterative algorithm based on the alternating direction method of multipliers. Third, the final classification map is produced by applying a supervised spectral classifier (i.e., random forest) to the fused features. In the experiments, the proposed method is applied to a three-sensor dataset (RGB, multispectral LiDAR, and hyperspectral images) captured over the area of the University of Houston, USA, and a two-sensor dataset (hyperspectral and LiDAR) captured over the city of Trento, Italy. The land-cover maps generated using SubFus are evaluated based on classification accuracies. The experimental results confirm considerable improvements in classification accuracy compared with the other methods used in the experiments. The proposed fusion approach achieves overall classification accuracies of 85.32% on the Houston dataset (the training portion of the data distributed for the 2018 data fusion contest) and 99.25% on the Trento dataset.&lt;/p&gt;</description>
  </descriptions>
</resource>
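
The abstract describes a three-stage pipeline: spatial feature extraction with morphological filters, fusion of the stacked multisensor features into a low-dimensional subspace via a low-rank minimization solved with ADMM, and supervised classification of the fused features with a random forest. The Python sketch below only illustrates that flow on toy data: the profile sizes, the feature stacking, and especially the fusion step are assumptions, and a plain truncated SVD stands in for the paper's low-rank ADMM formulation, so this is not the SubFus algorithm itself.

# Minimal sketch of the three-stage pipeline outlined in the abstract.
# The paper's low-rank ADMM fusion is NOT reproduced here; a truncated-SVD
# projection is used as a placeholder so the overall flow stays runnable.
import numpy as np
from scipy import ndimage
from sklearn.decomposition import TruncatedSVD
from sklearn.ensemble import RandomForestClassifier

def morphological_profile(band, sizes=(3, 5, 7)):
    """Stage 1: spatial features via grey-scale opening/closing at several scales."""
    feats = [band]
    for s in sizes:
        feats.append(ndimage.grey_opening(band, size=s))
        feats.append(ndimage.grey_closing(band, size=s))
    return np.stack(feats, axis=-1)                 # (rows, cols, n_features)

def fuse_subspace(stacked, n_components=10):
    """Stage 2 (stand-in): project stacked multisensor features onto a
    low-dimensional subspace. The paper estimates this subspace with a
    low-rank ADMM solver; TruncatedSVD is only a placeholder."""
    rows, cols, d = stacked.shape
    flat = stacked.reshape(-1, d)
    svd = TruncatedSVD(n_components=min(n_components, d - 1))
    return svd.fit_transform(flat).reshape(rows, cols, -1)

def classify(fused, labels, n_trees=200):
    """Stage 3: supervised classification of the fused features with a random forest."""
    rows, cols, d = fused.shape
    X = fused.reshape(-1, d)
    y = labels.ravel()
    train = y > 0                                   # 0 marks unlabeled pixels
    rf = RandomForestClassifier(n_estimators=n_trees).fit(X[train], y[train])
    return rf.predict(X).reshape(rows, cols)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hsi = rng.random((50, 50, 20))                  # toy "hyperspectral" cube
    lidar = rng.random((50, 50))                    # toy "LiDAR" raster
    labels = rng.integers(0, 4, size=(50, 50))      # 0 = unlabeled, 1-3 = classes

    # Stage 1: spatial profiles per sensor, stacked along the feature axis.
    feats = np.concatenate(
        [morphological_profile(hsi[..., b]) for b in range(0, 20, 5)]
        + [morphological_profile(lidar)], axis=-1)

    fused = fuse_subspace(feats, n_components=10)   # Stage 2
    cmap = classify(fused, labels)                  # Stage 3
    print("classification map shape:", cmap.shape)
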
                   All versions   This version
Views                       369            369
Downloads                     0              0
Data volume             0 Bytes        0 Bytes
Unique views                306            306
Unique downloads              0              0