2018 IEEE GRSS Data Fusion Challenge – Fusion of Multispectral LiDAR and Hyperspectral Data
As part of the 2018 IEEE GRSS Data Fusion Contest, the Hyperspectral Image Analysis Laboratory and the National Center for Airborne Laser Mapping (NCALM) at the University of Houston are pleased to release a unique multi-sensor optical geospatial dataset representing a challenging urban land-cover/land-use classification task. The data were acquired by NCALM over the University of Houston campus and its neighborhood on February 16, 2017 between 16:31 and 18:18 GMT. A detailed report summarizing the processing undertaken on the raw data is available here. Sensors used in this campaign include an Optech Titan MW (14SEN/CON340) with integrated camera (a multispectral LiDAR sensor operating at three different laser wavelengths), a DiMAC ULTRALIGHT+ (a very high resolution color imager), and an ITRES CASI 1500 (a hyperspectral imager). The sensors were aboard a Piper PA-31-350 Navajo Chieftain aircraft.
The data we provide include:
- Multispectral LiDAR point cloud data at 1550 nm, 1064 nm, and 532 nm; first-return intensity rasters for each channel and DSMs, both at a 50-cm GSD.
- Hyperspectral data covering a 380-1050 nm spectral range with 48 bands at a 1-m GSD.
- Very high resolution RGB imagery at a 5-cm GSD. The image is organized into several separate tiles.
In addition to the optical data, ground truth representing 20 urban land-cover/land-use classes is available (provided as a raster at a 0.5-m GSD, superimposable on the airborne imagery). Details are provided at https://machinelearning.ee.uh.edu/2018-ieee-grss-data-fusion-challenge-f...
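Because the hyperspectral cube (1-m GSD) and the ground-truth raster (0.5-m GSD) sit on grids of different resolution, a common first step is to resample the coarser raster onto the finer grid so that labels and spectra line up pixel for pixel. A minimal NumPy sketch of nearest-neighbor upsampling, using synthetic stand-in arrays (the shapes and variable names are illustrative assumptions, not taken from the actual release files):

```python
import numpy as np

def upsample_nearest(cube, factor):
    """Nearest-neighbor upsample an (H, W, B) raster by an integer factor
    along both spatial axes, e.g. 1-m GSD -> 0.5-m GSD with factor=2."""
    return np.repeat(np.repeat(cube, factor, axis=0), factor, axis=1)

# Illustrative stand-ins (not the real data): a 48-band hyperspectral tile
# at 1-m GSD and a ground-truth label raster at 0.5-m GSD covering the
# same ground extent; 0 marks unlabeled pixels (an assumed convention).
hsi = np.random.rand(100, 200, 48)             # 100 x 200 px, 48 bands
labels = np.random.randint(0, 21, (200, 400))  # 20 classes + unlabeled

hsi_05m = upsample_nearest(hsi, 2)             # now on the 0.5-m grid
assert hsi_05m.shape[:2] == labels.shape       # pixels align 1:1

# Pair each labeled pixel with its spectrum, e.g. for classifier training
mask = labels > 0
X = hsi_05m[mask]      # (n_samples, 48) spectra
y = labels[mask]       # (n_samples,) class ids
```

Nearest-neighbor replication keeps the original spectra unmodified; a production pipeline would instead use the rasters' georeferencing (e.g. via a GIS library) rather than assuming the tiles share an origin and extent.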
Data files, as well as the training and testing ground truth, are provided in the enclosed zip file.