The Contest: Goals and Organization

 The 2019 Data Fusion Contest, organized by the Image Analysis and Data Fusion Technical Committee (IADF TC) of the IEEE Geoscience and Remote Sensing Society (GRSS), the Johns Hopkins University (JHU), and the Intelligence Advanced Research Projects Activity (IARPA), aimed to promote research in semantic 3D reconstruction and stereo using machine intelligence and deep learning applied to satellite images.


The Contest: Goals and Organization

 

The 2017 IEEE GRSS Data Fusion Contest, organized by the IEEE GRSS Image Analysis and Data Fusion Technical Committee, aimed at promoting progress on fusion and analysis methodologies for multisource remote sensing data.

 

Instructions: 

 

Overview

The 2017 Data Fusion Contest consists of a classification benchmark. The task is land-use classification, more precisely Local Climate Zones (LCZ), in various urban environments. Several cities have been selected all over the world to test both LCZ prediction and domain adaptation. Input data are multi-temporal, multi-source, and multi-modal (images and semantic layers). Five cities are considered for training: Berlin, Hong Kong, Paris, Rome, and Sao Paulo.

Content

Each city folder contains:

  • grid/        sampling grid
  • landsat_8/    Landsat 8 images at various dates (resampled at 100 m res., split in selected bands)
  • lcz/        Local Climate Zones as rasters (see below)
  • osm_raster/    rasters with areas (buildings, land use, water) derived from OpenStreetMap layers
  • osm_vector/    vector data with OpenStreetMap zones and lines
  • sentinel_2/    Sentinel-2 image (resampled at 100 m res., split in selected bands)

 

Local Climate Zones

The lcz/ folder contains:

  • `<city>_lcz_GT.tif`: the ground truth for Local Climate Zones, as a raster. It is single-band, in byte format. Pixel values range from 1 to 17 (the maximum number of classes); unclassified pixels have value 0.
  • `<city>_lcz_col.tif`: color, georeferenced LCZ map, for visualization convenience only.

Class numbers are the following (a minimal reading sketch follows the class list below). 10 urban LCZs corresponding to various built types:

  • 1. Compact high-rise;
  • 2. Compact midrise;
  • 3. Compact low-rise;
  • 4. Open high-rise;
  • 5. Open midrise;
  • 6. Open low-rise;
  • 7. Lightweight low-rise;
  • 8. Large low-rise;
  • 9. Sparsely built;
  • 10. Heavy industry.

7 rural LCZs corresponding to various land cover types:

  • 11. Dense trees;
  • 12. Scattered trees;
  • 13. Bush and scrub;
  • 14. Low plants;
  • 15. Bare rock or paved;
  • 16. Bare soil or sand;
  • 17. Water
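For example, the ground-truth raster can be read and summarized per class as sketched below. This is a minimal sketch, assuming the rasterio package; the Berlin path is illustrative only, so adapt it to the `<city>_lcz_GT.tif` file you actually downloaded.

```python
import numpy as np
import rasterio

# Illustrative path: adapt to the actual <city>_lcz_GT.tif location.
GT_PATH = "berlin/lcz/berlin_lcz_GT.tif"

LCZ_NAMES = {
    1: "Compact high-rise", 2: "Compact midrise", 3: "Compact low-rise",
    4: "Open high-rise", 5: "Open midrise", 6: "Open low-rise",
    7: "Lightweight low-rise", 8: "Large low-rise", 9: "Sparsely built",
    10: "Heavy industry", 11: "Dense trees", 12: "Scattered trees",
    13: "Bush and scrub", 14: "Low plants", 15: "Bare rock or paved",
    16: "Bare soil or sand", 17: "Water",
}

with rasterio.open(GT_PATH) as src:
    lcz = src.read(1)  # single band, byte format

# 0 marks unclassified pixels; 1-17 are the LCZ classes.
values, counts = np.unique(lcz, return_counts=True)
for value, count in zip(values, counts):
    label = LCZ_NAMES.get(int(value), "Unclassified")
    print(f"{int(value):2d}  {label:20s}  {count} pixels")
```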

 

More...

More info: http://www.grss-ieee.org/community/technical-committees/data-fusion/data-fusion-contest/

Discuss: https://www.linkedin.com/groups/IEEE-Geoscience-Remote-Sensing-Society-3678437

 

Acknowledgments

The 2017 IEEE GRSS Data Fusion Contest is organized by the Image Analysis and Data Fusion Technical Committee of IEEE GRSS.

Landsat 8 data available from the U.S. Geological Survey (https://www.usgs.gov/). OpenStreetMap Data © OpenStreetMap contributors, available under the Open Database Licence (http://www.openstreetmap.org/copyright). Original Copernicus Sentinel Data 2016 available from the European Space Agency (https://sentinel.esa.int).

The Contest is organized in collaboration with the WUDAPT (http://www.wudapt.org/) and GeoWIKI (http://geo-wiki.org/) initiatives. The IADF TC chairs would like to thank the organizers and the IEEE GRSS for continuously supporting the annual Data Fusion Contest through funding and resources.


The Data Fusion Contest 2016: Goals and Organization

The 2016 IEEE GRSS Data Fusion Contest, organized by the IEEE GRSS Image Analysis and Data Fusion Technical Committee, aimed at promoting progress on fusion and analysis methodologies for multisource remote sensing data.

New multi-source, multi-temporal data, including Very High Resolution (VHR) multi-temporal imagery and video from space, were released. First, VHR images (DEIMOS-2 standard products) acquired at two different dates, before and after orthorectification:

Instructions: 

 

After unzipping, each directory contains:

  • original GeoTIFF for panchromatic (VHR) and multispectral (4 bands) images,

  • quick-view images for both, in PNG format,

  • capture parameters (RPC file; see the reading sketch after this list).
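As an illustration of how these capture parameters might be inspected, here is a minimal sketch assuming the rasterio package; the file name is hypothetical, and it relies on GDAL exposing the rational polynomial coefficients through the standard RPC metadata domain (whether embedded in the GeoTIFF or picked up from the accompanying RPC file).

```python
import rasterio

# Hypothetical file name: replace with the panchromatic GeoTIFF from the archive.
PAN_PATH = "deimos2_panchromatic.tif"

with rasterio.open(PAN_PATH) as src:
    print(src.width, src.height, src.count)  # image size and band count
    rpc_tags = src.tags(ns="RPC")            # RPC metadata domain, if available
    for key, value in sorted(rpc_tags.items()):
        print(key, value)
```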

 


We introduce ONERA.ROOM, a new robotic RGB-D dataset with challenging lighting conditions. It comprises RGB-D data (as pairs of images) and corresponding annotations in PASCAL VOC format (XML files).

It targets people detection in (mostly) indoor and outdoor environments. People in the field of view can be standing, but also lying on the ground, as after a fall.

Instructions: 

To facilitate use with some deep learning software, a folder tree with relative symbolic links (thus avoiding extra space) gathers all the sequences in three folders:

|
|— image
|       |— sequenceName0_imageNumber_timestamp0.jpg
|       |— sequenceName0_imageNumber_timestamp1.jpg
|       |— sequenceName0_imageNumber_timestamp2.jpg
|       |— sequenceName0_imageNumber_timestamp3.jpg
|       |— …
|
|— depth_8bits
|       |— sequenceName0_imageNumber_timestamp0.png
|       |— sequenceName0_imageNumber_timestamp1.png
|       |— sequenceName0_imageNumber_timestamp2.png
|       |— sequenceName0_imageNumber_timestamp3.png
|       |— …
|
|— annotations
|       |— sequenceName0_imageNumber_timestamp0.xml
|       |— sequenceName0_imageNumber_timestamp1.xml
|       |— sequenceName0_imageNumber_timestamp2.xml
|       |— sequenceName0_imageNumber_timestamp3.xml
|       |— …
|
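Since the annotations follow the PASCAL VOC XML format, they can be parsed with the Python standard library, for example as in the minimal sketch below; the file name is taken from the tree above for illustration only.

```python
import xml.etree.ElementTree as ET

# Illustrative file name taken from the folder tree above.
ANNOTATION_PATH = "annotations/sequenceName0_imageNumber_timestamp0.xml"

tree = ET.parse(ANNOTATION_PATH)
root = tree.getroot()

# Each <object> element holds a class name and a bounding box in pixel coordinates.
for obj in root.findall("object"):
    name = obj.find("name").text
    bbox = obj.find("bndbox")
    xmin = int(float(bbox.find("xmin").text))
    ymin = int(float(bbox.find("ymin").text))
    xmax = int(float(bbox.find("xmax").text))
    ymax = int(float(bbox.find("ymax").text))
    print(f"{name}: ({xmin}, {ymin}) -> ({xmax}, {ymax})")
```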


The Dataset

The Onera Satellite Change Detection dataset addresses the issue of detecting changes between satellite images from different dates.

Instructions: 

Onera Satellite Change Detection dataset

 

##################################################

Authors: Rodrigo Caye Daudt, rodrigo.daudt@onera.fr

Bertrand Le Saux, bls@ieee.org

Alexandre Boulch, alexandre.boulch@valeo.ai

Yann Gousseau, yann.gousseau@telecom-paristech.fr

 

##################################################

About: This dataset contains registered pairs of 13-band multispectral satellite images obtained by the Sentinel-2 satellites of the Copernicus program. Pixel-level urban change ground truth is provided. In case of discrepancies in image size, the older images, with a resolution of 10 m per pixel, are used. Images vary in spatial resolution between 10 m, 20 m, and 60 m. For more information, please refer to the Sentinel-2 documentation.

 

For each location, the folders imgs_1_rect and imgs_2_rect contain the same images as imgs_1 and imgs_2, resampled at 10 m resolution and cropped accordingly for ease of use. The proposed split into train and test images is contained in the train.txt and test.txt files.
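As an example of how such a rectified pair could be loaded, below is a minimal sketch assuming the rasterio package and assuming that each imgs_*_rect folder stores one GeoTIFF per Sentinel-2 band (e.g. B02.tif); the location name and band file names are assumptions, so adapt them to the actual archive layout.

```python
import numpy as np
import rasterio

# Assumed layout: <location>/imgs_1_rect/B02.tif, B03.tif, ... (one file per band).
LOCATION_DIR = "beirut"                # illustrative location name
BANDS = ["B02", "B03", "B04", "B08"]   # example selection of 10 m bands


def load_bands(folder, bands):
    """Stack the selected band rasters into a (bands, height, width) array."""
    arrays = []
    for band in bands:
        with rasterio.open(f"{LOCATION_DIR}/{folder}/{band}.tif") as src:
            arrays.append(src.read(1).astype(np.float32))
    return np.stack(arrays)


img1 = load_bands("imgs_1_rect", BANDS)
img2 = load_bands("imgs_2_rect", BANDS)
print(img1.shape, img2.shape)  # both resampled to 10 m and cropped to the same size
```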

For downloading and cropping the images, the Medusa toolbox was used: https://github.com/aboulch/medusa_tb

For precise registration of the images, the GeFolki toolbox was used: https://github.com/aplyer/gefolki

 

##################################################

Labels: The train labels are available in two formats, a .png visualization image and a .tif label image. In the png image, 0 means no change and 255 means change. In the tif image, 0 means no change and 1 means change.

<ROOT_DIR>//cm/ contains:

  • cm.png
  • -cm.tif

Please note that prediction images should be formatted as the -cm.tif rasters for upload and evaluation on DASE (http://dase.grss-ieee.org/).
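For instance, a binary prediction map could be written in that raster convention as sketched below. This is a minimal sketch, assuming rasterio and a 0/1 NumPy prediction aligned with a reference label raster whose georeferencing profile is reused; all file names are illustrative.

```python
import numpy as np
import rasterio

# Illustrative paths: an existing label raster (used only for its profile) and the output.
REFERENCE_TIF = "cm/cm.tif"
PREDICTION_TIF = "cm/prediction-cm.tif"

with rasterio.open(REFERENCE_TIF) as ref:
    profile = ref.profile.copy()
    # Placeholder 0/1 change map with the same size as the reference raster.
    prediction = np.zeros((ref.height, ref.width), dtype=np.uint8)

profile.update(count=1, dtype=rasterio.uint8)

with rasterio.open(PREDICTION_TIF, "w", **profile) as dst:
    dst.write(prediction, 1)  # 0 = no change, 1 = change, as in the -cm.tif labels
```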

(Update June 2020) Alternatively, you can use the test labels, which are now provided in a separate archive, and compute standard metrics using the Python notebook provided in this repo, along with a full script to train and classify fully convolutional networks for change detection: https://github.com/rcdaudt/fully_convolutional_change_detection
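Those standard metrics can also be computed directly from a prediction and a test label; the following minimal NumPy sketch (an illustration of the usual change-detection scores, not the repository's notebook) shows one way to do it.

```python
import numpy as np


def change_detection_scores(prediction, label):
    """Precision, recall, and F1 score for binary change maps (values 0/1)."""
    prediction = prediction.astype(bool)
    label = label.astype(bool)
    tp = np.logical_and(prediction, label).sum()
    fp = np.logical_and(prediction, ~label).sum()
    fn = np.logical_and(~prediction, label).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```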

 

##################################################

Citation: If you use this dataset for your work, please use the following citation:

@inproceedings{daudt-igarss18,
  author = {{Caye Daudt}, R. and {Le Saux}, B. and Boulch, A. and Gousseau, Y.},
  title = {Urban Change Detection for Multispectral Earth Observation Using Convolutional Neural Networks},
  booktitle = {IEEE International Geoscience and Remote Sensing Symposium (IGARSS'2018)},
  venue = {Valencia, Spain},
  month = {July},
  year = {2018},
}

 

##################################################

Copyright: Sentinel Images: This dataset contains modified Copernicus data from 2015-2018. Original Copernicus Sentinel Data available from the European Space Agency (https://sentinel.esa.int).

Change labels: Change maps are released under Creative-Commons BY-NC-SA. For commercial purposes, please contact the authors.

 
