The DREAM (Data Rang or EArth Monitoring): a multimodal database including optics, radar, DEM and OSM labels for deep machine learning purposes.

DREAM is a multimodal remote sensing database developed from open-source data.

The database has been created using the Google Earth Engine platform, the GDAL Python library, and the "pyosm" Python package developed by Alexandre Mayerowitz (Airbus, France). If you use this dataset in your study, please cite:


The two datasets are stored in two separate zip files.

After unzipping, each directory contains sub-directories corresponding to different areas. Each available tile is a 1024x1024 GeoTiff.

In France:

  • CoupleZZ_S2_date1_date2_XX_YY (Uint16 GeoTiff, UTM, RGB)
  • CoupleZZ_SRTM_V2_XX_YY (Int16 GeoTiff)
  • CoupleZZ_S1_date2_date1_XX_YY (Float32 GeoTiff 2 bands, Red:VV, Green: HV)
  • CoupleZZ_S1moy_date2__dual_XX_YY (Float32 GeoTiff 2 bands, Red:VV, Green: HV)
  • CoupleZZ_OSMraster_XX_YY (Uint8 3-band RGB GeoTiff)

In the USA, there are directories named zoneZ that include the following subdirectories:

  • optique: contains *_pauli_x***_y***_optique.tif
    • Ex: SanAnd_09018_18038_017_180730_L090_CX_01_pauli_x000_y002_optique.tif
  • radar: contains *_pauli_x***_y***.tif
    • Ex: SanAnd_09018_18038_017_180730_L090_CX_01_pauli_x000_y002.tif
  • S1: contains *_pauli_x***_y***_S1moy.tif
    • Ex: SanAnd_09018_18038_017_180730_L090_CX_01_pauli_x000_y002_S1moy.tif
  • S2: contains *_pauli_x***_y***_S2mosa.tif
    • Ex: SanAnd_09018_18038_017_180730_L090_CX_01_pauli_x000_y002_S2mosa.tif
  • SRTM: contains *__x***_y***_hgt.tif
    • Ex: SanAnd_09018_18038_017_180730_L090_CX_01__x000_y002_hgt.tif
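Given the naming pattern above, tiles from the different modality folders can be matched through their shared base scene name and x/y indices. A minimal sketch (the `parse_tile` helper is illustrative, not part of the dataset tooling):

```python
import re

# Tiles from the different modality subdirectories share a base scene name
# and x/y tile indices; radar tiles carry no modality suffix.
TILE_RE = re.compile(
    r"^(?P<base>.+?)_x(?P<x>\d{3})_y(?P<y>\d{3})"
    r"(?:_(?P<suffix>optique|S1moy|S2mosa|hgt))?\.tif$"
)

def parse_tile(filename):
    """Split a tile filename into (base scene, x index, y index, modality)."""
    m = TILE_RE.match(filename)
    if m is None:
        raise ValueError(f"unrecognized tile name: {filename}")
    modality = m.group("suffix") or "radar"  # no suffix means a radar tile
    return m.group("base").rstrip("_"), int(m.group("x")), int(m.group("y")), modality

print(parse_tile("SanAnd_09018_18038_017_180730_L090_CX_01_pauli_x000_y002_S1moy.tif"))
# ('SanAnd_09018_18038_017_180730_L090_CX_01_pauli', 0, 2, 'S1moy')
```

Grouping tiles by `(base, x, y)` then yields, per location, the optique/radar/S1/S2/SRTM files that cover the same 1024x1024 patch.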




The files here support the analysis presented in the IEEE Transactions on Geoscience and Remote Sensing paper, "Snow Property Inversion from Remote Sensing (SPIReS): A Generalized Multispectral Unmixing Approach with Examples from MODIS and Landsat 8 OLI." Spectral mixture analysis has a long history in mapping snow, especially where mixed pixels prevail. By using multiple spectral bands rather than band ratios or band indices, retrievals of the snow properties that affect its albedo lead to more accurate estimates than the widely used age-based models of albedo evolution.


These HDF5 files contain snow cover over the Sierra Nevada, USA, for water years 2001-2019, produced with the Snow Property Inversion from Remote Sensing (SPIReS) approach. Each file covers one water year (October through September). They are stored with block compression, so individual days can be read without reading the whole file. The method is described in E. H. Bair, T. Stillinger, and J. Dozier, "Snow Property Inversion from Remote Sensing (SPIReS): A generalized multispectral unmixing approach with examples from MODIS and Landsat 8 OLI," IEEE Trans. Geosci. Remote Sens., 2020 (manuscript number TGRS-2020-02003). Source code is at

The projection is the Albers equal-area conic (also called the California Teale projection) with the WGS84 datum and 500 m square pixels. The standard meridian for the projection is 120 W; the standard parallels are 34 N and 40.5 N; the False Northing is -40,000,000.

The h5 files can be read with several software packages; we use MATLAB. They contain: MATLAB date numbers, ISO dates in the format YYYYDDD, geographic information, and space-time cubes of snow fraction, raw (unadjusted) snow fraction, grain size (um), and dust concentration (ppmw). The space-time cubes have one slice for each day, beginning on October 1 and ending on September 30.
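Because the cubes are stored with per-day block compression, one day's slice can be read without decompressing the rest of the file. A minimal sketch with h5py (the dataset name "snow_fraction" and the array sizes are illustrative, not the actual layout of the released files):

```python
import os
import tempfile

import h5py
import numpy as np

# Write a toy space-time cube (rows x cols x days), chunked one day at a
# time with gzip ("block") compression, mimicking the storage scheme
# described above. Names and shapes here are illustrative only.
path = os.path.join(tempfile.mkdtemp(), "demo_spires.h5")
with h5py.File(path, "w") as f:
    cube = np.random.rand(60, 80, 365).astype(np.float32)
    f.create_dataset("snow_fraction", data=cube,
                     chunks=(60, 80, 1), compression="gzip")

# Reading one day touches only that day's chunk, not the whole cube.
with h5py.File(path, "r") as f:
    day_100 = f["snow_fraction"][:, :, 100]
print(day_100.shape)  # (60, 80)
```

The same per-day access pattern is what MATLAB's `h5read` offers through its start/count arguments.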


Over the last decades, Earth Observation has brought a wealth of new perspectives, from geosciences to human activity monitoring. As more data became available, artificial intelligence techniques led to very successful results in understanding remote sensing data. Moreover, acquisition techniques such as Synthetic Aperture Radar (SAR) can be used for problems that cannot be tackled through optical images alone. This is the case for weather-related disasters such as floods or hurricanes, which are generally associated with extensive cloud cover.


The dataset is composed of 336 sequences corresponding to areas in West and South-East Africa, the Middle East, and Australia. Each time series is located in a folder named with the sequence ID (0001 ... 0336).

Two JSON files, S1list.json and S2list.json, are provided to describe the Sentinel-1 and Sentinel-2 images, respectively. The keys are the total number of images in the sequence, the folder name, the geography of the observed area, and the description of each image in the series. The SAR image descriptions also contain the URLs to download the images. Each image is described by its acquisition date, its label (FLOODING: boolean), a boolean (FULL-DATA-COVERAGE) indicating whether the area is fully or partially imaged, and the file prefix. For SAR images, the orbit (ASCENDING or DESCENDING) is also indicated.
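As an illustration, the per-image descriptions can be used to select the flooded, fully covered acquisitions of a sequence. The exact key names in S1list.json are not reproduced above, so the keys and file prefixes in this sketch are assumptions:

```python
import json

def flooded_images(sequence):
    """Return the file prefixes of fully covered images labeled as flooding."""
    return [img["filename"]
            for img in sequence["images"]
            if img["FLOODING"] and img["FULL-DATA-COVERAGE"]]

# Toy record mimicking the structure described above (key names hypothetical).
example = json.loads("""
{
  "count": 2,
  "folder": "0001",
  "images": [
    {"date": "2019-01-05", "FLOODING": false, "FULL-DATA-COVERAGE": true,
     "filename": "S1_0001_01", "orbit": "ASCENDING"},
    {"date": "2019-01-17", "FLOODING": true, "FULL-DATA-COVERAGE": true,
     "filename": "S1_0001_02", "orbit": "DESCENDING"}
  ]
}
""")
print(flooded_images(example))  # ['S1_0001_02']
```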

The Sentinel-2 images were obtained from the MediaEval 2019 Multimedia Satellite Task [1] and are provided with Level-2A atmospheric correction. For each acquisition, 12 single-channel raster images are provided, corresponding to the different spectral bands.

The Sentinel-1 images were added to the dataset. They are provided with radiometric calibration and Range-Doppler terrain correction based on the SRTM digital elevation model. For each acquisition, two raster images are available, corresponding to the VV and VH polarization channels.

The original dataset was split into 269 training sequences and 68 test sequences. Here, all sequences are in the same folder.


To use this dataset please cite the following papers:

C. Rambour, N. Audebert, E. Koeniguer, B. Le Saux, and M. Datcu, "Flood Detection in Time Series of Optical and SAR Images," ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2020, pp. 1343-1346.

B. Bischke, P. Helber, C. Schulze, V. Srinivasan, A. Dengel, and D. Borth, "The Multimedia Satellite Task at MediaEval 2019," in Proc. of the MediaEval 2019 Workshop, 2019.


This dataset contains modified Copernicus Sentinel data [2018-2019], processed by ESA.

[1] B. Bischke, P. Helber, C. Schulze, V. Srinivasan, A. Dengel, and D. Borth, "The Multimedia Satellite Task at MediaEval 2019," in Proc. of the MediaEval 2019 Workshop, 2019.


This dataset contains 1627 high-resolution multispectral image patches of size 10 x 10 pixels, with each pixel covering 10 m x 10 m. The patches were generated from Sentinel-2 (A/B) satellite images acquired between October 2018 and May 2019, covering one life cycle (12 months) of the sugarcane crop in Karnataka, India. Parameters that affect the growth of the sugarcane crop, such as plantation season, soil type, plantation type, crop variety, and irrigation type, were considered while generating the samples.


The Contest: Goals and Organization


The 2017 IEEE GRSS Data Fusion Contest, organized by the IEEE GRSS Image Analysis and Data Fusion Technical Committee, aimed at promoting progress on fusion and analysis methodologies for multisource remote sensing data.





The 2017 Data Fusion Contest consists of a classification benchmark. The task is classification of land use (more precisely, Local Climate Zones, or LCZ) in various urban environments. Several cities have been selected all over the world to test the ability of both LCZ prediction and domain adaptation. Input data are multi-temporal, multi-source, and multi-modal (image and semantic layers). Five cities are considered for training: Berlin, Hong Kong, Paris, Rome, and Sao Paulo.


Each city folder contains:

  • grid/        sampling grid
  • landsat_8/    Landsat 8 images at various dates (resampled at 100 m resolution, split in selected bands)
  • lcz/        Local Climate Zones as rasters (see below)
  • osm_raster/    rasters with areas (buildings, land use, water) derived from OpenStreetMap layers
  • osm_vector/    vector data with OpenStreetMap zones and lines
  • sentinel_2/    Sentinel-2 image (resampled at 100 m resolution, split in selected bands)


Local Climate Zones

The lcz/ folder contains:

  • `<city>_lcz_GT.tif`: the ground truth for local climate zones, as a raster. It is single-band, in byte format. Pixel values range from 1 to 17 (maximum number of classes); unclassified pixels have the value 0.
  • `<city>_lcz_col.tif`: a color, georeferenced LCZ map, for visualization convenience only.

Class numbers are the following: 10 urban LCZs corresponding to various built types:

  • 1. Compact high-rise;
  • 2. Compact midrise;
  • 3. Compact low-rise;
  • 4. Open high-rise;
  • 5. Open midrise;
  • 6. Open low-rise;
  • 7. Lightweight low-rise;
  • 8. Large low-rise;
  • 9. Sparsely built;
  • 10. Heavy industry.

7 rural LCZs corresponding to various land cover types:

  • 11. Dense trees;
  • 12. Scattered trees;
  • 13. Bush and scrub;
  • 14. Low plants;
  • 15. Bare rock or paved;
  • 16. Bare soil or sand;
  • 17. Water.
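The class numbering above maps directly onto the byte values of the ground-truth raster. A small sketch attaching names to pixel values and counting pixels per class (pure NumPy, with a toy array standing in for a `<city>_lcz_GT.tif` raster):

```python
import numpy as np

# LCZ class names indexed by pixel value; 0 means unclassified.
LCZ_NAMES = {
    0: "Unclassified",
    1: "Compact high-rise", 2: "Compact midrise", 3: "Compact low-rise",
    4: "Open high-rise", 5: "Open midrise", 6: "Open low-rise",
    7: "Lightweight low-rise", 8: "Large low-rise", 9: "Sparsely built",
    10: "Heavy industry",
    11: "Dense trees", 12: "Scattered trees", 13: "Bush and scrub",
    14: "Low plants", 15: "Bare rock or paved", 16: "Bare soil or sand",
    17: "Water",
}

def class_histogram(lcz):
    """Count pixels per LCZ class, ignoring unclassified (0) pixels."""
    counts = np.bincount(lcz.ravel(), minlength=18)
    return {LCZ_NAMES[v]: int(c) for v, c in enumerate(counts) if v > 0 and c > 0}

# Toy ground-truth patch; a real one would be read from <city>_lcz_GT.tif
# with GDAL or rasterio.
gt = np.array([[0, 17, 17], [2, 2, 14]], dtype=np.uint8)
print(class_histogram(gt))  # {'Compact midrise': 2, 'Low plants': 1, 'Water': 2}
```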



More info:




The 2017 IEEE GRSS Data Fusion Contest is organized by the Image Analysis and Data Fusion Technical Committee of the IEEE GRSS. Landsat 8 data are available from the U.S. Geological Survey. Data © OpenStreetMap contributors, available under the Open Database Licence. Original Copernicus Sentinel Data 2016 are available from the European Space Agency. The Contest is being organized in collaboration with the WUDAPT and GeoWIKI initiatives. The IADF TC chairs would like to thank the organizers and the IEEE GRSS for continuously supporting the annual Data Fusion Contest through funding and resources.


The Data Fusion Contest 2016: Goals and Organization

The 2016 IEEE GRSS Data Fusion Contest, organized by the IEEE GRSS Image Analysis and Data Fusion Technical Committee, aimed at promoting progress on fusion and analysis methodologies for multisource remote sensing data.

New multi-source, multi-temporal data, including Very High Resolution (VHR) multi-temporal imagery and video from space, were released. First, VHR images (DEIMOS-2 standard products) acquired at two different dates, before and after orthorectification:



After unzipping, each directory contains:

  • original GeoTiff for panchromatic (VHR) and multispectral (4-band) images,

  • quick-view images for both in PNG format,

  • capture parameters (RPC file).