Remote sensing of environment research has explored the benefits of synthetic aperture radar (SAR) imaging systems for a wide range of land and marine applications, since these systems are unaffected by weather conditions and are therefore operable both day and night. Designing image processing techniques for SAR applications requires testing and validation on real and synthetic images. The GRSS benchmark database supports the design and analysis of algorithms that deal with SAR and PolSAR data.

Last Updated On: Tue, 11/12/2019 - 10:38
Citation Author(s): Nobre, R. H.; Rodrigues, F. A. A.; Rosa, R.; Medeiros, F. N.; Feitosa, R.; Estevão, A. A.; Barros, A. S.

The proposed dataset, termed PC-Urban (Urban Point Cloud), was captured with a 64-channel Ouster LiDAR sensor installed on an SUV driven through downtown Perth, Western Australia (WA), Australia. The dataset comprises over 4.3 billion points captured across 66K sensor frames. The labelled data are organized as raw and registered point cloud frames, where the registered frames combine varying numbers of consecutive raw frames. The dataset provides 25 class labels covering 23 million points and 5K instances.


The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near-infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station hosting the SMEAR II (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E) on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution is 0.6 m.


SPYSTUF hyperspectral data

Matti Mõttus, Vincent Markiet, Rocío Hernández-Clemente, Viljami Perheentupa, Titta Majasalmi

The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near-infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station hosting the SMEAR II (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E) on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution is 0.6 m. The data are partly multiangular, with the sensor tilted 30° off-nadir for two flight lines, resulting in measurements with angles between the directions to the sensor and the sun of 19° (closest to the hotspot), 55° (nadir) and 76° (dark spot). The data are processed to top-of-canopy geolocated reflectance factors and mosaicked. All mosaicked data were collected with the sensor pointing approximately nadir. The hyperspectral imagery is accompanied by data on basic forest variables and optical LAI from 20 plots in the image area, determined within approximately one week of the airborne acquisition.

The data were obtained between 10:44 and 12:20 (GMT+3) at approximately 1 km altitude above the ground, with flight lines flown consecutively in the northwestern and southeastern directions to minimize BRF effects. The Aisa Eagle II sensor had a field of view (FOV) of 37.5° divided among 1024 pixels. The average solar zenith angle was 48°; the photosynthetic photon flux density ranged from 1285 to 1493 μmol/(m^2*s) with a mean value of 1408 μmol/(m^2*s) (SMEAR II measurement data above the forest). The weather conditions were optimal for an airborne hyperspectral acquisition, with a clear blue sky.
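As a quick consistency check (a sketch, not part of the dataset documentation), the flight geometry quoted above implies a nadir ground pixel close to the stated 0.6 m spatial resolution:

```python
import math

# Flight geometry quoted above: 37.5 deg FOV over 1024 pixels, ~1 km altitude
fov_deg, n_pixels, altitude_m = 37.5, 1024, 1000.0

# Instantaneous field of view of a single pixel (small-angle approximation)
ifov_rad = math.radians(fov_deg / n_pixels)

# Ground-projected nadir pixel size: ~0.64 m, close to the stated 0.6 m
nadir_pixel_m = altitude_m * ifov_rad
```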

The collection and processing of the dataset was largely funded by the Academy of Finland (project SPYSTUF, PI M. Mõttus, grants 266152, 272989 and 303633). All authors were affiliated with the University of Helsinki, Finland, during the data acquisition. Data processing was mostly performed by Vincent Markiet and Matti Mõttus at VTT Technical Research Centre of Finland.

The multiangular data are described in detail in the publication (open access) by Markiet et al. (2017)
The mosaic is described in the publication (open access) by Markiet & Mõttus (2020)

Additional data on the imaged forests are available from external sources, e.g.

* SMEAR II weather and flux data are available via the SmartSMEAR system:

* USGS provides EO-1 Hyperion imagery coincident with the airborne data, centered on SMEAR II:
REQUEST_ID = "1890172015184_20001"
START_TIME = 2015 184 08:26:46
END_TIME = 2015 184 08:31:05

* Dataset of tree canopy structure and understory composition obtained two years earlier:
Majasalmi, T., & Rautiainen, M. (2020). Dataset of tree canopy structure and variation in understory composition in a boreal forest site. Data in Brief, 30, 105573.
Data identification number: 10.17632/dyt4nkp583.1

Files in the project:

20150703_mosaic: The hyperspectral mosaic data (BSQ format, 16778 samples, 16255 lines, 128 bands, 16-bit signed integer), reflectance factor multiplied by 10,000
20150703_mosaic.hdr: ENVI header file for 20150703_mosaic
forestdata.txt: forest plot data, see below for detailed contents
line01_20150703atm06mFnnGeo: off-nadir image (one flight line), in the darkspot direction (angle between sensor and sun directions 76°), reflectance factor multiplied by 10,000
line01_20150703atm06mFnnGeo.hdr: ENVI header for line01_20150703atm06mFnnGeo
line02_20150703atm06mFnnGeo: off-nadir image (one flight line), close to the hotspot direction (angle between sensor and sun directions 19°), reflectance factor multiplied by 10,000
markiet2017.pdf: the paper by Markiet et al. (2017) describing the multiangular data
markiet2020.pdf: the paper by Markiet and Mõttus (2020) describing the image mosaic
README.txt: this file
SPYSTUF_hyperspectral_preview.pgw: geographic information for SPYSTUF_hyperspectral_preview.png
SPYSTUF_hyperspectral_preview.png: PNG preview of the image with forest plots

Geographic projections:
All data are projected to UTM35N. See the ENVI header files for details.
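For reference, the BSQ layout described in the file list above can be read directly with NumPy. This is a minimal sketch, not one of the dataset's own tools; it uses tiny synthetic dimensions in place of the real 128 bands x 16255 lines x 16778 samples so it is self-contained:

```python
import os
import tempfile
import numpy as np

# BSQ (band-sequential) order stores all of band 1, then band 2, and so on.
# Tiny dimensions are used here for illustration; the real mosaic is
# 128 bands x 16255 lines x 16778 samples of 16-bit signed integers.
bands, lines, samples = 3, 4, 5

data = (np.arange(bands * lines * samples, dtype=np.int16)
        .reshape(bands, lines, samples))

path = os.path.join(tempfile.mkdtemp(), "synthetic_bsq")
data.tofile(path)  # write the file exactly as ENVI BSQ stores it

# Memory-map the file so a multi-gigabyte image need not fit in RAM at once
mosaic = np.memmap(path, dtype=np.int16, mode="r",
                   shape=(bands, lines, samples))

# Stored values are reflectance factors multiplied by 10,000
spectrum = mosaic[:, 2, 3] / 10000.0  # spectrum of pixel at line 2, sample 3
```

For the real mosaic, take the dimensions, data type, and byte order from 20150703_mosaic.hdr rather than hard-coding them.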

Forest data in the columns of forestdata.txt:
ID: plot ID (string)
Easting_UTM35N: Easting in UTM35N projected coordinate system [units: m]
Northing_UTM35N: Northing in UTM35N projected coordinate system [m]
LAI_effective: the effective (optical) LAI of the plot as determined with LAI-2000 using a modified "VALERI cross" sampling design: eight measurement points in each cardinal direction at four and eight meters distance from the plot center point. See Majasalmi & Rautiainen (2020) for details.
LAI2000_gaps1: the mean "gaps" value for ring 1 (zenith) of LAI-2000
LAI2000_gaps2: the mean "gaps" value for ring 2 of LAI-2000
LAI2000_gaps3: the mean "gaps" value for ring 3 of LAI-2000
LAI2000_gaps4: the mean "gaps" value for ring 4 of LAI-2000
LAI2000_gaps5: the mean "gaps" value for ring 5 of LAI-2000
BA_pine: basal area of Scots pine (Pinus sylvestris) [m^2/ha]
BA_spruce: basal area of Norway spruce (Picea abies) [m^2/ha]
BA_birch: basal area of silver birch (Betula pendula) and other broadleaf species [m^2/ha]
dbh: mean diameter at breast height (1.3 m) [cm]
treeheight: mean tree height [m]
crownbase: mean height to bottom of crown (crown base) [m]
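As an illustration of the column layout above, the plot table can be parsed with plain Python. The two data rows below are hypothetical values, and the whitespace delimiter is an assumption; check forestdata.txt itself:

```python
# Hypothetical excerpt following the column order listed above;
# the whitespace delimiter is an assumption.
sample = (
    "ID Easting_UTM35N Northing_UTM35N LAI_effective "
    "LAI2000_gaps1 LAI2000_gaps2 LAI2000_gaps3 LAI2000_gaps4 LAI2000_gaps5 "
    "BA_pine BA_spruce BA_birch dbh treeheight crownbase\n"
    "P01 356000 6860000 2.8 0.40 0.35 0.30 0.25 0.20 18.0 4.0 2.0 21.5 18.2 9.1\n"
    "P02 356400 6860300 3.1 0.38 0.33 0.29 0.24 0.19 10.0 12.0 3.5 19.0 17.0 8.0\n"
)

header, *rows = sample.splitlines()
cols = header.split()
plots = [dict(zip(cols, r.split())) for r in rows]

# Total basal area per plot: pine + spruce + birch [m^2/ha]
total_ba = {
    p["ID"]: sum(float(p[k]) for k in ("BA_pine", "BA_spruce", "BA_birch"))
    for p in plots
}
```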


Majasalmi, Titta; Rautiainen, Miina. 2020. "Dataset of tree canopy structure and variation in understory composition in a boreal forest site." Data in Brief, 30: 105573.

Markiet, Vincent; Mõttus, Matti. 2020. "Estimation of boreal forest floor reflectance from airborne hyperspectral data of coniferous forests." Remote Sensing of Environment, 249: 112018. DOI: 10.1016/j.rse.2020.112018.

Markiet, Vincent; Hernández-Clemente, Rocío; Mõttus, Matti. 2017. "Spectral Similarity and PRI Variations for a Boreal Forest Stand Using Multi-angular Airborne Imagery." Remote Sensing, 9(10): 1005. DOI: 10.3390/rs9101005.

Data license: Creative Commons Attribution 4.0 International (CC BY 4.0)


With the advancement of sensor technology, huge amounts of data are being collected by various satellites. Hence, the task of target-based data retrieval and acquisition has become exceedingly challenging. Existing satellites essentially scan a vast overlapping region of the Earth using various sensing techniques, such as multispectral, hyperspectral, Synthetic Aperture Radar (SAR), video, and compressed sensing, to name a few.


A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images

Email the authors for any queries.


Classes in this dataset:

* Baseball diamond
* Golf course
* Mobile home park
* Parking lot
* Storage tank
* Tennis court

The paper is also available on arXiv: A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images


Feel free to cite the authors if this work is of any help to you:


```
@InProceedings{Chaudhuri_2020_EoC,
  author    = {Chaudhuri, Ushasi and Banerjee, Biplab and Bhattacharya, Avik and Datcu, Mihai},
  title     = {A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images},
  booktitle = {},
  month     = {Aug},
  year      = {2020}
}
```



DREAM (Data Range for EArth Monitoring): a multimodal database including optics, radar, DEM, and OSM labels for deep machine learning purposes.

DREAM is a multimodal remote sensing database developed from open-source data.

The database has been created using the Google Earth Engine platform, the GDAL Python library, and the "pyosm" Python package developed by Alexandre Mayerowitz (Airbus, France). It includes two subsets:

France, on a 10 m x 10 m UTM grid:


The two subsets are stored in two separate zip files. After decompression, each directory contains subdirectories covering different areas. Each available tile is a 1024x1024 GeoTiff.

In France:

CoupleZZ_S2_date1_date2_XX_YY: UInt16 GeoTiff, UTM, RGB
CoupleZZ_SRTM_V2_XX_YY: Int16 GeoTiff
CoupleZZ_S1_date2_date1_XX_YY: Float32 GeoTiff, 2 bands, Red: VV, Green: HV
CoupleZZ_S1moy_date2__dual_XX_YY: Float32 GeoTiff, 2 bands, Red: VV, Green: HV
CoupleZZ_OSMraster_XX_YY: UInt8 GeoTiff, 3 bands, RGB




Networked detector systems can be deployed in urban environments to aid in the detection and localization of radiological and/or nuclear material. However, effectively responding to and interpreting a radiological alarm using spectroscopic data alone may be hampered by a lack of situational awareness, particularly in complex environments.


As part of the 2018 IEEE GRSS Data Fusion Contest, the Hyperspectral Image Analysis Laboratory and the National Center for Airborne Laser Mapping (NCALM) at the University of Houston are pleased to release a unique multi-sensor optical geospatial dataset representing a challenging urban land-cover/land-use classification task. The data were acquired by NCALM over the University of Houston campus and its neighborhood on February 16, 2017, between 16:31 and 18:18 GMT.


Data files, as well as training and testing ground truth are provided in the enclosed zip file.


BTH Trucks in Aerial Images Dataset contains videos of 17 flights across two industrial harbors' parking spaces over two years.


If you use these provided data in a publication or a scientific paper, please cite the dataset accordingly.


The data here support the retrieval results presented in the paper submitted to IEEE Transactions on Geoscience and Remote Sensing, 'First TROPOMI Retrieval of Aerosol Effective Height using O4 Absorption Band at 477 nm and Aerosol Classification'.

The aerosol effective height (AEH) was retrieved from TROPOMI measurements based on the O4 absorption band at 477 nm. The AEHs were retrieved over Northeast Asia, South Africa, and the Sahara Desert for selected case-study periods.


This dataset contains RF signals from drone remote controllers (RCs) of different makes and models. The RF signals transmitted by the drone RCs to communicate with the drones are intercepted and recorded by a passive RF surveillance system, which consists of a high-frequency oscilloscope, directional grid antenna, and low-noise power amplifier. The drones were idle during the data capture process. All the drone RCs transmit signals in the 2.4 GHz band. There are 17 drone RCs from eight different manufacturers and ~1000 RF signals per drone RC, each spanning a duration of 0.25 ms. 


The dataset contains ~1000 RF signals in .mat format from the remote controllers (RCs) of the following drones:

  • DJI (5): Inspire 1 Pro, Matrice 100, Matrice 600*, Phantom 4 Pro*, Phantom 3 
  • Spektrum (4): DX5e, DX6e, DX6i, JR X9303
  • Futaba (1): T8FG
  • Graupner (1): MC32
  • HobbyKing (1): HK-T6A
  • FlySky (1): FS-T6
  • Turnigy (1): 9X
  • Jeti Duplex (1): DC-16.

In the dataset, there are two pairs of RCs for the drones indicated by an asterisk above, making a total of 17 drone RCs. Each RF signal contains 5 million samples and spans a time period of 0.25 ms. 

The scripts provided with the dataset define a class to create drone RC objects and build a database of those objects, as well as a database in table format with all the available information, such as make, model, raw RF signal, and sampling frequency. The scripts also include functions to visualize the data and extract a few example features from the raw RF signals (e.g., the transient signal start point). Instructions for using the scripts are included at the top of each script and can also be viewed by typing help scriptName in the MATLAB command window.
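The dataset's feature-extraction scripts are in MATLAB. As a language-neutral illustration of the transient-start-point idea mentioned above, here is a Python sketch on a synthetic trace; the simple sliding-RMS energy-threshold detector is an assumption for illustration, not the exact method implemented in the dataset scripts:

```python
import numpy as np

rng = np.random.default_rng(0)

# The 5 million samples over 0.25 ms quoted above imply a 20 GS/s rate;
# a much shorter synthetic trace is used here for illustration.
fs = 20e9
n = 5_000
start_true = 2_000  # hypothetical burst start sample

# Noise-only segment followed by a 2.4 GHz burst (the band used by all RCs)
signal = 0.01 * rng.standard_normal(n)
t = np.arange(n - start_true) / fs
signal[start_true:] += np.sin(2 * np.pi * 2.4e9 * t)

# Sliding-RMS energy detector: flag the first window whose RMS exceeds
# a multiple of the noise floor estimated from the leading samples
win = 50
rms = np.sqrt(np.convolve(signal ** 2, np.ones(win) / win, mode="valid"))
noise_floor = rms[:1000].mean()
start_est = int(np.argmax(rms > 10 * noise_floor))
```

With the real data, one would load a .mat file (e.g., via scipy.io.loadmat) and apply the same idea to the recorded trace.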

The drone RC RF dataset was used in the following papers:

  • M. Ezuma, F. Erden, C. Kumar, O. Ozdemir, and I. Guvenc, "Micro-UAV detection and classification from RF fingerprints using machine learning techniques," in Proc. IEEE Aerosp. Conf., Big Sky, MT, Mar. 2019, pp. 1-13.
  • M. Ezuma, F. Erden, C. K. Anjinappa, O. Ozdemir, and I. Guvenc, "Detection and classification of UAVs using RF fingerprints in the presence of Wi-Fi and Bluetooth interference," IEEE Open J. Commun. Soc., vol. 1, no. 1, pp. 60-79, Nov. 2019.
  • E. Ozturk, F. Erden, and I. Guvenc, "RF-based low-SNR classification of UAVs using convolutional neural networks." arXiv preprint arXiv:2009.05519, Sept. 2020.

Other details regarding the dataset and data collection and processing can be found in the above papers and attached documentation.  


Author Contributions:

  • Experiment design: O. Ozdemir and M. Ezuma
  • Data collection:  M. Ezuma
  • Scripts: F. Erden and C. K. Anjinappa
  • Documentation: F. Erden
  • Supervision, revision, and funding: I. Guvenc 



This work was supported in part by NASA through the Federal Award under Grant NNX17AJ94A, and in part by NSF under CNS-1939334 (AERPAW, one of NSF's Platforms for Advanced Wireless Research (PAWR) projects).