Remote sensing of environment research has explored the benefits of synthetic aperture radar (SAR) imaging systems for a wide range of land and marine applications, since these systems are not affected by weather conditions and can therefore operate both day and night. The design of image processing techniques for SAR applications requires testing and validation on real and synthetic images. The GRSS benchmark database supports the design and analysis of algorithms for SAR and PolSAR data.

Citation Author(s): 
Nobre, R. H.; Rodrigues, F. A. A.; Rosa, R.; Medeiros, F.N.; Feitosa, R., Estevão, A.A., Barros, A.S.

This dataset contains satellite images of areas of interest surrounding 30 different European airports. It also provides ground-truth annotations of flying airplanes in a subset of those images to support future research involving flying airplane detection. This dataset is part of the work entitled "Measuring economic activity from space: a case study using flying airplanes and COVID-19" published in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. It contains modified Sentinel-2 data processed by Euro Data Cube.

Instructions: 

Details regarding dataset collection and usage are provided at https://github.com/maups/covid19-custom-script-contest


This dataset was created for ocean front evolution trend recognition and tracking.


This dataset provides researchers with a benchmark for developing applicable and adaptive harbor detection algorithms.


Sea ice classification spatiotemporal dataset.



This is an image dataset for satellite image processing, consisting of a collection of thermal infrared and multispectral images.

Instructions: 

Dataset images: thermal infrared images and multispectral images
Image size: 512x512
Image format: .tiff
File format: .h5
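
As a quick check after download, the images and HDF5 files can be opened with standard Python libraries. The sketch below is illustrative only; the file names are placeholders, since the actual file naming is not specified here.

# Minimal sketch for inspecting one image and one HDF5 file from the dataset.
# "example_image.tiff" and "example_file.h5" are placeholder names.
import h5py
import tifffile

image = tifffile.imread("example_image.tiff")   # 512x512 thermal IR or multispectral image
print(image.shape, image.dtype)

with h5py.File("example_file.h5", "r") as f:    # list the groups/datasets stored in the .h5 file
    f.visit(print)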


The data provided correspond to the open-source code and reference images from a computer interface for real-time gait biofeedback using a wearable integrated sensor system for data acquisition. These data are the supplementary material of the study titled "Computer Interface for Real-time Gait Biofeedback using a Wearable Integrated Sensor System for Data Acquisition", accepted for publication in the IEEE Transactions on Human-Machine Systems journal (June 2021).


In a context of rapid urban evolution, there is a need to survey cities. Predictive models based on machine learning require large amounts of training data, hence the need for public datasets that allow urban evolution to be followed. While most changes occur along the vertical axis, there is as yet no public change detection dataset composed of 3D point clouds and directly annotated at the point level according to change.

Instructions: 

Urban Point Clouds simulator

We have developed a simulator to generate time series of point clouds (PCs) for urban datasets. Given a 3D model of a city, the simulator allows us to introduce random changes in the model and generates a synthetic aerial LiDAR survey (ALS) above the city. In practice, the 3D model is derived from a real city, e.g. with a Level of Detail 2 (LoD2) precision. From this model, we extract each existing building as well as the ground. By adding or removing buildings in the model, we can simulate the construction or demolition of buildings. Notice that, depending on the area, the ground is not necessarily flat. The simulator allows us to obtain as many 3D PCs over changed urban areas as needed. It is especially useful for supervised deep learning approaches that require large amounts of training data. Moreover, the created PCs are all directly annotated by the simulator according to the changes, so no time-consuming manual annotation is needed with this process.

For each obtained model, the ALS simulation is performed using a flight plan and ray tracing with the Visualisation ToolKit (VTK) Python library. The spacing between flight lines is computed in accordance with predefined parameters such as resolution, overlap between swaths and scanning angle. Following this computation, a flight plan is set with a random starting position and direction of flight in order to introduce more variability between two acquisitions. Moreover, Gaussian noise can be added to simulate errors and lack of precision in LiDAR range measurement and scan direction.
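
As an illustration of how these parameters interact, the sketch below shows one plausible way to derive flight-line spacing from altitude, scan angle and swath overlap, and to add Gaussian noise to simulated points. The formula, parameter values and function names are illustrative assumptions, not the actual simulator code.

import numpy as np

def flight_line_spacing(altitude_m, scan_angle_deg, overlap):
    # Swath width of a scanner with full scan angle `scan_angle_deg` flown at
    # `altitude_m`; adjacent lines are spaced so swaths overlap by `overlap` (0-1).
    swath_width = 2.0 * altitude_m * np.tan(np.radians(scan_angle_deg / 2.0))
    return swath_width * (1.0 - overlap)

def add_acquisition_noise(points, sigma_m, rng=None):
    # Isotropic Gaussian noise standing in for range and scan-direction errors.
    rng = np.random.default_rng() if rng is None else rng
    return points + rng.normal(scale=sigma_m, size=points.shape)

spacing = flight_line_spacing(altitude_m=500.0, scan_angle_deg=30.0, overlap=0.2)
noisy_points = add_acquisition_noise(np.zeros((1000, 3)), sigma_m=0.05)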

Dataset Description

To conduct fair qualitative and quantitative evaluation of PC change detection techniques, we have built datasets based on LoD2 models of the first and second districts of Lyon (https://geo.data.gouv.fr/datasets/0731989349742867f8e659b4d70b707612bece89), France. For each simulation, buildings have been added or removed to introduce changes in the model and to generate a large number of pairs of PCs. We also consider various initial states across simulations, and randomly update the set of buildings from the first date through random addition or deletion of buildings to create the second landscape. In addition, the flight starting position and direction are always set randomly. As a consequence, the acquisition patterns are not the same between generated PCs, so each acquisition may not have exactly the same visible or hidden parts.

From terrestrial LiDAR surveying to photogrammetric acquisition from satellite images, there exist many different types of sensors and acquisition pipelines to obtain 3D point clouds of urban areas, resulting in PCs with different characteristics. By providing different acquisition parameters to our simulator, our goal was to provide a variety of sub-datasets with heterogeneous qualities to reproduce the real variability of LiDAR sensors or to mimic datasets coming from a photogrammetric pipeline with satellite images (by using a tight scan angle with high noise). Thus, we generated the following sub-datasets:

  • ALS with low resolution, low noise for both dates
  • ALS with high resolution, low noise for both dates
  • ALS with low resolution, high noise for both dates
  • ALS with low resolution, high noise, tight scan angle (mimicking photogrammetric acquisition from satellite images) for both dates
  • Multi-sensor data, with low resolution, high noise at date 1, and high resolution, low noise at date 2

Notice that sub-datasets 3 and 4 are quite similar, but the latter provides fewer visible facades due to the smaller scanning angle and overlap percentage.

Finally, for the first configuration (ALS with low resolution, low noise), we provide the three following training sets:

  • Small training set: 1 simulation
  • Normal training set: 10 simulations
  • Large training set: 50 simulations

More details about the acquisition configuration are provided in the documentation file and in the publication "Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets", de Gélis et al. (2021).

Technical details

All PCs are available in PLY format. Each of the train, val and test folders contains sub-folders, each holding a pair of PCs: pointCloud0.ply and pointCloud1.ply for the first and second dates, respectively.

Each PLY file contains the X Y Z coordinates of each point and a label:

  • 0 for unchanged points
  • 1 for points on a new building
  • 2 for points on a demolished building.

The label is given in a scalar field named label_ch. Notice that the first PC (pointCloud0.ply) also has a label field, even though it is set to 0 for every point, because changes are defined relative to the previous date.
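
A minimal sketch for loading a pair of point clouds and their change labels, assuming the vertex properties are named x, y, z and label_ch as described above (here using the plyfile package; any PLY reader works):

import numpy as np
from plyfile import PlyData

def load_pc(path):
    # Read one PLY file and return the point coordinates and change labels.
    vertex = PlyData.read(path)["vertex"]
    xyz = np.column_stack([vertex["x"], vertex["y"], vertex["z"]])
    return xyz, np.asarray(vertex["label_ch"])

xyz0, labels0 = load_pc("pointCloud0.ply")  # reference date, labels all 0
xyz1, labels1 = load_pc("pointCloud1.ply")  # 0 = unchanged, 1 = new building, 2 = demolition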

Citation

If you use this dataset for your work, please use the following citation:

@article{degelis2021change,
  title={Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets},
  author={{de G\'elis}, I. and Lef\`evre, S. and Corpetti, T.},
  journal={Remote Sensing},
  volume={13},
  pages={2629},
  year={2021},
  publisher={Multidisciplinary Digital Publishing Institute}
}


The proposed dataset, termed PC-Urban (Urban Point Cloud), is captured with a 64-channel Ouster LiDAR sensor. The sensor is installed on an SUV that drives through downtown Perth, Western Australia, Australia. The dataset comprises over 4.3 billion points captured across 66K sensor frames. The labelled data are organized as registered and raw point cloud frames, where the former combine varying numbers of registered consecutive frames. We provide 25 class labels in the dataset, covering 23 million points and 5K instances.


The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station, which hosts the SMEAR II station (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E), on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution 0.6 m.

Instructions: 

SPYSTUF hyperspectral data

Authors:
Matti Mõttus, Vincent Markiet, Rocío Hernández-Clemente, Viljami Perheentupa, Titta Majasalmi

The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station, which hosts the SMEAR II station (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E), on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution 0.6 m. The data are partly multiangular, with the sensor tilted 30° off-nadir for two flight lines, resulting in measurements with angles between the directions to the sensor and the sun of 19° (closest to the hotspot), 55° (nadir) and 76° (dark spot). The data are processed to top-of-canopy geolocated reflectance factors and mosaicked. All mosaicked data were collected with the sensor pointing approximately nadir. The hyperspectral imagery is accompanied by data on basic forest variables and optical LAI from 20 plots in the image area, determined within approximately one week of the airborne acquisition.

The data were obtained between 10:44 and 12:20 (GMT+3) at approximately 1 km altitude above the ground, with flight lines flown consecutively in the northwestern and southeastern directions to minimize BRF effects. The Aisa Eagle II sensor had a field of view (FOV) of 37.5° divided between 1024 pixels. The average solar zenith angle was 48°, and the photosynthetic photon flux density ranged from 1285 to 1493 μmol/(m^2*s) with a mean value of 1408 μmol/(m^2*s) (SMEAR II measurements above the forest). The weather conditions were optimal for an airborne hyperspectral acquisition, with a clear blue sky.

The collection and processing of the dataset was largely funded by the Academy of Finland (project SPYSTUF, PI M. Mõttus, grants 266152, 272989 and 303633). All authors were affiliated with the University of Helsinki, Finland, during the data acquisition. Data processing was mostly performed by Vincent Markiet and Matti Mõttus at VTT Technical Research Centre of Finland.

The multiangular data are described in detail in the open-access publication by Markiet et al. (2017).
The mosaic is described in the open-access publication by Markiet & Mõttus (2020).

Additional data on the imaged forests are available from external sources, e.g.

* SMEAR II weather and flux data are available via the SmartSMEAR system: https://smear.avaa.csc.fi

* USGS provides EO-1 Hyperion imagery coincident with the airborne data, centered on SMEAR II: https://earthexplorer.usgs.gov
REQUEST_ID = "1890172015184_20001"
ACQUISITION_DATE = 2015-07-03
START_TIME = 2015 184 08:26:46
END_TIME = 2015 184 08:31:05

* Dataset of tree canopy structure and understory composition obtained two years earlier, https://data.mendeley.com/datasets/dyt4nkp583/1
Majasalmi, T., & Rautiainen, M. (2020). Dataset of tree canopy structure and variation in understory composition in a boreal forest site. Data in Brief, 30, [105573]. https://doi.org/10.1016/j.dib.2020.105573
Data identification number: 10.17632/dyt4nkp583.1

Files in the project:

20150703_mosaic: The hyperspectral mosaic data (BSQ format, 16778 samples, 16255 lines, 128 bands, 16-bit signed integer), reflectance factor multiplied by 10,000 (see the reading sketch after this file list)
20150703_mosaic.hdr: ENVI header file for 20150703_mosaic
forestdata.txt: forest plot data, see below for detailed contents
line01_20150703atm06mFnnGeo: off-nadir image (one flight line), in the darkspot direction (angle between sensor and sun directions 76°), reflectance factor multiplied by 10,000
line01_20150703atm06mFnnGeo.hdr: ENVI header for line01_20150703atm06mFnnGeo
line02_20150703atm06mFnnGeo: off-nadir image (one flight line), close to the hotspot direction (angle between sensor and sun directions 19°), reflectance factor multiplied by 10,000
line02_20150703atm06mFnnGeo.hdr
markiet2017.pdf: the paper by Markiet et al. (2017) describing the multiangular data
markiet2020.pdf: the paper by Markiet and Mõttus (2020) describing the image mosaic
README.txt: this file
SPYSTUF_hyperspectral_preview.pgw: geographic information for SPYSTUF_hyperspectral_preview.png
SPYSTUF_hyperspectral_preview.png: PNG preview of the image with forest plots
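
A minimal sketch for reading the mosaic, assuming the dimensions listed above, band-sequential (BSQ) ordering and no embedded header offset; in practice the ENVI header (20150703_mosaic.hdr) should be consulted, e.g. via GDAL or the spectral Python package.

import numpy as np

samples, lines, bands = 16778, 16255, 128   # from the file description above

# Memory-map the raw BSQ file (bands, lines, samples) to avoid loading the ~70 GB
# cube at once; dtype and byte order should be verified against 20150703_mosaic.hdr.
cube = np.memmap("20150703_mosaic", dtype=np.int16, mode="r",
                 shape=(bands, lines, samples))

# Spectrum of one pixel converted to reflectance factor (stored values are x 10,000)
spectrum = cube[:, 8000, 8000].astype(np.float32) / 10000.0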

Geographic projections:
All data are projected to UTM35N. See the ENVI header files for details.

Forest data in the columns of forestdata.txt (a reading sketch follows the column list):
ID: plot ID (string)
Easting_UTM35N: Easting in UTM35N projected coordinate system [units: m]
Northing_UTM35N: Northing in UTM35N projected coordinate system [m]
LAI_effective: the effective (optical) LAI of the plot as determined with LAI-2000 using a modified "VALERI cross" sampling design: eight measurement points in each cardinal direction at four and eight meters distance from the plot center point. See Majasalmi & Rautiainen (2020) for details.
LAI2000_gaps1: the mean "gaps" value for ring 1 (zenith) of LAI-2000
LAI2000_gaps2: the mean "gaps" value for ring 2 of LAI-2000
LAI2000_gaps3: the mean "gaps" value for ring 3 of LAI-2000
LAI2000_gaps4: the mean "gaps" value for ring 4 of LAI-2000
LAI2000_gaps5: the mean "gaps" value for ring 5 of LAI-2000
BA_pine: basal area of Scots pine (Pinus sylvestris) [m^2/ha]
BA_spruce: basal area of Norway spruce (Picea abies) [m^2/ha]
BA_birch: basal area of silver birch (Betula pendula) and other broadleaf species [m^2/ha]
dbh: mean diameter at breast height (1.3 m) [cm]
treeheight: mean tree height [m]
crownbase: mean height to bottom of crown (crown base) [m]
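
A minimal sketch for loading the plot data, assuming forestdata.txt is a whitespace-delimited text file with a header row matching the column names above; adjust the separator if the layout differs.

import pandas as pd

# Read the plot table and preview a few key variables described above
plots = pd.read_csv("forestdata.txt", sep=r"\s+")
print(plots[["ID", "LAI_effective", "BA_pine", "treeheight"]].head())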

References:

Majasalmi, Titta; Rautiainen, Miina. 2020. "Dataset of tree canopy structure and variation in understory composition in a boreal forest site" Data in Brief, 30: 105573. https://doi.org/10.1016/j.dib.2020.105573

Markiet, Vincent; Mõttus, Matti. 2020. "Estimation of boreal forest floor reflectance from airborne hyperspectral data of coniferous forests" Remote Sensing of Environment 249: 112018, DOI:10.1016/j.rse.2020.112018, https://www.sciencedirect.com/science/article/pii/S0034425720303886

Markiet, Vincent; Hernández-Clemente, Rocío; Mõttus, Matti. 2017. "Spectral Similarity and PRI Variations for a Boreal Forest Stand Using Multi-angular Airborne Imagery" Remote Sens. 9, no. 10: 1005, DOI:10.3390/rs9101005, https://www.mdpi.com/2072-4292/9/10/1005

Data license: Creative Commons Attribution 4.0 International (CC BY 4.0)

