This dataset provides researchers with a benchmark for developing applicable and adaptive harbor detection algorithms.


SI-STSAR-7 is a labeled spatiotemporal dataset for sea ice classification based on SAR images. It is produced from 80 Sentinel-1 A/B SAR scenes acquired over Hudson Bay during two freezing periods, from October 2019 to May 2020 and from October 2020 to April 2021, provided by the Copernicus Open Access Hub. The Sentinel-1 SAR images were preprocessed with noise reduction and incidence-angle dependence correction before use.

Instructions: 

Acknowledgement

SI-STSAR-7 is based on Copernicus Sentinel-1 data provided by the European Commission and the European Space Agency (ESA), and on weekly regional ice charts and ice data provided by the Canadian Ice Service (CIS).

  1. If we infringe on the rights of the European Commission, ESA or CIS, please contact us immediately and we will remove the affected material promptly.
  2. The ownership and copyright of the SI-STSAR-7 dataset belong to Shanghai Ocean University (SHOU). The dataset is distributed freely, but those who use the dataset must also comply with the relevant data usage agreements of the European Commission, ESA and CIS.
  3. Please cite the DOI: 10.21227/d6kp-s174 if you use this dataset in any form in publications. You may not redistribute our material without our written permission.

This is an image dataset for satellite image processing, consisting of thermal infrared and multispectral images.

Instructions: 

Dataset images: thermal infrared and multispectral images
Image size: 512x512
Image format: .tiff
File format: .h5
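
As a minimal inspection sketch (not part of the dataset documentation; the file names below are placeholders, and the internal HDF5 layout is not specified above, so it is only listed), the images can be opened with the tifffile and h5py packages:

```python
# Placeholder file names; the real archive layout is not documented above.
import h5py
import tifffile

img = tifffile.imread("example_thermal.tiff")          # expected 512x512 image
print("TIFF shape:", img.shape, "dtype:", img.dtype)

with h5py.File("example.h5", "r") as f:
    # Walk the HDF5 file and print the name and shape of every object it holds
    f.visititems(lambda name, obj: print(name, getattr(obj, "shape", "")))
```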


In a context of rapid urban evolution, there is a need to survey cities. Predictive models based on machine learning require large amounts of training data, hence the need for public datasets that allow urban evolution to be tracked. While most changes occur along the vertical axis, there is as yet no public change detection dataset composed of 3D point clouds directly annotated with changes at the point level.

Instructions: 

Urban Point Clouds simulator

We have developed a simulator to generate time series of point clouds (PCs) for urban datasets. Given a 3D model of a city, the simulator allows us to introduce random changes in the model and generates a synthetic aerial LiDAR survey (ALS) above the city. In practice, the 3D model is derived from a real city, e.g. with a Level of Detail 2 (LoD2) precision. From this model, we extract each existing building as well as the ground. By adding or removing buildings in the model, we can simulate the construction or demolition of buildings. Notice that, depending on the area, the ground is not necessarily flat. The simulator allows us to obtain as many 3D PCs over changed urban areas as needed. This is especially useful for supervised deep learning approaches that require large amounts of training data. Moreover, the created PCs are all directly annotated by the simulator according to the changes, so no time-consuming manual annotation is needed with this process.

For each obtained model, the ALS simulation is performed using a flight plan and ray tracing with the Visualization Toolkit (VTK) Python library. The spacing between flight lines is computed according to predefined parameters such as resolution, overlap between swaths, and scanning angle. Following this computation, a flight plan is set with a random starting position and flight direction in order to introduce more variability between two acquisitions. Moreover, Gaussian noise can be added to simulate errors and lack of precision in LiDAR range measurement and scan direction.
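
For illustration only (this is not the authors' simulator code), the snippet below sketches how a single LiDAR pulse can be cast against a city mesh with VTK's OBB tree, with Gaussian range noise added as described above; the PLY reader and file name are hypothetical.

```python
# Sketch of one simulated LiDAR echo via ray casting with VTK (assumptions noted above).
import numpy as np
import vtk

def cast_ray(obb_tree, origin, direction, max_range=2000.0, range_sigma=0.05):
    """Return the first intersection of a ray with the scene, with Gaussian range noise."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    end = origin + max_range * direction
    hits, cell_ids = vtk.vtkPoints(), vtk.vtkIdList()
    if obb_tree.IntersectWithLine(origin.tolist(), end.tolist(), hits, cell_ids) == 0:
        return None                                  # no echo for this pulse
    point = np.array(hits.GetPoint(0))               # first surface hit along the ray
    return point + np.random.normal(0.0, range_sigma) * direction

# Usage (hypothetical city model file):
# reader = vtk.vtkPLYReader(); reader.SetFileName("lyon_lod2.ply"); reader.Update()
# tree = vtk.vtkOBBTree(); tree.SetDataSet(reader.GetOutput()); tree.BuildLocator()
# echo = cast_ray(tree, origin=[0.0, 0.0, 1000.0], direction=[0.0, 0.0, -1.0])
```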

Dataset Description

To conduct fair qualitative and quantitative evaluation of PC change detection techniques, we have built several datasets based on LoD2 models of the first and second districts of Lyon (https://geo.data.gouv.fr/datasets/0731989349742867f8e659b4d70b707612bece89), France. For each simulation, buildings have been added or removed to introduce changes in the model and to generate a large number of pairs of PCs. We also consider various initial states across simulations, and randomly update the set of buildings from the first date through random additions or deletions to create the second landscape. In addition, the flight starting position and direction are always set randomly. As a consequence, the acquisition patterns will not be the same between generated PCs, so each acquisition may not have exactly the same visible or hidden parts.

From terrestrial LiDAR surveys to photogrammetric acquisition from satellite images, there exist many different types of sensors and acquisition pipelines for obtaining 3D point clouds of urban areas, resulting in PCs with different characteristics. By providing different acquisition parameters to our simulator, our goal was to provide a variety of sub-datasets with heterogeneous qualities, to reproduce the real variability of LiDAR sensors or to mimic datasets coming from a photogrammetric pipeline with satellite images (by using a tight scan angle with high noise). Thus, we generated the following sub-datasets:

  • ALS with low resolution, low noise for both dates
  • ALS with high resolution, low noise for both dates
  • ALS with low resolution, high noise for both dates
  • ALS with low resolution, high noise, tight scan angle (mimicking photogrammetric acquisition from satellite images) for both dates
  • Multi-sensor data, with low resolution, high noise at date 1, and high resolution, low noise at date 2

Notice that sub-datasets 3 and 4 are quite similar, but the latter contains fewer visible facades, due to the smaller scanning angle and overlap percentage.

Finally, for the first configuration (ALS with low resolution, low noise), we provide the three following training sets:

  • Small training set: 1 simulation
  • Normal training set: 10 simulations
  • Large training set: 50 simulations

More details about the acquisition configurations are provided in the documentation file and in the publication "Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets", de Gélis et al. (2021).

Technical details

All PCs are available in PLY format. Each train, val, and test folder contains sub-folders containing pairs of PCs: pointCloud0.ply and pointCloud1.ply for the first and second dates, respectively.

Each PLY file contains the X Y Z coordinates of each point and a label:

  • 0 for unchanged points
  • 1 for points on a new building
  • 2 for points on a demolition (destroyed building).

The label is given in a scalar field named label_ch. Notice that the first PC (pointCloud0.ply) also has a label field, but it is set to 0 for every point, because changes are defined with respect to the previous date.
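
A minimal reading sketch (assuming the plyfile package; the vertex property names x, y, z and label_ch follow the description above):

```python
# Load a pair of point clouds and their per-point change labels (0/1/2, see above).
import numpy as np
from plyfile import PlyData

def load_pc(path):
    vertex = PlyData.read(path)["vertex"]
    xyz = np.column_stack([vertex["x"], vertex["y"], vertex["z"]])
    return xyz, np.asarray(vertex["label_ch"])

xyz0, labels0 = load_pc("pointCloud0.ply")   # first date: labels are all 0
xyz1, labels1 = load_pc("pointCloud1.ply")   # 0 = unchanged, 1 = new building, 2 = demolition
```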

Citation

If you use this dataset for your work, please use the following citation:

@article{degelis2021change,
  title     = {Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets},
  author    = {{de G\'elis}, I. and Lef\`evre, S. and Corpetti, T.},
  journal   = {Remote Sensing},
  volume    = {13},
  pages     = {2629},
  year      = {2021},
  publisher = {Multidisciplinary Digital Publishing Institute}
}

For more details: https://www.mdpi.com/2072-4292/13/13/2629


The files here support the analysis presented in the Comment on “Study of Systematic Bias in Measuring Surface Deformation With SAR Interferometry” by Ansari et al. (2021), published in IEEE Transactions on Geoscience and Remote Sensing [1]. In particular, we provide below the instructions to access the multilook interferogram sequences exploited in our Comment, together with their ancillary information.

 

This dataset contains modified Copernicus Sentinel data [2021].

 

Instructions: 

The overall dataset is composed of three multilook interferometric sequences retrieved by processing, through the P-SBAS processing chain described in [2], the 230 Sentinel-1 images acquired from ascending orbits (track 44) over Sicily (Southern Italy) between May 2016 and May 2020. 

The interferometric datasets, hereafter referred to as “short-time”, “medium-time” and “long-time”, are each encapsulated in a separate zip file. Instructions on how to manage the datasets are provided in the attached PDF file.


The InSAR processing results, GPS results, gray model predictions, and Gray-Markov model predictions.



The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near-infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station, which hosts SMEAR II (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E), on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution is 0.6 m.

Instructions: 

SPYSTUF hyperspectral data

Authors:
Matti Mõttus, Vincent Markiet, Rocío Hernández-Clemente, Viljami Perheentupa, Titta Majasalmi

The SPYSTUF hyperspectral dataset contains high spatial and spectral resolution data acquired with the Aisa Eagle II (visible to near-infrared, 400-900 nm) airborne imaging spectrometer above the Hyytiälä forest research station, which hosts SMEAR II (Station for Measuring Ecosystem-Atmosphere Relations, 61°50' N, 24°17' E), on 3 July 2015. The spectral resolution of the data is 4.6 nm, and the spatial resolution is 0.6 m. The data are partly multiangular, with the sensor tilted 30° off-nadir for two flight lines, resulting in measurements with angles between the sensor and sun directions of 19° (closest to hotspot), 55° (nadir) and 76° (dark spot). The data are processed to top-of-canopy geolocated reflectance factors and mosaicked. All mosaicked data were collected with the sensor pointing approximately nadir. The hyperspectral imagery is accompanied by data on basic forest variables and optical LAI from 20 plots in the image area, determined within approximately one week of the airborne acquisition.

The data were obtained between 10:44 and 12:20 (GMT+3) at approximately 1 km altitude above the ground, with flight lines flown consecutively in the northwestern and southeastern directions to minimize BRF effects. The Aisa Eagle II sensor had a field of view (FOV) of 37.5° divided between 1024 pixels. The average solar zenith angle was 48°, and the photosynthetic photon flux density ranged from 1285 to 1493 μmol/(m^2*s) with a mean value of 1408 μmol/(m^2*s) (SMEAR II measurement data above the forest). The weather conditions were optimal for an airborne hyperspectral acquisition, with a clear blue sky.
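
As a rough geometric cross-check (assuming flat terrain and a nadir-pointing sensor), the stated 37.5° FOV, 1024 across-track pixels and ~1 km flying height are consistent with the ~0.6 m spatial resolution given above:

```python
import math

altitude_m = 1000.0   # approximate flying height above ground
fov_deg = 37.5        # Aisa Eagle II field of view
n_pixels = 1024       # across-track pixels

swath_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
pixel_m = swath_m / n_pixels
print(f"swath ≈ {swath_m:.0f} m, across-track sample ≈ {pixel_m:.2f} m")
# Prints roughly: swath ≈ 679 m, across-track sample ≈ 0.66 m
```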

The collection and processing of the dataset was largely funded by the Academy of Finland (project SPYSTUF, PI M. Mõttus, grants 266152, 272989 and 303633). All authors were affiliated with the University of Helsinki, Finland, during the data acquisition. Data processing was mostly performed by Vincent Markiet and Matti Mõttus at VTT Technical Research Centre of Finland.

The multiangular data are described in detail in the publication (open access) by Markiet et al. (2017)
The mosaic is described in the publication (open access) by Markiet & Mõttus (2020)

Additional data on the imaged forests are available from external sources, e.g.

* SMEAR II weather and flux data are available via the SmartSMEAR system: https://smear.avaa.csc.fi

* USGS provides EO-1 Hyperion imagery coincident with the airborne data, centered on SMEAR II: https://earthexplorer.usgs.gov
REQUEST_ID = "1890172015184_20001"
ACQUISITION_DATE = 2015-07-03
START_TIME = 2015 184 08:26:46
END_TIME = 2015 184 08:31:05

* Dataset of tree canopy structure and understory composition obtained two years earlier, https://data.mendeley.com/datasets/dyt4nkp583/1
Majasalmi, T., & Rautiainen, M. (2020). Dataset of tree canopy structure and variation in understory composition in a boreal forest site. Data in Brief, 30, [105573]. https://doi.org/10.1016/j.dib.2020.105573
Data identification number: 10.17632/dyt4nkp583.1

Files in the project:

20150703_mosaic: The hyperspectral mosaic data (BSQ format, 16778 samples, 16255 lines, 128 bands, 16-bit signed integer), reflectance factor multiplied by 10,000
20150703_mosaic.hdr: ENVI header file for 20150703_mosaic
forestdata.txt: forest plot data, see below for detailed contents
line01_20150703atm06mFnnGeo: off-nadir image (one flight line), in the dark spot direction (angle between sensor and sun directions 76°), reflectance factor multiplied by 10,000
line01_20150703atm06mFnnGeo.hdr: ENVI header for line01_20150703atm06mFnnGeo
line02_20150703atm06mFnnGeo: off-nadir image (one flight line), close to the hotspot direction (angle between sensor and sun directions 19°), reflectance factor multiplied by 10,000
line02_20150703atm06mFnnGeo.hdr: ENVI header for line02_20150703atm06mFnnGeo
markiet2017.pdf: the paper by Markiet et al. (2017) describing the multiangular data
markiet2020.pdf: the paper by Markiet and Mõttus (2020) describing the image mosaic
README.txt: this file
SPYSTUF_hyperspectral_preview.pgw: geographic information for SPYSTUF_hyperspectral_preview.png
SPYSTUF_hyperspectral_preview.png: PNG preview of the image with forest plots

Geographic projections:
All data are projected to UTM35N. See the ENVI header files for details.
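
A minimal loading sketch (assuming the Spectral Python package; the BSQ/ENVI layout and the scale factor of 10,000 follow the file descriptions above):

```python
# Open the mosaic through its ENVI header and convert digital numbers to reflectance factors.
import spectral

img = spectral.open_image("20150703_mosaic.hdr")   # 16255 lines x 16778 samples x 128 bands, int16
print(img.shape)

# Read a small window and rescale: stored values are reflectance factor * 10,000
window = img[0:512, 0:512, :].astype("float32") / 10000.0
```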

Forest data in the columns of forestdata.txt:
ID: plot ID (string)
Easting_UTM35N: Easting in UTM35N projected coordinate system [units: m]
Northing_UTM35N: Northing in UTM35N projected coordinate system [m]
LAI_effective: the effective (optical) LAI of the plot as determined with LAI-2000 using a modified "VALERI cross" sampling design: eight measurement points, one in each cardinal direction at four and at eight meters distance from the plot center point. See Majasalmi & Rautiainen (2020) for details.
LAI2000_gaps1: the mean "gaps" value for ring 1 (zenith) of LAI-2000
LAI2000_gaps2: the mean "gaps" value for ring 2 of LAI-2000
LAI2000_gaps3: the mean "gaps" value for ring 3 of LAI-2000
LAI2000_gaps4: the mean "gaps" value for ring 4 of LAI-2000
LAI2000_gaps5: the mean "gaps" value for ring 5 of LAI-2000
BA_pine: basal area of Scots pine (Pinus sylvestris) [m^2/ha]
BA_spruce: basal area of Norway spruce (Picea abies) [m^2/ha]
BA_birch: basal area of silver birch (Betula pendula) and other broadleaf species [m^2/ha]
dbh: mean diameter at breast height (1.3 m) [cm]
treeheight: mean tree height [m]
crownbase: mean height to bottom of crown (crown base) [m]
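
A minimal sketch for loading the plot table with pandas (the column separator of forestdata.txt is not specified above, so whitespace separation is an assumption):

```python
import pandas as pd

# Column names follow the description above; adjust sep if the file is not whitespace-delimited.
plots = pd.read_csv("forestdata.txt", sep=r"\s+")
print(plots[["ID", "LAI_effective", "BA_pine", "BA_spruce", "treeheight"]].head())
```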

References:

Majasalmi, Titta; Rautiainen, Miina. 2020. "Dataset of tree canopy structure and variation in understory composition in a boreal forest site" Data in Brief, 30: 105573. https://doi.org/10.1016/j.dib.2020.105573

Markiet, Vincent; Mõttus, Matti. 2020. "Estimation of boreal forest floor reflectance from airborne hyperspectral data of coniferous forests" Remote Sensing of Environment 249: 112018, DOI:10.1016/j.rse.2020.112018, https://www.sciencedirect.com/science/article/pii/S0034425720303886

Markiet, Vincent; Hernández-Clemente, Rocío; Mõttus, Matti. 2017. "Spectral Similarity and PRI Variations for a Boreal Forest Stand Using Multi-angular Airborne Imagery" Remote Sens. 9, no. 10: 1005, DOI:10.3390/rs9101005, https://www.mdpi.com/2072-4292/9/10/1005

Data license: Creative Commons Attribution 4.0 International (CC BY 4.0)


With the advancement of sensor technology, huge amounts of data are being collected from various satellites. Hence, the task of target-based data retrieval and acquisition has become exceedingly challenging. Existing satellites essentially scan a vast overlapping region of the Earth using various sensing techniques, such as multispectral, hyperspectral, Synthetic Aperture Radar (SAR), video, and compressed sensing, to name a few.

Instructions: 

A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images

Email the authors at ushasi@iitb.ac.in for any query.

 

Classes in this dataset:

  • Airplane
  • Baseball Diamond
  • Buildings
  • Freeway
  • Golf Course
  • Harbor
  • Intersection
  • Mobile home park
  • Overpass
  • Parking lot
  • River
  • Runway
  • Storage tank
  • Tennis court

Paper

The paper is also available on ArXiv: A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images

 

Feel free to cite the authors if this work is of any help to you:

 

```
@InProceedings{Chaudhuri_2020_EoC,
  author    = {Chaudhuri, Ushasi and Banerjee, Biplab and Bhattacharya, Avik and Datcu, Mihai},
  title     = {A Zero-Shot Sketch-based Inter-Modal Object Retrieval Scheme for Remote Sensing Images},
  booktitle = {http://arxiv.org/abs/2008.05225},
  month     = {Aug},
  year      = {2020}
}
```

 


