Datasets
Open Access
Urb3DCD : Urban Point Clouds Simulated Dataset for 3D Change Detection
- Citation Author(s): Iris de Gélis, Sébastien Lefèvre, Thomas Corpetti
- Submitted by: Iris de Gelis
- Last updated: Tue, 05/09/2023 - 08:30
- DOI: 10.3390/rs13132629
Abstract
[NEW] Urb3DCD V2 is now available!
In a context of rapid urban evolution, there is a need to survey cities. Predictive models based on machine learning require large amounts of training data, hence the need for public datasets that allow urban evolution to be tracked. Although most changes occur along the vertical axis, no public change detection dataset composed of 3D point clouds and directly annotated with changes at point level has been available so far. The proposed dataset aims to fill this gap: 3D point clouds carry height information that is particularly useful for building change extraction, since the main modifications occur along the vertical axis. Furthermore, spectral variability of the same object over time, differences in viewing angle between acquisitions of 2D images, and perspective and distortion effects can complicate change retrieval based on 2D data. The dataset is therefore composed of bi-temporal pairs of point clouds annotated according to change. Point clouds are acquired via a simulator of aerial LiDAR surveys over dense urban areas, and changes are also introduced by the simulator.
Different sub-datasets are made available in order to provide 3D change data under different acquisition conditions, from low-resolution noisy data to higher-resolution, more precise aerial-LiDAR-like data. For each sub-dataset, training, validation and testing sets are provided.
[NEW] Urb3DCD V2
A second version of this dataset is now available. We enhanced the realism of our city models by adding vegetation and mobile objects (cars and trucks).
The change annotation now contains seven classes: unchanged, new building, demolition, new vegetation, vegetation growth, missing vegetation and mobile objects.
A mono-date semantic labelling of each point cloud is now also available.
We propose two sub-datasets in this second version:
- ALS with low resolution, low noise for both dates
- Multi-sensor data, with low resolution, high noise at date 1, and high resolution, low noise at date 2
As in V1, labels are given for each point along with its 3D coordinates (a short lookup-table sketch follows the lists below):
- 0: unchanged
- 1: new building
- 2: demolition
- 3: new vegetation
- 4: vegetation growth
- 5: missing vegetation
- 6: mobile objects
And for the mono-date semantic labels:
- 0: ground
- 1: building
- 2: vegetation
- 3: mobile objects
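The label codes above can be kept as simple lookup tables. A minimal Python sketch follows; the integer-to-name mappings are copied from the lists above, and the dictionary names are purely illustrative:

# Per-point change labels for Urb3DCD V2 (see the Technical details section for the field name).
CHANGE_LABELS = {
    0: "unchanged",
    1: "new building",
    2: "demolition",
    3: "new vegetation",
    4: "vegetation growth",
    5: "missing vegetation",
    6: "mobile objects",
}

# Mono-date semantic labels.
SEMANTIC_LABELS = {
    0: "ground",
    1: "building",
    2: "vegetation",
    3: "mobile objects",
}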
See the corresponding publication: https://www.sciencedirect.com/science/article/pii/S0924271623000394
[NEW] Urb3DCD Cls
A third sub-dataset has been created from the simulated data. The aim here is to propose pairs of PCs that each focus mainly on one type of change. The annotation is given according to the majority change in the pair of PCs, so this sub-dataset targets the multi-class change classification task. This dataset is called Urb3DCD Cls. Three splits are available for the training, validation and testing steps.
Concerning the labels, each pair is stored in the folder corresponding to its class (a small indexing sketch follows this list):
- 0: unchanged
- 1: new building
- 2: demolition
- 3: new vegetation
- 4: missing vegetation
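A small indexing sketch for Urb3DCD Cls is shown below. It assumes that each split (train/val/test) contains one sub-folder per change class and that the class can be read from the folder name; the exact naming convention should be checked against the archive and the documentation file.

import os

def index_pairs(split_dir):
    # List (pair_folder, class_folder_name) tuples for one Urb3DCD Cls split.
    pairs = []
    for class_name in sorted(os.listdir(split_dir)):
        class_dir = os.path.join(split_dir, class_name)
        if not os.path.isdir(class_dir):
            continue
        for pair_name in sorted(os.listdir(class_dir)):
            pair_dir = os.path.join(class_dir, pair_name)
            if os.path.isdir(pair_dir):
                pairs.append((pair_dir, class_name))
    return pairs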
See the corresponding publication: https://www.sciencedirect.com/science/article/pii/S0924271623000394
Urban Point Clouds simulator
We have developed a simulator to generate time series of point clouds (PCs) for urban datasets. Given a 3D model of a city, the simulator introduces random changes in the model and generates a synthetic aerial LiDAR survey (ALS) above the city. In practice, the 3D model is derived from a real city, e.g., at Level of Detail 2 (LoD2) precision. From this model, we extract each existing building as well as the ground. By adding or removing buildings in the model, we can simulate the construction or demolition of buildings. Note that, depending on the area, the ground is not necessarily flat. The simulator allows us to obtain as many 3D PCs over changed urban areas as needed, which is especially useful for supervised deep learning approaches that require large amounts of training data. Moreover, the created PCs are directly annotated by the simulator according to the changes, so no time-consuming manual annotation is needed.
For each obtained model, the ALS simulation is performed using a flight plan and ray tracing with the Visualisation ToolKit (VTK) Python library. The spacing between flight lines is computed according to predefined parameters such as resolution, overlap between swaths and scanning angle. The flight plan is then set with a random starting position and direction of flight in order to introduce more variability between two acquisitions. Moreover, Gaussian noise can be added to simulate errors and the limited precision of LiDAR range measurement and scan direction.
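As a rough illustration of the geometry involved, the sketch below computes the spacing between flight lines from a flight height, a full scan angle and a swath overlap ratio, assuming flat terrain and a nadir-looking scanner; the numeric values are placeholders, not the simulator's actual parameters.

import math

def flight_line_spacing(flight_height_m, scan_angle_deg, overlap_ratio):
    # Swath width on flat ground for a full scan angle of scan_angle_deg.
    swath_width = 2.0 * flight_height_m * math.tan(math.radians(scan_angle_deg) / 2.0)
    # Adjacent lines are shifted so that consecutive swaths overlap by overlap_ratio.
    return swath_width * (1.0 - overlap_ratio)

# Example with placeholder values:
print(flight_line_spacing(flight_height_m=500.0, scan_angle_deg=40.0, overlap_ratio=0.1))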
Dataset Description
To allow fair qualitative and quantitative evaluation of PC change detection techniques, we have built several datasets based on LoD2 models of the first and second districts of Lyon (https://geo.data.gouv.fr/datasets/0731989349742867f8e659b4d70b707612bece89), France. For each simulation, buildings have been added or removed to introduce changes in the model and to generate a large number of pairs of PCs. We also vary the initial state across simulations, and randomly add or delete buildings from the first date to create the second landscape. In addition, the flight starting position and direction are always set randomly. As a consequence, the acquisition patterns differ between generated PCs, so each acquisition may not have exactly the same visible or hidden parts.
From terrestrial LiDAR surveying to photogrammetric acquisition from satellite images, many different types of sensors and acquisition pipelines exist for obtaining 3D point clouds of urban areas, resulting in PCs with different characteristics. By providing different acquisition parameters to our simulator, our goal was to produce a variety of sub-datasets with heterogeneous qualities, reproducing the real variability of LiDAR sensors or mimicking datasets produced by a photogrammetric pipeline with satellite images (by using a tight scan angle with high noise). We thus generated the following sub-datasets:
- ALS with low resolution, low noise for both dates
- ALS with high resolution, low noise for both dates
- ALS with low resolution, high noise for both dates
- ALS with low resolution, high noise, tight scan angle (mimicking photogrammetric acquisition from satellite images) for both dates
- Multi-sensor data, with low resolution, high noise at date 1, and high resolution, low noise at date 2
Notice that sub-datasets 3 and 4 are quite similar, but the latter provides fewer visible facades, owing to the smaller scanning angle and overlap percentage.
Finally, for the first configuration (ALS, low resolution, low noise), we provide the following three training sets:
- Small training set: 1 simulation
- Normal training set: 10 simulations
- Large training set: 50 simulations
More details about the acquisition configuration are provided in the documentation file and in the publication Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets, de Gélis et al. (2021).
Technical details
All PCs are provided in PLY format. Each train, val and test folder contains sub-folders, each holding a pair of PCs: pointCloud0.ply and pointCloud1.ply for the first and second dates, respectively.
Each PLY file contains the X, Y, Z coordinates of each point and its label:
- 0 for unchanged points
- 1 for points on a new building
- 2 for points on a demolition
The label is given in a scalar field named label_ch. Notice that the first PC (pointCloud0.ply) also has a label field, although it is set to 0 for every point, since changes are defined with respect to the previous date.
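A minimal loading sketch, assuming the plyfile Python package and the standard lowercase x, y, z vertex property names (any PLY reader exposing the coordinates and the label_ch field works equally well):

import numpy as np
from plyfile import PlyData  # pip install plyfile

def load_cloud(path):
    # Read one Urb3DCD point cloud: (N, 3) coordinates plus per-point change labels.
    vertex = PlyData.read(path)["vertex"]
    xyz = np.stack([vertex["x"], vertex["y"], vertex["z"]], axis=1)
    labels = np.asarray(vertex["label_ch"])
    return xyz, labels

xyz0, labels0 = load_cloud("pointCloud0.ply")  # date 1: labels are all 0
xyz1, labels1 = load_cloud("pointCloud1.ply")  # date 2: labels encode changes w.r.t. date 1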
Citation
If you use this dataset for your work, please use the following citation:
@article{degelis2021change,
title={Change Detection in Urban Point Clouds: An Experimental Comparison with Simulated 3D Datasets},
author={{de G\'elis}, I. and Lef\`evre, S. and Corpetti, T. },
journal={Remote Sensing},
volume={13},
pages={2629},
year={2021},
publisher={Multidisciplinary Digital Publishing Institute}
}
For more details: https://www.mdpi.com/2072-4292/13/13/2629
For the V2 or classification version:
@article{degelis2023siamese,
title = {Siamese KPConv: 3D multiple change detection from raw point clouds using deep learning},
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
volume = {197},
pages = {274-291},
year = {2023},
issn = {0924-2716},
doi = {https://doi.org/10.1016/j.isprsjprs.2023.02.001},
url = {https://www.sciencedirect.com/science/article/pii/S0924271623000394},
author = {Iris {de~G{\'e}lis} and S{\'e}bastien Lef{\`e}vre and Thomas Corpetti},
keywords = {3D point clouds, Change detection, Deep learning, Siamese network, 3D Kernel Point Convolution}}
For more details: https://www.sciencedirect.com/science/article/pii/S0924271623000394
Dataset Files
- Urb3DCD dataset version 1 IEEE_Dataset_V1.zip (1.02 GB)
- Urb3DCD dataset version 2 IEEE_Dataset_V2_Lid05_MS.zip (452.08 MB)
- Urb3DCD Cls Urb3DCD_cls.zip (308.91 MB)