Lidar
An understanding of local walking context plays an important role in the analysis of human gait and in the high-level control systems of robotic prostheses. Laboratory analysis alone constrains researchers' ability to properly assess clinical gait in patients and limits how well robotic prostheses function across contexts, so study in diverse walking environments is warranted. Ground truth for the walking terrain is traditionally identified from simple visual data.
Ensuring the safe and reliable operation of autonomous vehicles in adverse weather remains a significant challenge.
To address this, we have developed a comprehensive dataset composed of sensor data acquired on a real test track and reproduced in the laboratory for the same test scenarios.
The dataset includes camera, radar, LiDAR, inertial measurement unit (IMU), and GPS data recorded under adverse conditions (rain, night-time, and snow).
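To make the multi-sensor structure of such recordings concrete, the following minimal Python sketch models one time-synchronized sample; the field names, shapes, and labels are illustrative assumptions, not the dataset's published schema.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class WeatherTestFrame:
    """One time-synchronized sample from an adverse-weather recording.

    All field names and array shapes are assumptions for illustration,
    not the actual layout of the released dataset.
    """
    timestamp: float                  # seconds since the start of the run
    image: np.ndarray                 # camera frame, e.g. (H, W, 3) uint8
    lidar_points: np.ndarray          # point cloud, (N, 4): x, y, z, intensity
    radar_targets: np.ndarray         # detections, (M, 4): range, azimuth, doppler, rcs
    imu: np.ndarray                   # (6,): 3-axis acceleration + 3-axis angular rate
    gps: tuple                        # (latitude, longitude, altitude)
    condition: str                    # "rain", "night", or "snow"
    source: str                       # "track" recording or "laboratory" reproduction
```

A loader for the released files would populate one such frame per timestamp, keeping the track recording and its laboratory reproduction distinguishable through the `source` field.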
Evolving from the well-known ray-tracing dataset DeepMIMO, the DeepVerse 6G dataset additionally provides multi-modal sensing data generated by various emulators, covering wireless, radar, LiDAR, vision, and position data. Through a parametric generator, users can customize the DeepVerse dataset for a wide range of communication and sensing applications.
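The parametric workflow can be illustrated with a hypothetical configuration; the keys and values below are assumptions chosen for illustration and are not DeepVerse's actual parameter names or API, so the official generator documentation should be consulted for the real interface.

```python
# Hypothetical parameter set showing how a parametric generator lets the user
# select scenes and sensing modalities. These keys are assumptions, not the
# actual DeepVerse API.
deepverse_config = {
    "scenario": "example_town",         # placeholder scenario identifier
    "scenes": list(range(1, 101)),      # time snapshots to generate
    "modalities": ["comm", "radar", "lidar", "camera", "position"],
    "max_paths": 10,                    # assumed limit on ray-tracing paths per user
    "bs_antennas": (8, 8),              # assumed base-station array dimensions
}

# In practice, a configuration like this would be passed to the real DeepVerse
# generator to produce aligned multi-modal samples for the selected scenes.
```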
This dataset contains aerial images acquired with a medium-format digital camera and point clouds collected with an airborne laser scanning (ALS) unit, as well as ground control points and direct georeferencing data. The flights were performed in 2014 over an urban area in Presidente Prudente, State of São Paulo, Brazil, at different flight heights. They covered several features of interest for research, including buildings of varying sizes and roof materials, roads, and vegetation.
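A quick way to inspect one ALS tile is sketched below, assuming the point clouds are distributed as LAS/LAZ files; the file name is a placeholder and the choice of the laspy library is an assumption, not a requirement of the dataset.

```python
# Minimal sketch for inspecting one ALS point-cloud tile, assuming LAS format.
import laspy
import numpy as np

las = laspy.read("als_tile_001.las")        # placeholder file name
xyz = np.vstack([las.x, las.y, las.z]).T    # N x 3 array of georeferenced coordinates

print(f"points: {len(las.points)}")
print(f"bounding box: {xyz.min(axis=0)} to {xyz.max(axis=0)}")
```

The same bounding-box check can be compared against the ground control points and direct georeferencing data to verify that image and point-cloud coverage overlap for a given flight height.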