Hard dataset and Normal dataset for robotic tactile sensing

Citation Author(s): Gang Yan
Submitted by: Gang Yan
Last updated: Tue, 05/17/2022 - 22:18
DOI: 10.21227/94km-0873

Abstract 

Two datasets collected with USkin tactile sensors for detecting grasping stability and slip during object lifting.

Instructions: 

For the Normal dataset:

Each grasping trial contains three folders:

1. PVT folder: contains the PVT points of the robot arm, together with the grasping parameters and timestamps of the gripper.

Format of PVT.csv: please check the first line (header) of each PVT file; a header-peek sketch follows this list.

Recorded time range: for each new grasp-and-lift trial, all PVT points are recorded from the start of the grasp until the end of the lift.

2. tactile folder: contains the recorded tactile files.

Format of tactile.csv: the first row contains timestamps; each column is the sensor reading at one timestep.

Format of sensor readings: sensor ID + axis, e.g. 005x is the x-axis reading of taxel 005. Each sensor patch has 24 taxels * 3 axes = 72 readings; the first 72 readings come from sensor patch one and the next 72 from sensor patch two (a parsing sketch follows this list).

Recorded time range: for each new grasp-and-lift trial, all tactile readings are recorded from the start of the grasp until the end of the lift.

3. image folder: contains images of each grasping trial.

We recorded videos as .bag files and cropped 224*224*3 images from the relevant frames. The original .bag files are too large and are not included in this folder.

Format: 224*224*3 .png images. The name of each .png consists of "crop_color_" + timestamp (a file-listing sketch follows this list).

Recorded time range: from just after the object is grasped until it has been lifted to a height of about 4 cm.
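
Since each PVT.csv documents its own column layout in its first line, the quickest way to inspect a file is to read that header. The Python snippet below is only a minimal sketch and assumes nothing about the columns beyond what item 1 states.

    import csv

    def peek_pvt_header(path):
        """Print and return the first line of a PVT.csv file, which lists its columns."""
        with open(path, newline="") as f:
            header = next(csv.reader(f))
        print(header)
        return header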
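
To make the tactile layout in item 2 concrete, here is a minimal loading sketch in Python. It assumes each tactile.csv has the first row holding one timestamp per timestep and the following 144 rows holding the readings (72 per patch), with no extra label column; if your files carry a leading label column such as 005x, drop it before converting. The final (24 taxels, 3 axes) reshape assumes the 72 readings are ordered taxel by taxel with x, y, z grouped together, which is an assumption about the label ordering, not something stated above.

    import pandas as pd

    TAXELS_PER_PATCH = 24
    AXES = 3
    READINGS_PER_PATCH = TAXELS_PER_PATCH * AXES  # 72

    def load_tactile(path):
        """Load one tactile.csv file under the layout assumptions noted above."""
        raw = pd.read_csv(path, header=None)
        timestamps = raw.iloc[0].to_numpy()               # first row: one timestamp per timestep
        readings = raw.iloc[1:].astype(float).to_numpy()  # remaining rows: shape (144, n_timesteps)
        patch1 = readings[:READINGS_PER_PATCH]                        # rows 0-71: sensor patch one
        patch2 = readings[READINGS_PER_PATCH:2 * READINGS_PER_PATCH]  # rows 72-143: sensor patch two
        # Transpose to (n_timesteps, 72), then reshape to (n_timesteps, 24, 3).
        patch1 = patch1.T.reshape(-1, TAXELS_PER_PATCH, AXES)
        patch2 = patch2.T.reshape(-1, TAXELS_PER_PATCH, AXES)
        return timestamps, patch1, patch2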
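
The cropped frames in each image folder can be collected from their filenames. The sketch below assumes files named crop_color_<timestamp>.png, as described in item 3; the timestamp format is not specified, so it is kept as a string, and how these timestamps align with the ones in tactile.csv or PVT.csv is not specified here.

    import glob
    import os

    def list_grasp_images(image_dir):
        """Return (timestamp_string, path) pairs for every cropped frame of one grasp."""
        frames = []
        # sorted() gives lexicographic order; sort numerically if the timestamps require it
        for path in sorted(glob.glob(os.path.join(image_dir, "crop_color_*.png"))):
            stem = os.path.splitext(os.path.basename(path))[0]
            timestamp = stem[len("crop_color_"):]  # text after the "crop_color_" prefix
            frames.append((timestamp, path))
        return frames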

 

For details of the dataset, or if you would like to use it in your research, feel free to contact yangang@fuji.waseda.jp / yg19930918@gmail.com in English, Chinese, or Japanese!

 
