Hard dataset and Normal dataset for robotic tactile sensing
Two datasets collected with USkin tactile sensors for detecting grasping stability and slip while lifting objects.
For the Normal dataset:
Each grasp contains 3 folders:
1.PVT folder: contains the PVT points of the robot arm and the grasping parameters/timestamps of the gripper.
Format of PVT.csv: see the first line of each PVT file.
Record time range: from the start of each new grasp-and-lift until the lift finishes; all PVT points in that interval are recorded.
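Since the column layout is documented in the first line of each PVT file, a loader can read that header and key the remaining rows by it. A minimal sketch, assuming a standard comma-separated layout (the exact columns vary per file, so they are discovered at load time):

```python
import csv

def read_pvt(path):
    """Load a PVT.csv file from the PVT folder.

    The dataset states that the format is given in the first line of
    each PVT file, so we treat that line as the header and return the
    remaining rows as dicts keyed by it. The assumption here is that
    the file is plain comma-separated text.
    """
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first line documents the column format
        rows = [dict(zip(header, row)) for row in reader]
    return header, rows
```

Printing `header` for one file per grasp is enough to confirm the layout before batch processing.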
2.tactile folder: contains the recorded tactile files.
Format of tactile.csv: the first row is the timestamp; each column is the sensor reading of one timestep.
Format of sensor readings: sensor ID + axis, e.g. 005x is taxel 005 read along the x axis. Each sensor patch has 24 taxels*3 axes = 72 readings; the first 72 readings come from sensor patch 1 and the next 72 from sensor patch 2.
Record time range: from the start of each new grasp-and-lift until the lift finishes; all tactile readings in that interval are recorded.
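The 144-value layout described above (patch 1 in readings 0-71, patch 2 in readings 72-143) can be unpacked per timestep. A minimal sketch; the (x, y, z) grouping of consecutive readings within a patch is an assumption inferred from the ID+axis labels (005x, 005y, 005z) and should be checked against the actual column labels:

```python
def split_patches(frame):
    """Split one 144-value tactile frame into the two sensor patches.

    Per the dataset description, each patch has 24 taxels * 3 axes =
    72 readings: readings 0-71 belong to patch 1 and 72-143 to patch 2.
    Returns two lists of 24 (x, y, z) tuples, one tuple per taxel.
    The per-taxel (x, y, z) grouping is an assumption; verify it
    against the sensor-ID column labels in tactile.csv.
    """
    if len(frame) != 144:
        raise ValueError("expected 2 patches * 72 readings = 144 values")

    def to_taxels(vals):
        # group consecutive triples into one (x, y, z) tuple per taxel
        return [tuple(vals[i:i + 3]) for i in range(0, 72, 3)]

    return to_taxels(frame[:72]), to_taxels(frame[72:])
```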
3.image folder: contains the images of each grasp.
We recorded videos as .bag files and cropped 224*224*3 images from the necessary frames. The original .bag files are too large and are not included in this folder.
Format: 224*224*3 .png images. The name of each .png consists of "crop_color_" + timestamp.
Record time range: from after the object is grasped until it is lifted to a height of about 4 cm.
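Because the image file names embed the timestamp ("crop_color_" + timestamp + ".png"), the frames can be aligned with the tactile and PVT streams by nearest timestamp. A minimal sketch, assuming the embedded timestamp is a float-parsable string (the exact timestamp format inside the name is not specified, so adjust the parsing if needed):

```python
import os
import re

def index_images(image_dir):
    """Map timestamps to image paths in one grasp's image folder.

    Assumes file names of the form 'crop_color_<timestamp>.png' with a
    float-parsable timestamp, per the naming convention above.
    """
    pattern = re.compile(r"crop_color_(.+)\.png$")
    index = {}
    for name in os.listdir(image_dir):
        m = pattern.match(name)
        if m:
            index[float(m.group(1))] = os.path.join(image_dir, name)
    return index

def nearest_image(index, t):
    """Return the image path whose timestamp is closest to time t."""
    ts = min(index, key=lambda k: abs(k - t))
    return index[ts]
```

This nearest-timestamp lookup lets each tactile timestep be paired with the closest camera frame when the two streams are not sampled at the same rate.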