Ti3D-contact, a high-resolution and whole-body dataset of hand-object contact area based on 3D scanning method

Citation Author(s):
Zelin Chen, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Chenhao Cao, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Yiming Ouyang, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Hanlu Chen, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Hu Jin, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Shiwu Zhang, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China
Submitted by:
Zelin C
Last updated:
Mon, 07/15/2024 - 21:33
DOI:
10.21227/2smv-hy76
Data Format:
License:

Abstract 

Hand contact data, which reflects the intricate behaviours of human hands during object operation, holds significant potential for analysing hand operation patterns, guiding the design of hand-related sensors and robots, and predicting object properties. However, these applications are hindered by the low resolution and incomplete coverage of existing hand contact data. Leveraging a non-contact, high-precision 3D scanning method for surface capture, this work constructs a high-resolution and whole-body hand contact dataset named Ti3D-contact. The dataset, with an average resolution of 0.72 mm, contains 1872 sets of texture images and 3D models. During each operation, the contact area is painted onto gloves over the whole hand, and the painted gloves are scanned with a 3D scanner to obtain the high-resolution original hand contact data. Reliability validation of Ti3D-contact is conducted, and hand movement classification with 95% precision is achieved using the acquired dataset. Its high resolution and whole-body coverage give the dataset promising applications in hand posture recognition and hand movement prediction.

Instructions: 

This work compiles a high-resolution and whole-body human hand contact dataset named Ti3D-contact. First, participants wear cotton textile gloves to grasp and manipulate objects of different types and sizes; high-adhesion paint sprayed onto the objects marks the whole-body hand contact areas on the gloves. The painted gloves are then scanned with a 3D scanner to capture the original hand contact data in the form of texture images and 3D models. The painted areas are extracted from the 3D models as point clouds, yielding the processed hand contact data that records the contact areas between hands and objects. A coordinate conversion method is then employed to unify the coordinate systems of the processed hand contact data, which improves the consistency of the data and benefits the analysis of hand operation patterns. Furthermore, the resolution of the dataset is obtained by computing the Euclidean distance between adjacent points in the hand contact data, giving a high average resolution of 0.72 mm. The processed hand contact data is then voxelized to calculate its overlap; the similarity among 3D models of repeated actions reaches 90%, verifying the reliability of the hand contact data. Finally, a classifier based on a convolutional neural network (CNN) classifies hand movements with 95% precision using the acquired hand contact data, showing a potential application of the dataset in real-world task scenarios.

Ti3D-contact comprises 1872 interactions performed by 12 healthy adults with 10 standardized objects, recording the entire contact areas between hands and objects at a high resolution of 0.72 mm. A diverse set of 52 prescribed grasping and manipulation actions was completed, including the 33 most commonly used conventional grasping gestures summarized by the Feix team and the 9 basic manipulation gestures proposed by the Elliott team. Additionally, we introduce 10 new common manipulation gestures based on the classification method pioneered by the Bullock team, enhancing the realism of task execution scenarios in daily life. Illustrative code sketches of the coordinate unification, resolution estimation, overlap calculation, and movement classification steps are given below.
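
The description above states that a coordinate conversion method unifies the coordinates of the processed contact data but does not specify the algorithm. A minimal sketch, assuming rigid ICP registration of each contact point cloud to a shared glove template (the template file, distance threshold, and the choice of ICP itself are assumptions, not the authors' published method):

    # Minimal sketch: aligning a processed contact point cloud to a common template
    # frame with rigid ICP (Open3D). The template path, max_corr_dist, and the use
    # of ICP are assumptions for illustration only.
    import open3d as o3d

    def align_to_template(source_path: str, template_path: str,
                          max_corr_dist: float = 5.0) -> o3d.geometry.PointCloud:
        source = o3d.io.read_point_cloud(source_path)      # painted-area point cloud
        template = o3d.io.read_point_cloud(template_path)  # reference glove model
        result = o3d.pipelines.registration.registration_icp(
            source, template, max_corr_dist,
            estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
        )
        return source.transform(result.transformation)     # cloud in the unified frame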
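
The 0.72 mm average resolution is described as the Euclidean distance between adjacent points. A minimal sketch of that computation, assuming the contact data is loaded as an (N, 3) NumPy array in millimetres (the file name and format are hypothetical):

    # Minimal sketch: average point spacing (resolution) as the mean nearest-
    # neighbour distance; the input file name and units are assumptions.
    import numpy as np
    from scipy.spatial import cKDTree

    def average_resolution(points: np.ndarray) -> float:
        """Mean Euclidean distance from each point to its nearest neighbour."""
        tree = cKDTree(points)
        distances, _ = tree.query(points, k=2)  # k=2: the point itself plus its nearest neighbour
        return float(distances[:, 1].mean())

    cloud = np.loadtxt("contact_points.xyz")    # hypothetical file of x y z rows, in mm
    print(f"average resolution: {average_resolution(cloud):.2f} mm")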
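
Reliability is verified by voxelizing repeated trials of the same action and measuring their overlap. One way to express such an overlap, sketched here as the intersection-over-union of occupied voxels for two repetitions already expressed in the unified frame (the 2 mm voxel size is an assumed parameter, and IoU is one possible overlap measure, not necessarily the authors' exact metric):

    # Minimal sketch: voxel overlap between two contact point clouds in the same
    # unified coordinate frame; voxel_size and the IoU formulation are assumptions.
    import numpy as np

    def occupied_voxels(points: np.ndarray, voxel_size: float = 2.0) -> set:
        """Set of integer voxel indices covered by the point cloud."""
        return set(map(tuple, np.floor(points / voxel_size).astype(int)))

    def overlap(points_a: np.ndarray, points_b: np.ndarray, voxel_size: float = 2.0) -> float:
        """Intersection-over-union of the occupied voxels of two repetitions."""
        va = occupied_voxels(points_a, voxel_size)
        vb = occupied_voxels(points_b, voxel_size)
        return len(va & vb) / len(va | vb)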
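
The description mentions a CNN classifier reaching 95% precision but does not give its architecture. A minimal PyTorch sketch of a small CNN over single-channel contact maps with the dataset's 52 action classes (the 128x128 input size and the layer layout are assumptions, not the authors' model):

    # Minimal sketch: a small CNN for hand-movement classification; input size and
    # layer sizes are assumptions made for illustration only.
    import torch
    import torch.nn as nn

    class ContactCNN(nn.Module):
        def __init__(self, num_classes: int = 52):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    model = ContactCNN()
    logits = model(torch.randn(4, 1, 128, 128))  # batch of 4 single-channel contact maps
    print(logits.shape)                           # torch.Size([4, 52])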