Human action recognition

With multiple large open-source datasets available, action recognition has developed rapidly. However, we noticed a lack of annotated data for civil aircraft pilots, whose action distribution can differ greatly from everyday casual activities. After discussions with experienced pilots and domain experts, and a close look at standard operating procedures, we present the Airline-Pilot-Action (APA) benchmark, containing 5090 RGB and depth images together with corresponding flight computer data.


This dataset consists of inertial, force, color, and LiDAR data collected from a novel sensor system. The system comprises three Inertial Measurement Units (IMUs) positioned on the waist and atop each foot, a color sensor on each outer foot, a LiDAR on the back of each shank, and a custom Force-Sensing Resistor (FSR) insole featuring 13 FSRs in each shoe. Twenty participants wore this sensor system while performing 38 combinations of 11 activities on 9 different terrains, totaling over 7.8 hours of data.
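The sensor layout above implies a fixed per-timestep channel structure. As a minimal sketch, the record below mirrors that layout; all field names and the assumption of 6 channels per IMU (3-axis accelerometer plus 3-axis gyroscope) are illustrative, not the dataset's actual schema.

```python
from dataclasses import dataclass
from typing import List

# Per-insole FSR count stated in the dataset description.
FSRS_PER_INSOLE = 13

@dataclass
class SensorSample:
    """Hypothetical per-timestep record for the described sensor system."""
    imu_waist: List[float]       # assumed 6 channels: 3-axis accel + gyro
    imu_left_foot: List[float]
    imu_right_foot: List[float]
    color_left: float            # color sensor, left outer foot
    color_right: float           # color sensor, right outer foot
    lidar_left: float            # range reading, back of left shank
    lidar_right: float           # range reading, back of right shank
    fsr_left: List[float]        # 13 FSR values, left insole
    fsr_right: List[float]       # 13 FSR values, right insole

def total_fsr_channels(sample: SensorSample) -> int:
    """Count force channels across both insoles (13 per shoe -> 26)."""
    return len(sample.fsr_left) + len(sample.fsr_right)

sample = SensorSample(
    imu_waist=[0.0] * 6,
    imu_left_foot=[0.0] * 6,
    imu_right_foot=[0.0] * 6,
    color_left=0.0,
    color_right=0.0,
    lidar_left=0.0,
    lidar_right=0.0,
    fsr_left=[0.0] * FSRS_PER_INSOLE,
    fsr_right=[0.0] * FSRS_PER_INSOLE,
)
print(total_fsr_channels(sample))  # 26
```

A flat record like this makes it easy to validate that each row of the released files carries the expected number of channels before training.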


This dataset was collected in our lab using a Kinect sensor and emphasizes three points: (1) a larger number of human activities; (2) each subject performed all actions continuously, with no breaks or pauses, so the start and end body positions for the same action differ; (3) each subject performed the same actions four times while imaged from four different views: front, left side, right side, and top.
