FIR Human

Citation Author(s):
Jesus Gutierrez-Gallego (UNED - Universidad Nacional de Educación a Distancia)
Victor Rodriguez-Ontiveros (EduQTech - Universidad de Zaragoza)
Sergio Martin (UNED - Universidad Nacional de Educación a Distancia)
Submitted by:
Sergio Martin
Last updated:
Tue, 09/12/2023 - 13:30
DOI:
10.21227/vey5-hd20

Abstract 

This dataset contains video clips of five volunteers performing daily life activities. Each video clip is recorded with a Far InfraRed (FIR) camera and comes with an associated file containing the three-dimensional and two-dimensional coordinates of the main body joints in each frame of the clip. This makes it possible to train human pose estimation networks on FIR imagery.
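As a concrete illustration of how a clip and its associated annotation file can be consumed together, the sketch below pairs each FIR frame with the per-frame joint coordinates. The file names, the NumPy .npz layout, and the array keys are assumptions made for illustration only; the actual file format is described in the attached documentation.

import cv2          # OpenCV, used here to read the FIR video clip
import numpy as np

# Hypothetical file names; the real naming scheme is given in the dataset documentation.
CLIP_PATH = "volunteer01_walking.avi"
ANNOTATION_PATH = "volunteer01_walking.npz"   # assumed: arrays "joints_3d" and "joints_2d"

annotations = np.load(ANNOTATION_PATH)
joints_3d = annotations["joints_3d"]   # assumed shape: (num_frames, 19, 3)
joints_2d = annotations["joints_2d"]   # assumed shape: (num_frames, 19, 2)

cap = cv2.VideoCapture(CLIP_PATH)
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok or frame_idx >= len(joints_2d):
        break
    # Each FIR frame is paired with the 19 joint annotations for that frame.
    pose_2d = joints_2d[frame_idx]
    pose_3d = joints_3d[frame_idx]
    frame_idx += 1
cap.release()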

It contains over 250,000 2D and 3D human poses and their corresponding FIR images. The dataset was recorded by 5 volunteers (4 males, 1 female) engaged in 27 different action classes, including falls of different kinds.

The FIR video clips are recorded at 23.98 frames per second with a resolution of 480 x 640 pixels. The annotations associated with them include accurate 3D positions of the 19 main body joints, provided by a high-speed motion capture system, together with their projections onto the image plane.
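The dataset already provides the 2D projections, but the relationship between the motion-capture 3D coordinates and the image-plane annotations can be illustrated with a standard pinhole projection. The following sketch is illustrative only: the focal lengths and principal point are placeholder values rather than the actual camera calibration, and the 19 x 3 joint array layout is assumed from the description above.

import numpy as np

# Placeholder intrinsics for a 640 x 480 image; the true calibration is not given here.
FX, FY = 600.0, 600.0          # focal lengths in pixels (assumed)
CX, CY = 320.0, 240.0          # principal point (assumed)

def project_joints(joints_3d):
    """Project (19, 3) camera-frame joint coordinates onto the image plane.

    Returns an array of shape (19, 2) with pixel coordinates (u, v).
    """
    X, Y, Z = joints_3d[:, 0], joints_3d[:, 1], joints_3d[:, 2]
    u = FX * X / Z + CX
    v = FY * Y / Z + CY
    return np.stack([u, v], axis=1)

# Example with a dummy pose: 19 joints at random positions in front of the camera.
pose_3d = np.random.uniform(0.5, 3.0, size=(19, 3))
pose_2d = project_joints(pose_3d)
print(pose_2d.shape)  # (19, 2)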

Instructions: 

The dataset contains 27 action classes in total. 26 of them are daily life activities, while the other one includes different types of falls. The different actions are repeated by the volunteers in four different positions, so frontal, rear, and side views of the same actions are recorded.

The dataset is divided into three blocks. The first block, which includes the motions of four volunteers, is used for system training; in this group, all volunteers are recorded executing 13 daily life activities. These actions include:

1. Giving directions.

2. Discussing.

3. Eating.

4. Taking photos.

5. Exercising on the ground.

6. Running in place.

7. Walking.

8. Sitting and standing up.

9. Coughing.

10. Exercising.

11. Playing basketball.

12. Picking up objects.

13. Limping.

The second block includes a single person who executes a different set of actions for validation purposes. These activities include:

1. Brushing teeth.

2. Encouraging your team.

3. Toasting.

4. Taking a selfie.

5. Crouching for meditation.

6. Walking a dog.

7. Throwing a stone.

8. Talking on the phone.

9. Stretching yourself.

10. Hopping.

11. Kicking a ball.

12. Tying shoelaces.

13. Rotating your trunk.

Finally, the third block includes four volunteers who are recorded from different perspectives while falling forward, falling backwards, and falling sideways. The falls start from static or dynamic situations, and a number of them are slow falls, a common type of fall among the elderly.
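The three-block structure maps naturally onto training, validation, and test splits. The sketch below shows one way such an organization could look; the folder names, the .avi extension, and the use of the fall clips as a held-out test set are assumptions, so the accompanying documentation should be consulted for the actual layout.

from pathlib import Path

# Hypothetical directory layout; the actual folder names are described in
# "FIR-Human additional information.pdf".
DATASET_ROOT = Path("FIR-Human")

SPLITS = {
    "train": DATASET_ROOT / "block1",   # 4 volunteers, 13 daily life activities
    "val":   DATASET_ROOT / "block2",   # 1 volunteer, 13 different activities
    "test":  DATASET_ROOT / "block3",   # 4 volunteers, falls from several perspectives
}

def list_clips(split):
    """Return all video clips under the given split (assumed .avi extension)."""
    return sorted(SPLITS[split].rglob("*.avi"))

for name in SPLITS:
    print(name, len(list_clips(name)))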


Documentation

Attachment: FIR-Human additional information.pdf (572.48 KB)