Human Activity Recognition (HAR) is the task of inferring human activities from data captured by sensors and/or video devices. Nowadays, simple and automatic HAR methods based on sensors and Artificial Intelligence platforms can be implemented with little effort.

In this challenge, participants are required to recognize nurses' daily care activities from accelerometer data collected with a smartphone, one of the cheapest and easiest sensing setups to deploy in real life.
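A common first step with smartphone accelerometer data is to segment the signal into fixed-size windows and compute simple statistical features per window before classification. The sketch below illustrates this generic HAR preprocessing idea; the window and step sizes are illustrative assumptions, not values prescribed by the challenge.

```python
import numpy as np

def window_features(signal, window=128, step=64):
    """Split a 1-D accelerometer axis into overlapping windows
    and compute simple per-window statistics (mean, std, min, max)."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        feats.append([seg.mean(), seg.std(), seg.min(), seg.max()])
    return np.array(feats)

# Demonstrate on a synthetic signal of 512 samples
x = np.sin(np.linspace(0, 20, 512))
f = window_features(x)
print(f.shape)  # (7, 4): 7 overlapping windows, 4 features each
```

In practice the same windowing would be applied to each accelerometer axis, and the resulting feature vectors fed to any standard classifier.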

Last Updated On: Sat, 05/08/2021 - 17:25
Citation Author(s): Sayeda Shamma Alia, Kohei Adachi, Paula Lago, Le Nhat Tan, Haru Kaneko, Sozo Inoue

The Badminton Activity Recognition (BAR) Dataset was collected for 12 commonly played badminton strokes. Beyond the strokes themselves, the dataset also aims to capture the associated leg movements.

Instructions: 

Please refer to the README file for detailed instructions.


Nurse Care Activity Recognition

Instructions: 

This dataset consists of two folders: training and testing.

The training folder contains data collected in the lab and data from two users collected in the nursing home. The data are separated into folders, and there is one labels file for each.

 

The testing folder contains nursing-home data only, from the same users as the training folder but recorded on different days.
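The split described above can be traversed programmatically once the data are downloaded. The sketch below is hypothetical: the sub-folder names and the `<folder>_labels.csv` naming convention are assumptions for illustration, and the README should be consulted for the actual layout.

```python
import tempfile
from pathlib import Path

def list_split(root, split):
    """Yield (data_folder_name, has_labels) pairs for one split.
    Assumes each data sub-folder has a sibling labels file named
    '<folder>_labels.csv' -- an illustrative convention only."""
    split_dir = Path(root) / split
    for sub in sorted(p for p in split_dir.iterdir() if p.is_dir()):
        labels = split_dir / f"{sub.name}_labels.csv"
        yield sub.name, labels.exists()

# Demonstrate on a mock layout mirroring the description above
with tempfile.TemporaryDirectory() as tmp:
    for folder in ("lab", "user1", "user2"):
        (Path(tmp) / "training" / folder).mkdir(parents=True)
        (Path(tmp) / "training" / f"{folder}_labels.csv").touch()
    pairs = list(list_split(tmp, "training"))
    print(pairs)  # [('lab', True), ('user1', True), ('user2', True)]
```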


Along with the increasing use of unmanned aerial vehicles (UAVs), large volumes of aerial videos have been produced. It is unrealistic for humans to screen such large volumes of data and understand their contents. Hence, methodological research on the automatic understanding of UAV videos is of paramount importance.

Instructions: 

=================  Authors  ===========================

Lichao Mou, lichao.mou@dlr.de

Yuansheng Hua, yuansheng.hua@dlr.de

Pu Jin, pu.jin@tum.de

Xiao Xiang Zhu, xiaoxiang.zhu@dlr.de

 

=================  Citation  ===========================

If you use this dataset for your work, please use the following citation:

@article{eradataset,
  title = {{ERA: A dataset and deep learning benchmark for event recognition in aerial videos}},
  author = {Mou, L. and Hua, Y. and Jin, P. and Zhu, X. X.},
  journal = {IEEE Geoscience and Remote Sensing Magazine},
  year = {in press}
}

 

==================  Notice!  ===========================

This dataset is released for academic use ONLY. Please do not redistribute it on other public websites.


This dataset features cooking activities with recipes and gestures labeled. The data were collected using two smartphones (right arm and left hip), two smartwatches (both wrists), and one motion-capture system with 29 markers. Four subjects prepared three recipes (sandwich, fruit salad, cereal) five times each. The subjects followed a script for each recipe but acted as naturally as possible.

Instructions: 

You can use our tutorials to get started: https://abc-research.github.io/cook2020/tutorial/
