Open Access
DARai: Daily Activity Recordings for AI and ML applications
- Submitted by: Ghassan AlRegib
- Last updated: Thu, 09/19/2024 - 10:43
- DOI: 10.21227/ecnr-hy49
Abstract
The DARai dataset is a comprehensive multimodal multi-view collection designed to capture daily activities in diverse indoor environments. This dataset incorporates 20 heterogeneous modalities, including environmental sensors, biomechanical measures, and physiological signals, providing a detailed view of human interactions with their surroundings. The recorded activities cover a wide range, such as office work, household chores, personal care, and leisure activities, all set within realistic contexts. The structure of DARai features a hierarchical annotation scheme that classifies human activities into multiple levels: activities, actions, procedures, and interactions. This taxonomy aims to facilitate a thorough understanding of human behavior by capturing the variations and patterns inherent in everyday activities at different levels of granularity. DARai is intended to support research in fields such as activity recognition, contextual computing, health monitoring, and human-machine collaboration. By providing detailed, contextual, and diverse data, this dataset aims to enhance the understanding of human behavior in naturalistic settings and contribute to the development of smart technology applications.
More information: https://alregib.ece.gatech.edu/software-and-datasets/darai-daily-activit...
Access Codes: https://github.com/olivesgatech/DARai
The dataset is organized with data modalities as the top-level folders; under each modality folder are the activity labels (Level 1).
Some modalities are grouped into super-groups containing at least two data-modality folders. For some groups, such as the bio monitors, all four data modalities are provided under a single group.
Local Dataset Path/
└── Modality/
    └── Activity Label/
        └── View/
            └── data samples
You can download each group of modalities separately and use it in your machine learning and deep learning pipelines.
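As a rough illustration, the sketch below indexes one downloaded modality by walking the Modality/Activity Label/View hierarchy shown above. The root path, modality name, and usage lines are placeholders, not part of the dataset documentation.

from pathlib import Path

def index_modality(dataset_root, modality):
    """Collect (activity, view, file) records for one modality folder.

    Assumes the layout described above:
    <dataset_root>/<modality>/<activity label>/<view>/<data samples>.
    """
    samples = []
    modality_dir = Path(dataset_root) / modality
    for activity_dir in sorted(p for p in modality_dir.iterdir() if p.is_dir()):
        for view_dir in sorted(p for p in activity_dir.iterdir() if p.is_dir()):
            for sample in sorted(p for p in view_dir.iterdir() if p.is_file()):
                samples.append({
                    "activity": activity_dir.name,  # Level 1 activity label
                    "view": view_dir.name,
                    "path": sample,
                })
    return samples

# Example usage (paths are hypothetical):
# records = index_modality("/data/DARai", "IMU")
# print(len(records), "samples indexed")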
Reassemble the split zip archives for the Depth and IR data by concatenating all parts in order, e.g. with the following command on a Linux-based system:
cat part_1 part_2 part_3 > yourfile_reassembled.zip
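On other platforms, a short Python sketch can do the same concatenation before extraction; the part filenames below are placeholders and should be replaced with the split files you actually downloaded.

import shutil

# Concatenate the split archive parts in order (filenames are placeholders).
parts = ["part_1", "part_2", "part_3"]
with open("yourfile_reassembled.zip", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)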
Dataset Files
- IMU.zip (35.41 MB)
- Gaze.zip (77.22 MB)
- BioMonitor.zip (1.21 GB)
- Insole Pressure.zip (5.33 GB)
- Environment.zip (304.32 kB)
- Audio.zip (18.14 MB)
- Forearm EMG.zip (513.78 MB)
- Stationary_IMU.zip (19.21 MB)
- radar.zip (4.08 GB)
- Depth_Confidence.zip (53.13 GB)
- RGB_pt1_compressed.zip (269.58 GB)
- RGB_pt2_compressed.zip (109.44 GB)
- Depth_pt_1.zip (199.00 GB)
- Depth_pt_2.zip (199.00 GB)
- Depth_pt_3.zip (157.42 GB)
- hierarchy_labels.zip (289.93 kB)
- IR.zip (463.25 GB)
- segment_data_to_L2_L3.py (3.45 kB)
Open Access dataset files are accessible to all logged-in users; a free IEEE account is sufficient, and IEEE membership is not required.