Multi-Mice PartsTrack
- Submitted by: JIANG ZHEHENG
- Last updated: Mon, 04/06/2020 - 12:58
- DOI: 10.21227/2y99-k557
Abstract
The study of mouse social behaviours has been increasingly undertaken in neuroscience research. However, automated quantification of mouse behaviours from videos of interacting mice remains a challenging problem, in which object tracking plays a key role in locating mice in their living spaces. Artificial markers are often applied for tracking multiple mice, but they are intrusive and consequently interfere with the movements of mice in a dynamic environment. In this paper, we propose a novel method to continuously track several mice and their individual parts without requiring any specific tagging. First, we propose an efficient and robust deep-learning-based mouse part detection scheme to generate part candidates. Subsequently, we propose a novel Bayesian-inference Integer Linear Programming model that jointly assigns the part candidates to individual targets under the necessary geometric constraints whilst establishing pairwise associations between the detected parts. No publicly available dataset in the research community provides a quantitative test-bed for the part detection and tracking of multiple mice, so we here introduce a new and challenging Multi-Mice PartsTrack dataset composed of complex behaviours and actions. Finally, we evaluate our approach against several baselines on the new dataset, where the results show that our method outperforms the other state-of-the-art approaches in terms of accuracy.
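The joint assignment step described above can be posed as an integer linear programme. The following is a minimal, illustrative sketch in Python using PuLP, not the authors' actual model: it assigns hypothetical head and tail-base candidates (all names and scores invented for illustration) to two mice so that each mouse receives exactly one of each part and no candidate is reused. The Bayesian scoring and the geometric constraints of the paper are omitted.

```python
import pulp

# Hypothetical detection confidences per part candidate; illustrative only.
head_scores = {"h0": 0.9, "h1": 0.8, "h2": 0.4}
tail_scores = {"t0": 0.85, "t1": 0.7, "t2": 0.5}
mice = ["mouse_a", "mouse_b"]

prob = pulp.LpProblem("part_assignment", pulp.LpMaximize)

# Binary variable x[(c, m)] = 1 if head candidate c is assigned to mouse m;
# y for tail-base candidates.
x = pulp.LpVariable.dicts("head", [(c, m) for c in head_scores for m in mice], cat="Binary")
y = pulp.LpVariable.dicts("tail", [(c, m) for c in tail_scores for m in mice], cat="Binary")

# Objective: total confidence of the selected assignment.
prob += (
    pulp.lpSum(head_scores[c] * x[(c, m)] for c in head_scores for m in mice)
    + pulp.lpSum(tail_scores[c] * y[(c, m)] for c in tail_scores for m in mice)
)

# Each mouse receives exactly one head and one tail-base candidate.
for m in mice:
    prob += pulp.lpSum(x[(c, m)] for c in head_scores) == 1
    prob += pulp.lpSum(y[(c, m)] for c in tail_scores) == 1

# Each candidate is used by at most one mouse.
for c in head_scores:
    prob += pulp.lpSum(x[(c, m)] for m in mice) <= 1
for c in tail_scores:
    prob += pulp.lpSum(y[(c, m)] for m in mice) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (c, m), var in x.items():
    if var.value() is not None and var.value() > 0.5:
        print(f"{m}: head {c}")
```

In the full model, the objective would instead reflect the Bayesian inference over assignments, and further constraints would encode geometry, for example a plausible head-to-tail-base distance within a single mouse.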
Instructions:
In this paper, we introduce our new dataset for multi-mice part tracking in videos. The dataset was collected in collaboration with biologists at Queen's University Belfast for a study of the neurophysiological mechanisms involved in Parkinson's disease. In our dataset, two or three mice interact freely in a 50 × 110 × 30 cm home cage and are recorded from the top view using a Sony Action camera (HDR-AS15) at a frame rate of 30 fps and a 640 × 480 VGA video resolution. All experiments were conducted in an environment-controlled room with a constant temperature (27 °C) and constant lighting (a 40 W fluorescent lamp). The dataset provides detailed annotations for the multiple mice in each video, as shown in Supplementary G.

The mice used throughout this study were housed under constant climatic conditions with free access to food and water. All experimental procedures were performed in accordance with the Guidance on the Operation of the Animals (Scientific Procedures) Act, 1986 (UK) and approved by the Queen's University Belfast Animal Welfare and Ethical Review Body.

Our database covers a wide range of activities such as contacting, following and crossing, and contains a large number of mouse body and part occlusions. After proper training, six professionals were invited to annotate the mouse heads and tail bases and to localise each mouse body in the videos. We assign a unique identity to every mouse part appearing in the images. If a mouse part is in the field of view but invisible due to occlusion, it is marked 'occluded'. Mouse parts outside the image border are not annotated.

In total, our dataset yields 5 annotated videos of two mice and 5 annotated videos of three mice, each lasting 3 minutes. To evaluate part tracking accuracy, we introduce new evaluation metrics for the proposed dataset and also report results for several baseline methods. We split the dataset into training and testing sets of equal duration and train our network via transfer learning with pre-trained models.
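The release itself defines the actual annotation file format. Purely as a hypothetical illustration of the information each label carries according to the description above (frame, mouse identity, part type, pixel location, occlusion flag), one might model a record as follows; every field name here is an assumption, not the dataset's schema.

```python
from dataclasses import dataclass
from enum import Enum

class PartType(Enum):
    # The annotated parts named in the dataset description.
    HEAD = "head"
    TAIL_BASE = "tail_base"

@dataclass
class PartAnnotation:
    """One labelled mouse part in one video frame (hypothetical schema)."""
    frame: int       # frame index in the 30 fps video
    mouse_id: int    # unique identity assigned to each mouse
    part: PartType
    x: float         # pixel coordinates in the 640 x 480 frame
    y: float
    occluded: bool   # in the field of view but hidden, e.g. by another mouse

def visible_parts(annotations: list[PartAnnotation]) -> list[PartAnnotation]:
    # Parts outside the image border are not annotated at all, so only the
    # 'occluded' flag needs filtering when selecting visible parts.
    return [a for a in annotations if not a.occluded]
```

Separating the 'occluded' flag from the coordinates mirrors the annotation protocol: an occluded part keeps its identity across frames even while it cannot be localised visually.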