USMicroMagSet

Citation Author(s):
Karim Botros (INSA Centre Val de Loire)
Mohammad Alkhatib (SIGMA)
David Folio (INSA Centre Val de Loire)
Antoine Ferreira (INSA Centre Val de Loire)
Submitted by:
Karim Botros
Last updated:
Tue, 05/02/2023 - 07:40
DOI:
10.21227/1dsz-da61

Abstract 

We used Ultrasound (US) B-mode imaging to record single agents and collective swarms of microrobots under controlled experimental conditions.

Positioning errors are caused by inaccuracies in the US-based visual feedback provided by detection and tracking algorithms. Deep learning networks are a promising solution for real-time detection and tracking of microrobots in noisy ultrasound images. However, notable performance gaps remain in state-of-the-art deep learning detection and tracking studies of microrobots, largely because large datasets and benchmarks are not available. In this work, we present the first published B-mode ultrasound microrobot dataset (USMicroMagSet) with precise annotations, containing over 40,000 magnetic microrobot samples. In addition, we analyze detection and tracking performance on the microrobots included in the proposed benchmark dataset using four deep learning detectors and four deep learning trackers.
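
As an illustration of how detection performance on such a benchmark is commonly measured, the sketch below computes the standard Intersection-over-Union (IoU) between a ground-truth and a predicted bounding box. This is a generic metric, not code from the paper, and the example coordinates are purely illustrative.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two boxes given as (x_min, y_min, x_max, y_max)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Illustrative pixel coordinates for a ground-truth and a predicted microrobot box.
gt = (120, 80, 160, 110)
pred = (125, 85, 165, 115)
print(f"IoU = {iou(gt, pred):.3f}")
```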

 

Instructions: 

We have included a Python module named USMMgSt that enables users to view, read, and manipulate the dataset. A minimal loading sketch is given below.
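
As a minimal sketch of how the B-mode frames might be inspected with standard tools, the snippet below reads image files with OpenCV. The directory layout and file pattern are assumptions made for illustration only; the actual organization of the dataset and the real interface are described in the attached README.md and the USMMgSt module.

```python
import glob

import cv2  # OpenCV, used here to read B-mode image frames

# NOTE: "frames/*.png" is a hypothetical layout used only for illustration;
# see README.md and the USMMgSt module for the dataset's actual organization.
frame_paths = sorted(glob.glob("frames/*.png"))

for path in frame_paths[:5]:
    frame = cv2.imread(path, cv2.IMREAD_GRAYSCALE)  # B-mode frames are single-channel
    if frame is None:
        continue
    print(path, frame.shape, frame.dtype)
```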

 

For attribution, please cite this work as:

Botros K., Alkhatib M., Folio D., and Ferreira A., “USMicroMagSet: Using Deep Learning Analysis to Benchmark the Performance of Microrobots in Ultrasound Images,” IEEE Robot. Autom. Lett., pp. 1–8, 2023. doi: [10.1109/LRA.2023.3264746](https://doi.org/10.1109/LRA.2023.3264746)

 

Funding Agency: 
Région Centre-Val de Loire and the BUBBLEBOT project

Documentation

Attachment: README.md (4.14 KB)