We used ultrasound (US) B-mode imaging to record single agents and collective swarms of microrobots under controlled experimental conditions.
Positioning errors stem from inaccuracies in the US-based visual feedback provided by detection and tracking algorithms. Deep learning networks are a promising solution for real-time detection and tracking of microrobots in noisy ultrasound images. However, state-of-the-art deep learning studies on microrobot detection and tracking still show notable performance gaps, largely because large datasets and benchmarks are not available. In this work, we present the first published B-mode ultrasound microrobot dataset ($USmicroMagSet$) with precise annotations, containing over 40,000 magnetic microrobot samples. In addition, we benchmark the performance of four deep learning detectors and four deep learning trackers on the microrobots included in the proposed dataset.
We also provide a Python module named USMMgSt that enables users to view, read, and manipulate the dataset.
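As a minimal sketch of working with the annotations, the snippet below converts a YOLO-style normalized bounding-box line (`class cx cy w h`) into pixel coordinates. This format is an assumption for illustration only; the actual annotation layout of $USmicroMagSet$ may differ, so consult the USMMgSt module for the authoritative readers.

```python
def yolo_to_pixel_bbox(line, img_w, img_h):
    """Convert a YOLO-style annotation line 'class cx cy w h'
    (center/size normalized to [0, 1]) into a pixel bounding box.

    NOTE: this format is a hypothetical example, not necessarily
    the actual USmicroMagSet annotation schema.
    Returns (class_id, (x_min, y_min, x_max, y_max)).
    """
    cls, cx, cy, w, h = line.split()
    cx, cy, w, h = (float(v) for v in (cx, cy, w, h))
    # Shift from center/size to corner coordinates, then scale to pixels.
    x_min = int((cx - w / 2) * img_w)
    y_min = int((cy - h / 2) * img_h)
    x_max = int((cx + w / 2) * img_w)
    y_max = int((cy + h / 2) * img_h)
    return int(cls), (x_min, y_min, x_max, y_max)


# Example: a microrobot centered in a 640x480 B-mode frame.
cls_id, box = yolo_to_pixel_bbox("0 0.5 0.5 0.1 0.2", 640, 480)
print(cls_id, box)  # → 0 (288, 192, 352, 288)
```

Such a helper is handy for overlaying detector predictions on the raw US frames when inspecting tracking results.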
For attribution, please cite this work as:
Botros K., Mohammad A., Folio D., and Ferreira A., “USMicroMagSet: Using Deep Learning Analysis to Benchmark the Performance of Microrobots in Ultrasound Images,” IEEE Robot. Autom. Lett., pp. 1–8, 2023. doi: [10.1109/LRA.2023.3264746](https://doi.org/10.1109/LRA.2023.3264746)