Many remote communities, even in wealthy countries like Canada, have extremely poor access to healthcare. As a result, some spend nearly half their annual healthcare budget on transporting patients to cities for diagnosis and treatment. Robotic telemedicine systems are often too expensive and complex for such small communities, while video conferencing is inefficient and imprecise. We are therefore developing a novel method of "teleoperating" a novice person via mixed reality to carry out medical ultrasound under tightly coupled expert guidance. We have built a fully functioning prototype system and characterized human performance in this mode of control, finding it comparable to that of robotic systems. My research therefore focuses on applying concepts from bilateral robotic teleoperation under time delay to this human-in-the-loop system. We are also exploring AI for autonomous guidance of exams, and are working on computer vision, HCI, high-speed communication over 5G, and instrumentation.
I presented this work to Prof. Nassir Navab's groups at TU Munich (TUM) and Johns Hopkins University (JHU), as well as to Prof. Emad Boctor's lab at JHU. I also presented to a panel of judges who awarded me the BC Medical Device Design Centre Innovation Prize, and at the UBC Biomedical Imaging and AI Cluster Fall Research Showcase, where I also won the Three Minute Thesis award.
Finally, I shared the project at two Rogers-UBC collaboration workshops with UBC professors and Rogers experts, and applied for a Rogers-funded collaboration grant. We have since been awarded that grant as well as MITACS funding, and are working in close consultation with the Coastal First Nations and the Heiltsuk Nation, along with several sonographers, radiologists, and emergency physicians. In 2023 we were also awarded a Vancouver Coastal Health Research and Innovation grant with Dr. Silvia Chang. The project is supervised by Prof. Tim Salcudean of the Robotics and Control Laboratory, UBC.
D. Black, M. Nogami, S. Salcudean. "Mixed Reality Human Teleoperation with Device-Agnostic Remote Ultrasound: Communication and HCI," in Computers and Graphics, January 2024.
D. Black, D. Andjelic, S.E. Salcudean. "Evaluation of Communication and Human Response Latency for (Human) Teleoperation," in IEEE Transactions on Medical Robotics, January 2024.
D. Black, S. Salcudean. "Robust Object Pose Tracking for Augmented Reality Guidance and Teleoperation," in IEEE Transactions on Instrumentation and Measurement. Submitted August 2023, Manuscript ID TIM-23-05073.
D. Black, A.H. Hadi Hosseinabadi, M. Nogami, N. Rangga, S. Salcudean. "Towards Differential Magnetic Force Sensing for Ultrasound Teleoperation," in IEEE World Haptics Conference, July 10-13, 2023, Delft, Netherlands.
D. Black, S. Salcudean. "Mixed Reality Human Teleoperation," in IEEE VR, March 25-29, 2023, Shanghai, China. Workshop Presentation & Paper.
D. Black, H. Moradi, S.E. Salcudean. "Human-as-a-Robot Performance in Augmented Reality Teleultrasound," in International Journal of Computer Assisted Radiology and Surgery. Accepted March 2023.
D. Black, S. Salcudean. "A Mixed Reality System for Human Teleoperation in Tele-Ultrasound," in the Hamlyn Symposium on Medical Robotics, June 26-29, 2022, London, UK. Podium Presentation.
S.E. Salcudean, H. Moradi, D. Black, N. Navab. "Robot-assisted Medical Imaging: A Review," in Proceedings of the IEEE, Vol. 110, No. 7, April 2022.
D. Black, Y. Oloumi, A.H. Hadi Hosseinabadi, S. Salcudean. "Human Teleoperation - a Haptically-Enabled Mixed Reality System for Teleultrasound," in Human-Computer Interaction, June 2023 (submitted July 2021).