A first step to enhance the robot-patient interaction in a Robot Assisted Ultrasound system
Author
Vives Benedicto, Laia
Term
4th semester
Education
Publication year
2024
Submitted on
2024-05-31
Pages
78
Abstract
Ultrasound exams can be physically demanding because the sonographer must press and maneuver the ultrasound probe (transducer). Hera is a robot-assisted ultrasound system developed by Life Science Robotics for obstetric scans: a robot arm holds the probe while the sonographer steers it with a joystick. Most work has focused on how clinicians control the robot; the other side—how the robot interacts with the patient to support comfort—has received little attention. This project takes a first step by detecting patient movement so the robot can adapt. We developed and tested a Kernelized Correlation Filter (KCF), a fast computer-vision tracker, to follow the patient’s abdomen (belly) in video even when the robot appears in the scene and partially blocks the view. We correct the tracker’s window using feature matching, that is, matching visual patterns between frames. When parts of the image are occluded (the view is blocked), our method is on average 9% more accurate than the default tracker. In long sequences, after losing the target due to a large occlusion, it can re-identify the patient and reach 84.5% accuracy, compared with 69.4% for the default. There are limitations: tracking the upper abdomen often fails to find enough points in the region of interest, and textureless clothing causes wrong matches because features concentrate at the skin–clothing boundary. Despite these issues, the approach shows potential for real-time abdomen tracking to estimate the patient’s position and improve robot–patient interaction. Further research is needed to find more robust features and better handle occlusions.
Ultrasound examinations can be physically demanding because the sonographer must press and maneuver the ultrasound probe. Hera is a robot-assisted ultrasound system developed by Life Science Robotics for obstetric scans: a robot arm holds the probe while the sonographer steers it with a joystick. So far, the focus has mostly been on how the clinician controls the robot; the other side, the robot-patient interaction and the patient's comfort, has hardly been studied. This project takes a first step by detecting patient movement so the robot can adapt. We developed and tested a Kernelized Correlation Filter (KCF), a fast computer-vision tracking method, to follow the patient's abdomen in video even when the robot is in the image and partially blocks the view. The tracking window is corrected with feature matching, i.e. by matching visual features between frames. When parts of the image are occluded (the view is blocked), the method is on average 9% more accurate than the default tracker. In long sequences, after losing the target due to a large occlusion, it can re-identify the patient and reach an accuracy of 84.5%, compared with 69.4% for the default. There are limitations, however: tracking the upper part of the abdomen often finds too few points in the region of interest, and textureless clothing produces wrong matches because points are detected at the transition between skin and clothing. Despite this, the method shows potential for tracking the abdomen in real time, estimating the patient's position, and improving the robot-patient interaction. Further research is needed into more robust features and better handling of occlusions.
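To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch in Python using OpenCV: a KCF tracker follows a region of interest, and when the tracker loses the target (for example, under a large occlusion), ORB feature matching against a stored template is used to re-identify the region and re-initialise the tracker. The video path, the initial bounding box, and the ORB-plus-homography-free correction step are illustrative assumptions, not the thesis's actual feature choice or correction logic.

# Minimal sketch: KCF tracking with feature-matching re-identification.
# Assumptions (not from the thesis): input video path, initial bounding
# box around the abdomen, ORB features, and a simple centroid-based
# re-detection of the lost target.
import cv2
import numpy as np

VIDEO_PATH = "scan.mp4"          # hypothetical input video
INIT_BOX = (200, 150, 240, 180)  # hypothetical (x, y, w, h) around the abdomen

cap = cv2.VideoCapture(VIDEO_PATH)
ok, frame = cap.read()
tracker = cv2.TrackerKCF_create()
tracker.init(frame, INIT_BOX)

# Store a reference template of the abdomen region for re-detection.
orb = cv2.ORB_create(nfeatures=500)
x, y, w, h = INIT_BOX
template = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
tmpl_kp, tmpl_des = orb.detectAndCompute(template, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    found, box = tracker.update(frame)
    if not found:
        # Tracker lost the target (e.g. the robot arm occludes the view):
        # try to re-identify the abdomen by matching ORB features in the
        # current frame against the stored template.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        kp, des = orb.detectAndCompute(gray, None)
        if des is not None and tmpl_des is not None:
            matches = matcher.match(tmpl_des, des)
            matches = sorted(matches, key=lambda m: m.distance)[:30]
            if len(matches) >= 10:
                # Re-centre the box on the matched keypoints and restart KCF.
                pts = np.float32([kp[m.trainIdx].pt for m in matches])
                cx, cy = pts.mean(axis=0)
                box = (int(cx - w / 2), int(cy - h / 2), w, h)
                tracker = cv2.TrackerKCF_create()
                tracker.init(frame, box)
                found = True

    if found:
        bx, by, bw, bh = [int(v) for v in box]
        cv2.rectangle(frame, (bx, by), (bx + bw, by + bh), (0, 255, 0), 2)

    cv2.imshow("abdomen tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In this sketch the re-detected box simply re-initialises a fresh KCF tracker; the thesis instead corrects the tracking window with feature matching, so this should be read as an outline of the idea rather than a reproduction of the method.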
[This abstract has been rewritten with the help of AI based on the project's original abstract]
