
Multi‐needle Localization with Attention U‐Net in US‐guided HDR Prostate Brachytherapy

Yupei Zhang, Yang Lei, Richard L.J. Qiu, Tonghe Wang, Hesheng Wang, Ashesh B. Jani, Walter J. Curran, Pretesh Patel, Tian Liu, Xiaofeng Yang


Ultrasound (US)‐guided high‐dose‐rate (HDR) prostate brachytherapy requires clinicians to place HDR needles (catheters) into the prostate gland under transrectal US (TRUS) guidance in the operating room. The quality of the subsequent radiation treatment plan is largely dictated by the needle placements, which vary with the clinician's experience level and the procedure protocol. Real‐time plan dose distribution, if available, could be a vital tool for providing a more objective assessment of the needle placements, hence potentially improving the radiation plan quality and the treatment outcome. However, due to the low signal‐to‐noise ratio (SNR) of US imaging, real‐time multi‐needle segmentation in 3D TRUS, which is the major obstacle to real‐time dose mapping, has not been realized to date. In this study, we propose a deep‐learning‐based method that enables accurate and real‐time digitization of multiple needles in the 3D TRUS images of HDR prostate brachytherapy.


A deep‐learning model based on the U‐Net architecture was developed to segment multiple needles in the 3D TRUS images. Attention gates were incorporated into our model to improve the prediction of small needle points. Furthermore, the spatial continuity of needles was encoded into our model with total variation (TV) regularization. The combined network was trained on 3D TRUS patches with a deep supervision strategy, where binary needle annotation images were provided as ground truth. The trained network was then used to localize and segment the HDR needles in a new patient's TRUS images. We evaluated our proposed method based on the needle shaft and tip errors against manually defined ground truth, and compared our method with other state‐of‐the‐art methods (U‐Net and deeply supervised attention U‐Net).
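The TV regularization mentioned above penalizes voxel-wise discontinuities, which favors spatially continuous needle predictions over fragmented ones. A minimal NumPy sketch of an anisotropic 3D total variation term is shown below; the paper's actual implementation (e.g., its weighting and integration into the training loss) is not specified in the abstract, so this is illustrative only:

```python
import numpy as np

def total_variation_3d(p):
    """Anisotropic total variation of a 3D prediction volume.

    Sums absolute differences between neighboring voxels along each
    axis; lower values correspond to spatially smoother, more
    continuous structures such as an unbroken needle shaft.
    """
    return (np.abs(np.diff(p, axis=0)).sum()
            + np.abs(np.diff(p, axis=1)).sum()
            + np.abs(np.diff(p, axis=2)).sum())

# Illustration: a continuous needle (8 consecutive voxels) has a lower
# TV than a fragmented prediction with the same number of voxels.
vol_line = np.zeros((16, 8, 8))
vol_line[0:8, 4, 4] = 1.0        # unbroken segment
vol_frag = np.zeros((16, 8, 8))
vol_frag[0:16:2, 4, 4] = 1.0     # every-other-voxel fragments
```

Adding such a term to the segmentation loss pushes the network away from scattered false-positive voxels and toward line-like, connected needle masks.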


Our method detected 96% of the 339 needles from 23 HDR prostate brachytherapy patients, with a shaft error of 0.290±0.236 mm and a tip error of 0.442±0.831 mm. For shaft localization, our method localized 96% of needles with an error of less than 0.8 mm (needle diameter is 1.67 mm), while for tip localization, our method localized 75% of needle tips with 0 mm error and 21% with 2 mm error (TRUS image slice thickness is 2 mm). No significant difference was observed (p=0.83) in tip localization between our results and the ground truth. Compared with U‐Net and deeply supervised attention U‐Net, the proposed method delivered a significant improvement in both shaft error and tip error (p<0.05).
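The shaft and tip errors reported above can be computed, for instance, as the mean point-wise distance along the needle and the distance between tip coordinates, respectively. The sketch below assumes these definitions; the paper's exact metric definitions (e.g., per-slice in-plane distance for the shaft) may differ:

```python
import numpy as np

def needle_errors(pred, gt):
    """Hypothetical shaft/tip error metrics for one needle.

    pred, gt: (N, 3) arrays of needle coordinates in mm, ordered from
    base to tip (e.g., one (x, y) shaft center per 2 mm TRUS slice).
    Shaft error: mean Euclidean distance between corresponding points.
    Tip error:   Euclidean distance between the two tip points.
    These definitions are illustrative, not the paper's exact metrics.
    """
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    shaft = np.linalg.norm(pred - gt, axis=1).mean()
    tip = np.linalg.norm(pred[-1] - gt[-1])
    return shaft, tip

# Example: a straight needle sampled on 2 mm slices, with the
# prediction offset laterally by 0.3 mm along the entire shaft.
gt = np.stack([np.zeros(10), np.zeros(10), np.arange(10) * 2.0], axis=1)
pred = gt + np.array([0.3, 0.0, 0.0])
shaft_err, tip_err = needle_errors(pred, gt)
```

Under these assumed definitions, the example yields a shaft and tip error of 0.3 mm each, on the order of the sub-millimeter errors reported above.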


We proposed a new segmentation method to precisely localize the tips and shafts of multiple needles in 3D TRUS images of HDR prostate brachytherapy. The 3D rendering of the needles could help clinicians evaluate the needle placements. It paves the way for the development of real‐time plan dose assessment tools that can further improve the quality and outcome of HDR prostate brachytherapy.
