Deep Learning-based Onboard Image Guidance and Dose Verification for Radiation Therapy

Limited Access
This item is unavailable until:
2026-06-06

Date

2024

Abstract

Onboard image guidance and dose verification play important roles in radiation therapy, enabling precise targeting and accurate dose delivery. However, the clinical utility of these advanced techniques is limited by degraded image quality caused by under-sampling. Specifically, four-dimensional cone-beam computed tomography (4D-CBCT) is a valuable tool that provides onboard respiration-resolved images of moving targets, but its image quality is degraded by intra-phase sparse sampling imposed by clinical constraints on acquisition time and imaging dose. Radiation-induced acoustic (RA) imaging and prompt gamma (PG) imaging are two promising methods for reconstructing 3D dose deposition noninvasively and in real time during treatment, but their images are severely distorted by single-view measurements. Essentially, reconstructing images from under-sampled acquisitions is an ill-conditioned inverse problem. Our previous studies have demonstrated the effectiveness of deep learning in restoring volumetric information from sparse and limited-angle measurements. In this project, we further explore deep learning applications (1) to provide high-quality and efficient onboard image guidance before dose delivery for target localization, and (2) to realize precise quantitative 3D dosimetry during delivery for dose verification in radiotherapy.

The first aim is achieved by reconstructing high-quality 4D-CBCT from fast (1-minute) free-breathing scans. We proposed a feature-compensated deformable convolutional network (FeaCo-DCN) that performs inter-phase compensation in the latent feature space, an approach not explored by previous studies. In FeaCo-DCN, encoding networks extract features from each respiratory phase; features of the other phases are then deformed to those of the target phase via deformable convolutional networks; finally, a decoding network combines and decodes the features from all phases to yield high-quality images of the target phase. FeaCo-DCN was evaluated using lung cancer patient data. Key findings include: (1) FeaCo-DCN generated high-quality images with accurate and clear structures from a fast 4D-CBCT scan; (2) 4D-CBCT images reconstructed by FeaCo-DCN achieved 3D tumor localization accuracy within 2.5 mm; (3) image reconstruction is nearly real-time; and (4) FeaCo-DCN outperformed the top-ranked techniques in the AAPM SPARE Challenge on all metrics. In conclusion, the proposed FeaCo-DCN is effective and efficient in reconstructing 4D-CBCT while reducing scanning time by about 90%, which can be highly valuable for moving-target localization in image-guided radiotherapy.
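To make the inter-phase compensation idea concrete, the sketch below shows how deformable convolution can align one phase's latent features to the target phase. This is a minimal PyTorch illustration using torchvision's DeformConv2d; the module name, channel counts, and kernel size are illustrative assumptions, not the actual FeaCo-DCN configuration.

```python
# Hypothetical sketch of inter-phase feature compensation with deformable
# convolution: neighbor-phase features are warped toward the target phase
# in the latent space, with sampling offsets predicted from both phases.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class PhaseFeatureAlign(nn.Module):
    def __init__(self, channels: int = 64, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Predict 2 offset coordinates per kernel tap from the
        # concatenated [neighbor, target] feature maps.
        self.offset_pred = nn.Conv2d(2 * channels,
                                     2 * kernel_size * kernel_size,
                                     kernel_size, padding=pad)
        # Deformable convolution samples neighbor-phase features at the
        # predicted offsets, aligning them to the target phase.
        self.deform = DeformConv2d(channels, channels, kernel_size, padding=pad)

    def forward(self, feat_neighbor, feat_target):
        offsets = self.offset_pred(torch.cat([feat_neighbor, feat_target], dim=1))
        return self.deform(feat_neighbor, offsets)

# Usage: align one phase's encoder features to the target phase; a decoder
# would then fuse the aligned features from all phases.
align = PhaseFeatureAlign(channels=64)
f_nb, f_tg = torch.randn(1, 64, 128, 128), torch.randn(1, 64, 128, 128)
f_aligned = align(f_nb, f_tg)  # (1, 64, 128, 128)
```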
The second aim is achieved by reconstructing accurate dose maps from multi-modality radiation-induced signals, namely (a) acoustic waves and (b) prompt gammas. For protoacoustic (PA) imaging, we developed a deep learning-based method to address the limited-view problem in PA reconstruction. A deep cascaded convolutional neural network (DC-CNN) was proposed to reconstruct high-quality 3D radiation-induced pressure maps from PA signals detected by a matrix array, and then to derive precise 3D dosimetry from the pressures for dose verification in proton therapy. To validate its performance, we collected the proton therapy treatment plans of 81 prostate cancer patients. Proton-acoustic simulations were performed using the open-source k-Wave package. A matrix ultrasound array was simulated near the perineum to acquire radiofrequency (RF) signals during dose delivery. For realistic acoustic simulations, tissue heterogeneity and attenuation were considered, and Gaussian white noise was added to the acquired RF signals. The proposed DC-CNN was trained on 204 samples from 69 patients and tested on 26 samples from 12 other patients. Results demonstrated that the proposed method considerably improved limited-view proton-acoustic image quality, reconstructing pressures with clear and accurate structures and deriving doses in high agreement with the ground truth. Quantitatively, the reconstructed pressure achieved a root-mean-square error (RMSE) of 0.061, and the derived dose achieved an RMSE of 0.044, a gamma index (3%/3 mm) passing rate of 93.71%, and a 90%-isodose-line Dice coefficient of 0.922. These results demonstrate the feasibility of high-quality quantitative 3D dosimetry in proton-acoustic imaging using a matrix array, which could enable online 3D dose verification for prostate proton therapy.

Besides the limited-angle acquisition challenge in acoustic imaging, we also developed a general deep inception convolutional neural network (GDI-CNN) to address the low signal-to-noise ratio (SNR) of few-frame-averaged acoustic signals. The network employs convolutions with multiple dilations in each inception block, allowing it to encode and decode signal features with varying temporal characteristics; this design generalizes GDI-CNN to denoising acoustic signals produced by different radiation sources. The performance of the proposed method was evaluated qualitatively and quantitatively using experimental X-ray-induced acoustic and protoacoustic data. Results demonstrated the effectiveness of GDI-CNN: it achieved X-ray-induced acoustic image quality comparable to 750-frame-averaged results using only 10-frame-averaged measurements, reducing the imaging dose of X-ray-acoustic computed tomography (XACT) by 98.7%; and it achieved proton range accuracy comparable to 1500-frame-averaged results using only 20-frame-averaged measurements, improving the range verification frequency in proton therapy from 0.5 Hz to 37.5 Hz. Compared with low-pass-filter-based denoising, the proposed method achieved considerably lower mean squared errors, higher peak SNR, and higher structural similarity with respect to the corresponding high-frame-averaged measurements. This deep learning-based denoising framework is a general method for few-frame-averaged acoustic signal denoising, which significantly improves the clinical utility of RA imaging for low-dose imaging and real-time therapy monitoring.
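To illustrate the multi-dilation inception design, the sketch below shows one possible block for 1D acoustic signals. This is a minimal PyTorch illustration; the dilation rates and channel counts are assumptions, not the published GDI-CNN configuration.

```python
# Hypothetical dilated-inception block for 1D acoustic signal denoising:
# parallel convolutions with different dilation rates capture temporal
# features at multiple scales, and their responses are concatenated.
import torch
import torch.nn as nn

class DilatedInceptionBlock1d(nn.Module):
    def __init__(self, in_ch: int, branch_ch: int = 16,
                 dilations=(1, 2, 4, 8), kernel_size: int = 3):
        super().__init__()
        self.branches = nn.ModuleList()
        for d in dilations:
            pad = d * (kernel_size - 1) // 2  # keep the signal length fixed
            self.branches.append(nn.Sequential(
                nn.Conv1d(in_ch, branch_ch, kernel_size,
                          padding=pad, dilation=d),
                nn.ReLU(inplace=True),
            ))

    def forward(self, x):
        # Concatenate the multi-dilation responses along the channel axis.
        return torch.cat([b(x) for b in self.branches], dim=1)

# Usage on a batch of few-frame-averaged RF traces (batch, channels, time):
block = DilatedInceptionBlock1d(in_ch=1)
noisy = torch.randn(8, 1, 2048)
features = block(noisy)  # (8, 64, 2048); a decoder maps back to one channel
```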
For prompt gamma imaging, we proposed a two-tier deep learning-based method with a novel weighted axis-projection loss (sketched below) to generate precise 3D PG images for accurate proton range verification. The method consists of two models: first, a localization model is trained to define a region of interest (ROI) containing the proton pencil beam within the distorted back-projected PG image; second, an enhancement model is trained to restore the true PG emissions with additional attention on the ROI. In this study, we simulated 54 proton pencil beams delivered at clinical dose rates in a tissue-equivalent phantom using Monte Carlo (MC) methods. PG detection with a Compton camera (CC) was simulated using the MC-Plus-Detector-Effects model. Images were reconstructed using the kernel-weighted back-projection algorithm and then enhanced by the proposed method. The method effectively restored the 3D shape of the PG images, with the proton pencil beam range clearly visible in all testing cases. Range errors were within 2 pixels (4 mm) in all directions in most cases at a higher dose level. The method is fully automatic, and the enhancement takes only approximately 0.26 seconds. This preliminary study demonstrated the feasibility of generating accurate 3D PG images with a deep learning framework, providing a powerful tool for high-precision in vivo range verification in proton therapy.

Together, these applications can significantly reduce uncertainties in patient positioning and dose delivery in radiotherapy, improving treatment precision and outcomes.
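As a minimal illustration of the weighted axis-projection idea referenced above: in addition to a voxel-wise term, the predicted and ground-truth volumes are summed along each spatial axis, and the resulting 2D projections are compared, emphasizing agreement of the beam's integrated profile in every direction. The weights and the exact form of the published loss are not reproduced here; this PyTorch sketch uses assumed values.

```python
# Hypothetical weighted axis-projection loss for 3D PG image enhancement.
import torch
import torch.nn.functional as F

def weighted_axis_projection_loss(pred, target,
                                  voxel_w: float = 1.0,
                                  proj_w: float = 0.5):
    """pred, target: (batch, 1, D, H, W) prompt-gamma volumes."""
    loss = voxel_w * F.mse_loss(pred, target)  # voxel-wise fidelity
    for axis in (2, 3, 4):                     # project along D, H, and W
        loss = loss + proj_w * F.mse_loss(pred.sum(dim=axis),
                                          target.sum(dim=axis))
    return loss

# Usage:
p = torch.rand(2, 1, 32, 32, 64, requires_grad=True)
t = torch.rand(2, 1, 32, 32, 64)
weighted_axis_projection_loss(p, t).backward()
```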

Citation

Jiang, Zhuoran (2024). Deep Learning-based Onboard Image Guidance and Dose Verification for Radiation Therapy. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/30819.

Duke's student scholarship is made available to the public under a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.