Predicting 3-D Deformation Field Maps (DFM) based on Volumetric Cine MRI (VC-MRI) and Artificial Neural Networks for On-board 4D Target Tracking
Organ and tumor positions change constantly due to involuntary motion of the gastrointestinal and respiratory systems. In radiation therapy, accurate and precise anatomical localization is critical for treatment planning and delivery. Localization, prior to and during treatment, is most important in stereotactic body radiation therapy (SBRT), which aggressively targets tumors by delivering high fractional doses to tight planning target volumes (PTVs). Inter-fraction uncertainties, arising from therapy-induced anatomical change and/or patient positioning errors, can be mitigated with adaptive therapy and on-board four-dimensional (4D) imaging. Intra-fraction uncertainties from involuntary motion, on the other hand, must be minimized with real-time imaging, which enables more advanced treatment delivery techniques such as respiratory gating and target tracking. Currently, no real-time three-dimensional (3D) MRI tracking exists for on-board MRI-guided radiotherapy; present MRI-guided radiotherapy machines are capable only of on-board two-dimensional (2D) cine MRI. Extending this to real-time 3D MRI would provide volumetric information beyond a single imaging plane and greatly improve target localization. The purpose of this thesis is to develop real-time 3D deformation field map (DFM) prediction using volumetric cine MRI (VC-MRI) and an adaptive-boosting multi-layer perceptron neural network (ADMLP-NN) for MRI-guided 4D target tracking.
On-board VC-MRI is modeled as the deformation of a prior 4D-MRI phase, MRIprior, obtained during patient simulation. The DFM that best estimates the VC-MRI is constructed as a weighted linear combination of the three major respiratory deformation modes extracted by principal component analysis (PCA) from the DFMs between MRIprior and the remaining phases. The PCA weighting coefficients are solved for by enforcing a data fidelity constraint against the on-board 2D cine MRI. The optimized PCA coefficients are tracked over time and used to train the ADMLP-NN to estimate future PCA coefficients from previous ones. The ADMLP-NN combines several identical multi-layer perceptron neural networks through an adaptive-boosting decision algorithm to avoid local minima. Predicted PCA coefficients are then used to build 3D DFMs for VC-MRI prediction.
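The PCA step above can be sketched in a few lines of numpy. This is a minimal illustration, not the thesis implementation: the toy dimensions, random data, and names (`dfms`, `modes`, `w`) are all assumptions standing in for real dense 3D deformation field maps, and the least-squares solve stands in for the full data-fidelity optimization against on-board 2D cine MRI.

```python
import numpy as np

# Toy dimensions: real DFMs are dense 3D vector fields (3 components per voxel).
n_voxels = 1000          # flattened voxel count (illustrative)
n_phases = 10            # respiratory phases in the prior 4D-MRI

rng = np.random.default_rng(0)
# Stand-in DFMs between MRIprior and the remaining phases (one row per phase).
dfms = rng.normal(size=(n_phases - 1, 3 * n_voxels))

# PCA via SVD of the mean-centered DFM matrix.
mean_dfm = dfms.mean(axis=0)
_, _, vt = np.linalg.svd(dfms - mean_dfm, full_matrices=False)
modes = vt[:3]           # three major respiratory deformation modes

# A new on-board DFM is the mean plus a weighted combination of the modes.
w = np.array([1.2, -0.4, 0.1])       # example PCA weighting coefficients
dfm_onboard = mean_dfm + w @ modes

# In place of the data-fidelity solve against 2D cine MRI, a plain
# least-squares fit recovers the coefficients from a known DFM.
w_est = np.linalg.lstsq(modes.T, dfm_onboard - mean_dfm, rcond=None)[0]
```

Because the PCA modes are orthonormal, the least-squares fit recovers `w` exactly here; in the actual method the coefficients are instead optimized so the deformed MRIprior matches the acquired 2D cine slice.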
This method was evaluated using the 4D extended cardiac-torso (XCAT) digital phantom to simulate lung cancer patients. Motion was simulated in the anterior-posterior and superior-inferior directions based on patient-specific Real-time Position Management (RPM) curves. Predicted PCA coefficients were evaluated against the estimated PCA coefficients using normalized cross-correlation (NCC) and normalized root-mean-squared error (NRMSE). Predicted VC-MRIs were evaluated against ground-truth VC-MRIs using volume percent difference (VPD), volume dice coefficient (VDC), and center-of-mass shift (COMS). The effects of ADMLP-NN parameter variation (number of input neurons, number of hidden neurons, number of MLP-NNs, cost-function threshold, and prediction step size) on VC-MRI prediction accuracy were evaluated. Additionally, the effects of breathing pattern changes between the 4D-MRI simulation and the on-board 2D cine MRI were evaluated.
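The evaluation metrics above can be sketched as follows. This is an illustrative numpy sketch under common definitions of these metrics (the function names and the XOR-based VPD formula are assumptions, not taken from the thesis); masks are binary tumor volumes.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two 1-D coefficient traces."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def nrmse(pred, ref):
    """Root-mean-squared error normalized by the reference signal's range."""
    return float(np.sqrt(np.mean((pred - ref) ** 2)) / (ref.max() - ref.min()))

def volume_metrics(pred_mask, true_mask, voxel_mm=(1.0, 1.0, 1.0)):
    """VPD (%), VDC (Dice), and COMS (mm) between binary tumor masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    # VPD: non-overlapping volume as a percentage of the true volume.
    vpd = np.logical_xor(pred, true).sum() / true.sum() * 100.0
    # VDC: standard Dice similarity coefficient.
    vdc = 2.0 * np.logical_and(pred, true).sum() / (pred.sum() + true.sum())
    # COMS: Euclidean distance between centers of mass, scaled to mm.
    def com(mask):
        return np.array([idx.mean() for idx in np.nonzero(mask)]) * np.asarray(voxel_mm)
    coms = float(np.linalg.norm(com(pred) - com(true)))
    return float(vpd), float(vdc), coms
```

For identical masks these return VPD = 0%, VDC = 1, and COMS = 0 mm, as expected of agreement metrics.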
Across all RPM signals examined, when no breathing pattern change occurred between the prior 4D MRI and the on-board 2D cine MRI, the average predicted VPD, VDC, and COMS were 17.50 ± 2.85%, 0.92 ± 0.02, and 1.08 ± 0.44 mm, respectively. Prediction accuracy decreased when the breathing amplitude increased between the prior 4D MRI and the on-board 2D cine MRI, but remained the same or improved when it decreased. These results demonstrate the feasibility and robustness of using an ADMLP-NN to predict deformation field maps for VC-MRI prediction and on-board target localization during radiotherapy treatments.
Keywords: image-guided radiation therapy, real-time volumetric imaging
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Rights for Collection: Masters Theses