Deep Learning-based Brain Image Segmentation on Turbo Spin Echo MRI

Limited Access: This item is unavailable until 2026-06-06.

Date

2024

Abstract

Purpose: The Magnetization Prepared Rapid Gradient Echo (MPRAGE) Magnetic Resonance Imaging (MRI) sequence is currently the sequence most frequently used for brain tissue segmentation in the clinic because of its high image contrast. However, one limitation of the MPRAGE sequence is its susceptibility to metal artifacts, whereas Turbo Spin Echo (TSE) sequences are resistant to them. Previous studies have shown that, for patients with metal implants, metal-artifact-reduced MPRAGE images can be generated from TSE images. Conventional brain segmentation methods applied to MPRAGE images, such as FreeSurfer, are time-consuming. The purpose of this study was therefore to investigate a fast, deep learning-based brain segmentation method for patients with metal implants, using TSE images as input.

Materials and Methods: A dataset of 369 patients was used. Each patient's scan comprised 160 two-dimensional slices each of T1-weighted (T1WI), T2-weighted (T2WI), and PD-weighted (PDWI) TSE brain MR images, with an original matrix size of 240 × 240. As an intermediate step, two types of MPRAGE images were synthesized from the T1WI, T2WI, and PDWI images, using either mathematical calculation or a Conditional Generative Adversarial Network (cGAN). FreeSurfer software was used to generate brain segmentations on the MPRAGE images; these served as the ground truth for deep-learning network training and final evaluation. Two research aims were investigated. Aim 1 was to use three-channel TSE images (T1WI, T2WI, and PDWI) to first mathematically synthesize MPRAGE images and then perform segmentation with deep learning-based models. Aim 2 was to use single-channel TSE images as input, directly or indirectly, to achieve brain segmentation with deep learning-based models. Both UNet and UNet++ models were examined. The Dice coefficient was used to evaluate segmentation performance for both aims.

Results: For Aim 1, the Dice coefficient between the ground truth and the cortex segmentations generated by the UNet++ network, using three-channel TSE images as original input and mathematically synthesized MPRAGE as direct input, was 0.919 ± 0.03. For Aim 2, the Dice coefficient between the ground truth and the cortex segmentations generated by the UNet network using single-channel TSE images directly as input was 0.602 ± 0.06. The Dice coefficient between the ground truth and the cortex segmentations generated by the UNet++ network, using single-channel TSE images as original input and cGAN-synthesized MPRAGE as direct input, was 0.766 ± 0.07.

Conclusion: Two aims, using three-channel or single-channel TSE images as original input and brain segmentation as output, were investigated in this study. Three-channel TSE images as original input with mathematically synthesized MPRAGE as direct input to the UNet++ network showed superior results. Single-channel TSE images as original input with cGAN-synthesized MPRAGE as direct input to the UNet++ network showed relatively lower performance. Further research is warranted to improve the performance of single-channel TSE-based deep-learning segmentation methods.

Keywords: UNet++, MRI, Brain Image, Segmentation, TSE, MPRAGE
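As context for the Dice coefficients reported above, the sketch below shows one common way to compute the Dice overlap between a predicted binary cortex mask and a FreeSurfer-derived ground-truth mask using Python and NumPy. The function name, array shapes, and example masks are illustrative assumptions and are not taken from the thesis.

import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient between two binary masks: 2|A ∩ B| / (|A| + |B|).

    `pred` and `truth` are illustrative binary arrays (e.g. 240 x 240 slices,
    matching the matrix size reported in the abstract); this is a generic
    sketch, not the evaluation code used in the thesis.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return float(2.0 * intersection / (pred.sum() + truth.sum() + eps))

# Example: a per-slice Dice score for a hypothetical cortex segmentation.
pred_mask = np.zeros((240, 240), dtype=bool)
true_mask = np.zeros((240, 240), dtype=bool)
pred_mask[100:140, 100:140] = True
true_mask[110:150, 110:150] = True
print(f"Dice: {dice_coefficient(pred_mask, true_mask):.3f}")

A Dice value of 1.0 indicates perfect overlap between the predicted and ground-truth masks, while 0.0 indicates no overlap; in practice, per-slice or per-volume scores are typically averaged across patients, which is consistent with the mean ± standard deviation values reported in the abstract.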

Citation

Zhang, Tianyi (2024). Deep Learning-based Brain Image Segmentation on Turbo Spin Echo MRI. Master's thesis, Duke University. Retrieved from https://hdl.handle.net/10161/31086.
