Development of a Method to Detect and Quantify CT Motion Artifacts: Feasibility Study


Date

2022

Journal Title

Journal ISSN

Volume Title


Abstract

Artifacts are known to reduce the quality of CT images and can affect the statistical analysis and quantitative utility of those images. Motion artifacts are a leading type of CT artifact, arising from either voluntary or involuntary (respiratory and cardiac) movement. Currently, such artifacts, if present, are neither quantified nor monitored, and their dependence on CT acquisition settings is unknown. As a first step to address this gap, the aim of this study was to develop a neural network to detect and quantify motion artifacts in CT images. Training data were drawn from three sources; the pixels containing motion were segmented (Seg3D, University of Utah), and the segmentation masks were used as ground-truth labels. A convolutional neural network (U-Net) was trained to identify pixels containing motion. Model performance was assessed in two ways: by correlating the percentage of voxels labeled as containing motion in each slice of the pre-allocated testing data between the ground-truth and predicted segmentation masks, yielding a correlation coefficient of r = 0.43, and by constructing ROC curves. A series-wise ROC curve had AUC = 0.94, and a slice-wise ROC curve had AUC = 0.80. The correlation coefficient and AUCs are expected to improve as more training data are added. This network has the potential to be a useful clinical tool, enabling quality tracking systems to detect and quantify the presence of artifacts in the context of CT quality control.
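The thesis text is not reproduced here, but the slice-wise evaluation described in the abstract can be illustrated with a short sketch: compute the fraction of motion-labeled voxels per slice for the ground-truth and predicted masks, correlate the two (the reported r = 0.43), and score slice-level detection with an ROC AUC. This is a minimal numpy illustration with hypothetical function names, not code from the thesis; the AUC is computed via the rank-based (Mann-Whitney) formulation rather than any specific library the author may have used.

```python
import numpy as np

def slice_motion_fraction(mask):
    """Fraction of voxels labeled as motion in each slice.

    mask: binary array of shape (n_slices, height, width).
    Returns an array of length n_slices.
    """
    return mask.reshape(mask.shape[0], -1).mean(axis=1)

def pearson_r(x, y):
    """Pearson correlation between two 1-D arrays."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability
    that a positive slice receives a higher score than a negative one.
    Ties in the scores are handled by averaging ranks."""
    labels = np.asarray(labels, bool)
    scores = np.asarray(scores, float)
    n_pos = labels.sum()
    n_neg = labels.size - n_pos
    order = np.argsort(scores)
    ranks = np.empty(scores.size, float)
    sorted_scores = scores[order]
    i = 0
    while i < scores.size:
        j = i
        while j + 1 < scores.size and sorted_scores[j + 1] == sorted_scores[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1  # average rank for ties
        i = j + 1
    return float((ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

# Toy example: 4 slices, ground truth has motion only in slice 2.
gt = np.zeros((4, 8, 8))
gt[2] = 1
pred = np.zeros((4, 8, 8))
pred[2, :4] = 1  # network finds half of the motion voxels

gt_frac = slice_motion_fraction(gt)      # per-slice motion fraction, ground truth
pred_frac = slice_motion_fraction(pred)  # per-slice motion fraction, prediction
r = pearson_r(gt_frac, pred_frac)
auc = roc_auc(gt_frac > 0, pred_frac)    # slice-wise detection AUC
```

A series-wise ROC curve would follow the same pattern with one score per series (e.g. the maximum or mean slice fraction) instead of one per slice.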

Citation

Khandekar, Madhura (2022). Development of a Method to Detect and Quantify CT Motion Artifacts: Feasibility Study. Master's thesis, Duke University. Retrieved from https://hdl.handle.net/10161/25322.

Collections


Duke's student scholarship is made available to the public under a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.