Material decomposition from photon-counting CT using a convolutional neural network and energy-integrating CT training labels.

Date

2022-06-29

Abstract

Objective

Photon-counting CT (PCCT) has better dose efficiency and spectral resolution than energy-integrating CT, which is advantageous for material decomposition. Unfortunately, the accuracy of PCCT-based material decomposition is limited due to spectral distortions in the photon-counting detector (PCD).

Approach

In this work, we demonstrate a deep learning (DL) approach that compensates for spectral distortions in the PCD and improves material decomposition accuracy by using decomposition maps derived from high-dose, multi-energy, energy-integrating detector (EID) data as training labels. We use a 3D U-net architecture and compare networks that take as input PCD filtered backprojection (FBP) reconstructions (FBP2Decomp), PCD iterative reconstructions (Iter2Decomp), or PCD decompositions (Decomp2Decomp).

Main results

We found that our Iter2Decomp approach performs best, but DL outperforms matrix-inversion decomposition regardless of the input. Compared to PCD matrix-inversion decomposition, Iter2Decomp gives 27.50% lower root mean squared error (RMSE) in the iodine (I) map and 59.87% lower RMSE in the photoelectric effect (PE) map. In addition, it increases the structural similarity (SSIM) by 1.92%, 6.05%, and 9.33% in the I, Compton scattering (CS), and PE maps, respectively. In measurements from iodine and calcium vials, Iter2Decomp agrees closely with multi-EID decomposition. One limitation is some blurring introduced by our DL approach: spatial resolution at 50% modulation transfer function (MTF) decreases from 1.98 line pairs/mm with PCD matrix-inversion decomposition to 1.75 line pairs/mm with Iter2Decomp.
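The matrix-inversion decomposition used as the baseline above solves a small linear system at each voxel: measured attenuation across energy bins is modeled as a weighted sum of basis-material contributions, and the weights are recovered by (pseudo)inverting the sensitivity matrix. The sketch below illustrates this idea in NumPy with a hypothetical 4-bin, 3-material sensitivity matrix and simulated data; the matrix values and noise level are illustrative assumptions, not calibrated PCCT measurements, and RMSE is computed per material map as in the comparison above.

```python
import numpy as np

# Hypothetical sensitivity matrix: rows = PCD energy bins, columns = basis
# materials (iodine, Compton scattering, photoelectric effect).
# Values are illustrative, not calibrated measurements.
M = np.array([
    [1.0, 0.8, 0.6],
    [0.7, 0.9, 0.3],
    [0.4, 0.5, 0.9],
    [0.2, 0.6, 0.7],
])  # shape (4 energy bins, 3 basis materials)

rng = np.random.default_rng(0)
truth = rng.random((3, 8, 8))               # ground-truth material maps (materials, H, W)
y = np.tensordot(M, truth, axes=1)          # forward model -> (bins, H, W)
y += 0.01 * rng.standard_normal(y.shape)    # simulated detector noise

# Per-voxel matrix-inversion (least-squares) decomposition
pinv = np.linalg.pinv(M)                    # (materials, bins)
est = np.tensordot(pinv, y, axes=1)         # recovered maps, (materials, H, W)

# RMSE per material map, the metric used to score decompositions
rmse = np.sqrt(np.mean((est - truth) ** 2, axis=(1, 2)))
print(rmse)
```

This per-voxel inversion ignores spectral distortions in the PCD response, which is exactly the error source the learned mapping is meant to compensate for.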

Significance

Overall, this work demonstrates that our DL approach with decomposition labels derived from high-dose multi-EID data is effective at generating more accurate material maps from PCD data. More accurate preclinical spectral PCCT imaging of this kind could aid the development of nanoparticles that show promise in the field of theranostics (therapy and diagnostics).

Citation

Published Version (Please cite this version)

10.1088/1361-6560/ac7d34

Publication Info

Nadkarni, Rohan, Alex Allphin, Darin P Clark and Cristian T Badea (2022). Material decomposition from photon-counting CT using a convolutional neural network and energy-integrating CT training labels. Physics in Medicine and Biology. 10.1088/1361-6560/ac7d34. Retrieved from https://hdl.handle.net/10161/25502.

This is constructed from limited available data and may be imprecise. To cite this article, please review & use the official citation provided by the journal.

Scholars@Duke

Darin Clark

Assistant Professor in Radiology

Cristian Tudorel Badea

Professor in Radiology

  • Our lab's research focuses primarily on developing novel quantitative imaging systems, reconstruction algorithms, and analysis methods. My major expertise is in preclinical CT.
  • Currently, we are particularly interested in developing novel strategies for spectral CT imaging using nanoparticle-based contrast agents for theranostics (i.e., therapy and diagnostics).
  • We are also engaged in developing new approaches for multidimensional CT image reconstruction suited to addressing difficult undersampling cases in cardiac and spectral CT (dual-energy and photon-counting) using compressed sensing and/or deep learning.



Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.