Multi-spectral Deep Tissue Quantitative Photoacoustic Imaging

Limited Access
This item is unavailable until 2024-09-14.

Date

2023

Abstract

Photoacoustic tomography (PAT) detects acoustic signals generated by optical absorption in chromophores. Using oxy- and deoxy-hemoglobin as endogenous contrast, multi-spectral PAT can image vascular structure and provide functional information. Most current PAT imaging studies face four challenges: heterogeneous fluence distribution in the field of view, shallow penetration depth with external illumination, the limited-view problem, and a lack of blood flow sensitivity. These challenges degrade PAT image quality and limit the accuracy of the functional information. To improve image fidelity, four approaches were proposed to address these challenges. A 3D Monte Carlo simulation was performed on a mouse brain model to estimate the optical fluence at different imaging wavelengths, and the resulting 3D fluence map was used to correct for fluence heterogeneity. Moreover, an internal illumination strategy combined with a treatment catheter was applied to achieve deep-tissue PAT imaging, enabling PAT-guided sonothrombolysis with clot characterization. Additionally, we developed a clinically translatable method that preserves the functional information from hemoglobin while improving the visibility of vessels at arbitrary orientations. Finally, we integrated ultrasound localization microscopy with PAT to enable non-invasive, comprehensive functional imaging at low frequency. Our experimental results from phantom and in vivo animal studies collectively demonstrate that PAT image fidelity can be greatly improved.
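For context, the fluence-correction and spectral-unmixing workflow summarized above can be illustrated with a minimal sketch: multi-spectral PA images are divided voxel-wise by simulated fluence maps, and the compensated images are linearly unmixed into oxy- and deoxy-hemoglobin to estimate oxygen saturation. The function names, array shapes, and placeholder coefficients below are illustrative assumptions, not the dissertation's implementation.

```python
# Illustrative sketch (not the dissertation's code): wavelength-wise fluence
# correction followed by linear spectral unmixing of oxy-/deoxy-hemoglobin.
# Array shapes and molar absorption values are assumed placeholders.
import numpy as np

def correct_fluence(pa_images, fluence_maps, eps=1e-6):
    """Divide each multi-spectral PA image by its simulated 3D fluence map.

    pa_images, fluence_maps: arrays of shape (n_wavelengths, z, y, x).
    Returns fluence-compensated images proportional to local absorption.
    """
    return pa_images / (fluence_maps + eps)

def unmix_hemoglobin(corrected, molar_absorption):
    """Least-squares unmixing into HbO2 and Hb concentration maps.

    corrected: (n_wavelengths, z, y, x) fluence-compensated images.
    molar_absorption: (n_wavelengths, 2) matrix, columns = [HbO2, Hb]
        molar absorption coefficients at each wavelength (placeholders).
    Returns (c_hbo2, c_hb, so2) volumetric maps.
    """
    n_wl = corrected.shape[0]
    flat = corrected.reshape(n_wl, -1)              # (n_wavelengths, n_voxels)
    coeffs, *_ = np.linalg.lstsq(molar_absorption, flat, rcond=None)
    c_hbo2, c_hb = coeffs.reshape(2, *corrected.shape[1:])
    so2 = c_hbo2 / (c_hbo2 + c_hb + 1e-6)           # oxygen saturation estimate
    return c_hbo2, c_hb, so2
```

This sketch only covers the linear-unmixing step; the dissertation's Monte Carlo fluence estimation and the other three approaches are not represented here.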

Citation

Tang, Yuqi (2023). Multi-spectral Deep Tissue Quantitative Photoacoustic Imaging. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/29181.

Duke's student scholarship is made available to the public under a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.