Browsing by Subject "Imaging"
Item (Open Access): 3D Microwave Imaging through Full Wave Methods for Heterogeneous Media (2011). Yuan, Mengqing.

In this thesis, a 3D microwave imaging method is developed for a microwave imaging system with an arbitrary background medium. In our research group's previous study on breast cancer detection, a full-wave inverse method, the Diagonal Tensor Approximation combined with the Born Iterative Method (DTA-BIM), was proposed to reconstruct the electrical profile of the inversion domain in a homogeneous or layered background medium. To evaluate the performance of the DTA-BIM method in a realistic microwave imaging system, an experimental prototype of an active 3D microwave imaging system with movable antennas was constructed. For objects immersed in a homogeneous or layered background medium, inversion results based on the experimental data show that the resolution of the DTA-BIM method can reach a quarter of the wavelength in the background medium, and that the system requires a signal-to-noise ratio (SNR) of 10 dB. However, this system's drawbacks make it difficult to implement in a realistic application. Thus, another active 3D microwave imaging system is proposed to overcome these problems. The new system employs a fixed patch-antenna array with electronic switches to record the data. However, the antenna array turns the inversion system into a non-canonical inhomogeneous background, so the analytical Green's functions used in the original DTA-BIM method become unavailable. A modified DTA-BIM method is therefore proposed that uses numerical Green's functions combined with measured voltages. This modified DTA-BIM method can perform inversion in a non-canonical inhomogeneous background directly from measured voltages (or $S_{21}$ parameters).
To verify the performance of the proposed inversion method, we investigate a prototype 3D microwave imaging system with a fixed antenna array. Inversion results from synthetic data show that the method works well with a fixed antenna array, and that the resolution of the reconstructed images can reach a quarter wavelength even in the presence of a strongly inhomogeneous background medium and antenna coupling. A time-reversal method is introduced as a pre-processing step to reduce the region of interest (ROI) in the inversion. In addition, a multi-domain DTA-BIM method is proposed to handle discontinuous inversion regions. With these improvements, the size of the inversion domain and the computational cost can be significantly reduced, making the DTA-BIM method more feasible for rapid-response applications.
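The Born iterative loop at the heart of DTA-BIM can be illustrated with a toy 1D scalar example: alternately solve the forward problem with the current contrast estimate, then re-linearize and solve a regularized least-squares problem for the contrast. The grid, 1D Green's function, weak contrast, measurement geometry, and plain Tikhonov regularization below are illustrative assumptions, not the dissertation's 3D tensor-approximation implementation.

```python
import numpy as np

# Toy 1D Born Iterative Method (BIM) sketch; all parameters are illustrative.
k0 = 2 * np.pi                      # background wavenumber (wavelength = 1)
N, dx = 40, 0.05
x = np.arange(N) * dx

def green(xa, xb):
    """1D homogeneous-background Helmholtz Green's function."""
    return (1j / (2 * k0)) * np.exp(1j * k0 * np.abs(xa[:, None] - xb[None, :]))

Gd = green(x, x)                    # domain-to-domain interactions
Gm = green(x, x)                    # toy setup: measurement points on the grid

chi_true = np.zeros(N)
chi_true[15:25] = 0.1               # weak scatterer (Born regime)
incs = [np.exp(1j * k0 * x), np.exp(-1j * k0 * x)]   # two plane-wave sources

def total_field(chi, e_inc):
    """Solve E = E_inc + Gd diag(k0^2 dx chi) E for the total field."""
    return np.linalg.solve(np.eye(N) - Gd * (k0**2 * dx * chi), e_inc)

# Synthetic "measured" scattered-field data for each source.
data = [Gm @ (k0**2 * dx * chi_true * total_field(chi_true, e)) for e in incs]

chi = np.zeros(N)
for _ in range(5):                  # BIM outer iterations
    # Linearized operator: rows scaled by the field under the current estimate.
    A = np.vstack([Gm * (k0**2 * dx * total_field(chi, e)) for e in incs])
    b = np.concatenate(data)
    # Tikhonov-regularized normal equations for a real contrast update.
    chi = np.linalg.solve((A.conj().T @ A).real + 1e-6 * np.eye(N),
                          (A.conj().T @ b).real)

err = np.linalg.norm(chi - chi_true) / np.linalg.norm(chi_true)
```

Because the contrast is weak, the first linearized solve already lands near the true profile, and the outer iterations refine the multiple-scattering correction.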
Item (Embargo): Assessing Astatine-211 SPECT Image Quality in Relevant Organs (2024). Wong, Ye Wan Evan.

Theranostics is an evolving approach in nuclear medicine that aims to combine diagnostic and therapeutic value in a single delivery agent. With increased interest in alpha-emitting radionuclides for their short effective range and high linear energy transfer, astatine-211 is a promising radionuclide for therapy applications. Previous work at Duke University investigated the ability to image and quantify astatine-211 and found it challenging because of attenuation and collimation effects on the photons desired for imaging, as well as contributions from undesirable high-energy emissions. This research builds on that work to investigate the image quality achievable with single-photon planar and SPECT imaging of astatine-211 in organs that could be at risk for radiation damage, based on the distribution of the molecule carrying the At-211. The investigation comprises several experiments that establish the potential of astatine-211 as an imaging radionuclide and the factors needed for image reconstruction, including the appropriate linear attenuation coefficient and the k-factor for dual-energy scatter correction. Two phantom designs were created. One provided a baseline image-quality comparison of four radionuclides (F-18 for PET; Tc-99m, Lu-177, and At-211 for single-photon planar imaging). The other represented the salivary glands in the head and the kidneys and tumors in the torso. Imaging the same realistically large phantom showed that only fluorine-18 PET successfully images 1 cm targets, that technetium-99m and lutetium-177 are comparable in imaging 2 cm and 3 cm targets, and that astatine-211 can image only 3 cm targets. This work successfully simulated the salivary glands and kidneys in an anthropomorphic phantom.
The results indicated that a k-factor of 1.1 is reasonable for scatter correction when imaging astatine-211, effectively reducing downscattered gamma rays in the images. Additionally, the results confirm that the medium-energy general-purpose collimator is better suited than the low-energy high-resolution collimator for imaging astatine-211, with improved SNR and comparable noise quality.
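Dual-energy-window scatter correction of the kind evaluated here subtracts a scaled estimate of scatter, taken from a lower-energy window, from the photopeak window. A minimal sketch follows; only the k = 1.1 scale factor comes from the abstract, while the window definitions and example counts are hypothetical.

```python
import numpy as np

def dew_correct(photopeak, scatter_window, k=1.1):
    """Dual-energy-window (DEW) scatter correction: subtract k times the
    counts in an adjacent lower-energy window from the photopeak window.
    Clip at zero because corrected counts cannot be negative."""
    return np.clip(photopeak - k * scatter_window, 0.0, None)

# Hypothetical per-pixel counts: a low-scatter and a high-scatter pixel.
corrected = dew_correct(np.array([100.0, 80.0]), np.array([30.0, 90.0]))
# -> [67.0, 0.0]
```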
Item (Open Access): Automated Microscopy and High Throughput Image Analysis in Arabidopsis and Drosophila (2009). Mace, Daniel L.

Development of a single cell into an adult organism is accomplished through an elaborate and complex cascade of spatiotemporal gene expression. While methods exist for capturing spatiotemporal expression patterns (in situ hybridization, reporter constructs, fluorescent tags), these methods have been highly laborious, and results are frequently assessed by subjective qualitative comparisons. To address these issues, methods must be developed for automating image capture, as well as for normalizing and quantifying the resulting data. In this thesis, I design computational approaches for high-throughput image analysis that can be grouped into three main areas. First, I develop methods for capturing high-resolution images from high-throughput platforms. In addition to the informatics aspect of this problem, I devise a novel multiscale probabilistic model that identifies and segments objects in an automated fashion. Second, high-resolution images must be registered and normalized to a common frame of reference for cross-image comparison. To address this, I implement approaches for image registration using statistical shape models and non-rigid registration. Lastly, I validate the spatial expression data obtained from microscopy images against other known spatial expression methods, and develop methods for comparing spatial expression patterns and calculating the significance of their differences. I demonstrate these methods on two model developmental organisms: Arabidopsis and Drosophila.
Item (Open Access): CLARITY and PACT-based imaging of adult zebrafish and mouse for whole-animal analysis of infections (Dis Model Mech, 2015-12). Cronan, Mark R; Rosenberg, Allison F; Oehlers, Stefan H; Saelens, Joseph W; Sisk, Dana M; Jurcic Smith, Kristen L; Lee, Sunhee; Tobin, David M.

Visualization of infection and the associated host response has been challenging in adult vertebrates. Owing to their transparency, zebrafish larvae have been used to directly observe infection in vivo; however, such larvae have not yet developed a functional adaptive immune system. Cells involved in adaptive immunity mature later and have therefore been difficult to access optically in intact animals. Thus, the study of many aspects of vertebrate infection requires dissection of adult organs or ex vivo isolation of immune cells. Recently, CLARITY and PACT (passive clarity technique) methodologies have enabled clearing and direct visualization of dissected organs. Here, we show that these techniques can be applied to image host-pathogen interactions directly in whole animals. CLARITY and PACT-based clearing of whole adult zebrafish and Mycobacterium tuberculosis-infected mouse lungs enables imaging of mycobacterial granulomas deep within tissue to a depth of more than 1 mm. Using established transgenic lines, we were able to image normal and pathogenic structures and their surrounding host context at high resolution. We identified the three-dimensional organization of granuloma-associated angiogenesis, an important feature of mycobacterial infection, and characterized the induction of the cytokine tumor necrosis factor (TNF) within the granuloma using an established fluorescent reporter line. We observed heterogeneity in TNF induction within granuloma macrophages, consistent with an evolving view of the tuberculous granuloma as a non-uniform, heterogeneous structure.
Broad application of this technique will enable new understanding of host-pathogen interactions in situ.

Item (Open Access): Coded Aperture X-ray Tomographic Imaging with Energy Sensitive Detectors (2017). Hassan, Mehadi.

Coherent scatter imaging techniques have experienced a renaissance in the past two decades, driven by advances in detector technology and computational imaging techniques. X-ray diffraction requires precise knowledge of object location and is time consuming; transforming diffractometry into a practical imaging technique requires spatially resolving the sample in three dimensions and speeding up the measurement process. Introducing a coded aperture into a conventional X-ray diffraction system provides 3D localization of the scatterer as well as drastic reductions in acquisition time owing to the ability to perform multiplexed measurements. This thesis presents two strategies involving coded apertures to address these challenges of X-ray coherent scatter measurement.
The first technique places the coded aperture between the source and the object to structure the incident illumination. A single-pixel detector captures temporally modulated coherent scatter data from the object as it travels through the illumination. From these measurements, 2D spatial and 1D spectral information is recovered at each point within a planar slice of the object. Compared to previous techniques, this approach reduces the overall scan time by one to two orders of magnitude.
The second measurement technique demonstrates snapshot coherent scatter tomography. A planar slice of an object is illuminated by a fan beam, and the scatter data are modulated by a coded aperture between the object and the detector. The spatially modulated data are captured with a linear array of energy-sensitive detectors, and the recovered data show that the system can image objects that are 13 mm in range and 2 mm in cross-range with a fractional momentum transfer resolution of 15%. The technique also allows a 100x speedup compared to pencil beam systems using the same components.
Continuing with the theme of snapshot tomography with energy-sensitive detectors, I study the impact of detector properties such as detection area, choice of energies, and energy resolution for pencil and fan beam coded aperture coherent scatter systems. I simulate various detector geometries and determine that energy resolution has the largest impact for pencil beam geometries, while detector area has the largest impact for fan beam geometries. These results can be used to design detectors that can help implement pencil and/or fan beam coded aperture coherent scatter systems in medical and security applications.
Item (Open Access): Coherence in Dynamic Metasurface Aperture Microwave Imaging Systems (2020). Diebold, Aaron Vincent.

Microwave imaging systems often utilize electrically large arrays for remote characterization of spatial and spectral content. Image reconstruction involves computational processing, the success of which depends on adequate spatial and temporal sampling at the array as dictated by the nature of the radiation and sensing strategy. Effective design of an imaging system, consisting of its hardware and algorithmic components, thus requires detailed understanding of the nature of the involved fields and their impact on the processing capabilities. One can sufficiently characterize many of the properties of such fields and systems via coherence, which quantifies their interferometric capacity in terms of statistical correlations and point spread functions. Recent work on dynamic metasurface apertures (DMAs) for microwave imaging has demonstrated the utility of these structures in active, coherent systems, supplanting traditional array architectures with lower-cost designs capable of powerful wavefront shaping. In contrast to arrays of distinct antennas, DMAs are composed of electrically large arrays of dynamically tunable, radiating metamaterial elements to realize diverse gain patterns that can function as an encoding mechanism for coded aperture image reconstruction. With the appropriate formulation, well-established concepts from the realm of Fourier optics can be transposed from conventional array systems to DMA architectures. This dissertation furthers that task by modeling DMA imaging systems involving partial coherence and incoherence, and demonstrating new reconstruction algorithms in these contexts. Such an undertaking provides convenient opportunities for examining the origins of coherence in computational and holographic imaging systems.
This insight is necessary for the development of modern approaches that seek to smoothly integrate hardware and computational elements for powerful, efficient, and innovative imaging tasks.
Accommodating different degrees of coherence in a microwave imaging system can substantially relax demands on hardware components including phase stability and synchronization, or on algorithmic procedures such as calibration. In addition, incoherent operation can yield improved images free of coherent diffraction artifacts and speckle. Finally, an understanding of coherence can unlock fundamentally distinct applications, such as passive imaging and imaging with ambient illumination, that can benefit from the flexibility of a DMA system but have yet to be demonstrated under such an architecture. To this end, I formulate a unified framework for analyzing and processing array imaging systems in the Fourier domain, and demonstrate a method for transforming a DMA-based system to an equivalent array representation under active, coherent operation. I then investigate the role of spatial coherence in a two-dimensional holographic imaging system, and experimentally demonstrate some results using a collection of DMAs. I conduct a similar investigation in the context of single-pixel ghost imaging, which allows coherent and incoherent imaging directly from intensity measurements, thereby relaxing hardware phase requirements. I then formulate a model for partially coherent fields in a DMA imaging system, and provide several reconstruction strategies and example simulations. I finally restrict this general case to passive, spectral imaging of spatially and temporally incoherent sources, and experimentally demonstrate a compressive imaging strategy in this context.
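The single-pixel ghost imaging mentioned above reconstructs an image by correlating a sequence of known illumination patterns with total ("bucket") intensity measurements, so no phase information is needed. A toy sketch under assumed random patterns follows; the scene, pattern count, and image size are illustrative, not from the dissertation's experiments.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = np.zeros((16, 16))
scene[4:12, 6:10] = 1.0                        # hidden object (toy scene)

M = 4000
patterns = rng.random((M, 16, 16))             # known random illumination
bucket = (patterns * scene).sum(axis=(1, 2))   # single-pixel measurements

# Correlate fluctuations of the bucket signal with the patterns: pixels
# inside the object co-vary with the total intensity, background does not.
recon = ((bucket - bucket.mean())[:, None, None] * patterns).mean(axis=0)
corr = np.corrcoef(recon.ravel(), scene.ravel())[0, 1]
```

The reconstruction sharpens as the number of patterns M grows, since the per-pixel correlation noise averages down as roughly 1/sqrt(M).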
Item (Open Access): Collimation of a D-D Neutron Generator for Clinical Implementation of Neutron Stimulated Emission Computed Tomography: a Monte Carlo Study (2016). Fong, Grant.

This work investigates collimator designs for a deuterium-deuterium (DD) neutron generator, toward an inexpensive and compact neutron imaging system that can be implemented in a hospital. The envisioned application is a spectroscopic imaging technique called neutron stimulated emission computed tomography (NSECT).
Previous NSECT studies have been performed using a Van de Graaff accelerator at the Triangle Universities Nuclear Laboratory (TUNL) at Duke University. This facility has enabled invaluable research into the development of NSECT. To transition the current imaging method into a clinically feasible system, a high-intensity fast-neutron source that can produce collimated beams is needed. The DD neutron generator from Adelphi Technologies Inc. is being explored as a candidate to provide the uncollimated neutrons. This DD generator is a compact source that produces 2.5 MeV fast neutrons at intensities of 10^12 n/s (into 4π). This neutron energy is sufficient to excite most isotopes of interest in the body, with the exception of carbon and oxygen. However, a special collimator is needed to shape the 4π neutron emission into a narrow beam. This work describes the development and evaluation of a series of collimator designs to collimate the DD generator into narrow beams suitable for NSECT imaging.
A neutron collimator made of high-density polyethylene (HDPE) and lead was modeled and simulated using the GEANT4 toolkit. The collimator was designed as a 52 x 52 x 52 cm^3 HDPE block coupled with 1 cm of lead shielding. Non-tapering (cylindrical) and tapering (conical) opening designs were modeled into the collimator to permit passage of neutrons. The shape, size, and geometry of the aperture were varied to assess their effects on the collimated neutron beam. The parameters varied were: inlet diameter (1-5 cm), outlet diameter (1-5 cm), aperture diameter (0.5-1.5 cm), and aperture placement (13-39 cm). For each combination of collimator parameters, the spatial and energy distributions of neutrons and gammas were tracked and analyzed to determine three performance parameters: neutron beam-width, primary neutron flux, and output quality. To evaluate these parameters, the simulated neutron beams were then regenerated for an NSECT breast scan involving a realistic breast lesion implanted in an anthropomorphic female phantom.
This work indicates the potential for collimating and shielding a DD neutron generator for use in a clinical NSECT system. The proposed collimator designs produced a well-collimated neutron beam that can be used for NSECT breast imaging. The aperture diameter showed a strong correlation with the beam-width; the collimated neutron beam-width was about 10% larger than the physical aperture diameter. In addition, a collimator opening consisting of a tapering inlet and a cylindrical outlet allowed greater neutron throughput than a simple cylindrical opening. The tapering inlet design can allow additional neutron throughput when the neck is placed farther from the source. On the other hand, the tapering designs also decrease output quality (i.e., increase stray neutrons outside the primary collimated beam). All collimators are cataloged in terms of beam-width, neutron flux, and output quality. For a particular NSECT application, an optimal choice should be based on the collimator specifications listed in this work.
Item (Open Access): Correlated Polarity Noise Reduction: Development, Analysis, and Application of a Novel Noise Reduction Paradigm (2013). Wells, Jered R.

Image noise is a pervasive problem in medical imaging. It is endemic to all imaging modalities and especially familiar in those that employ ionizing radiation. Statistical uncertainty is a major limiting factor in reducing ionizing radiation dose: patient exposure must be minimized, but high image quality must also be maintained to retain the clinical utility of medical images. One way to achieve radiation dose reduction is through image post-processing with noise reduction algorithms. By acquiring images at lower-than-normal exposure and then applying algorithmic noise reduction, it is possible to restore image noise to near-normal levels. However, many denoising algorithms degrade other image quality components in the process.
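The exposure-noise tradeoff described above follows from Poisson counting statistics: relative quantum noise scales as the inverse square root of exposure, so halving the dose raises relative noise by about sqrt(2). A quick numerical check (the detector count levels are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
full = rng.poisson(10_000, 200_000)    # counts per pixel at full exposure
half = rng.poisson(5_000, 200_000)     # counts per pixel at half exposure

rel_full = full.std() / full.mean()    # relative (quantum) noise
rel_half = half.std() / half.mean()
ratio = rel_half / rel_full            # ~ sqrt(2) ~ 1.414
```

A denoising algorithm applied to the half-exposure image therefore needs to cut noise variance roughly in half to restore the full-exposure noise level.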
In this dissertation, a new noise reduction algorithm is investigated: Correlated Polarity Noise Reduction (CPNR). CPNR is a novel noise reduction technique that uses a statistical approach to reduce noise variance while maintaining excellent resolution and a "normal" noise appearance. In this work, the algorithm is developed in detail, with several methods introduced for improving polarity estimation accuracy and maintaining the normality of the residual noise intensity distribution. Several image quality characteristics are assessed in the development of this new algorithm, including its effects on residual noise texture, residual noise magnitude distribution, resolution, and nonlinear distortion. An in-depth review of current linear methods for medical imaging system resolution analysis is presented, along with several newly discovered improvements to existing techniques. This is followed by a new paradigm for quantifying the frequency response and distortion properties of nonlinear algorithms. Finally, the new CPNR algorithm is applied to computed tomography (CT) to assess its efficacy as a dose reduction tool in 3D imaging.
It was found that the CPNR algorithm can be used to reduce x-ray dose in projection radiography by a factor of at least two without objectionable degradation of image resolution. This is comparable to other nonlinear image denoising algorithms such as the bilateral filter and wavelet denoising. However, CPNR achieves this level of dose reduction with few edge effects and negligible nonlinear distortion of the anatomical signal, as evidenced by the newly developed nonlinear assessment paradigm. In application to multi-detector CT, XCAT simulations showed that CPNR can reduce noise variance by 40% with minimal blurring of anatomical structures under a filtered back-projection reconstruction paradigm. When an apodization filter was applied, only 33% noise variance reduction was achieved, but the edge-saving qualities were largely retained. In application to cone-beam CT for daily patient positioning in radiation therapy, up to 49% noise variance reduction was achieved with as little as a 1% reduction in the task transfer function measured from reconstructed data at the cutoff frequency.
This work concludes that the CPNR paradigm shows promise as a viable noise reduction tool that can maintain current standards of clinical image quality at almost half the normal radiation exposure. The algorithm has favorable resolution and nonlinear distortion properties as measured with a newly developed set of metrics for nonlinear algorithm resolution and distortion assessment. Simulation studies and the initial application of CPNR to cone-beam CT data reveal that CPNR may be used to reduce CT dose by 40%-49% with minimal degradation of image resolution.
Item (Open Access): Design and Implementation of an Institution-Wide Patient-Specific Radiation Dose Monitoring Program for Computed Tomography, Digital Radiography, and Nuclear Medicine (2011). Christianson, Olav.

Recently, there has been renewed interest in decreasing radiation dose to patients from diagnostic imaging procedures. So far, efforts to decrease radiation dose have focused on the amount of radiation delivered by typical techniques and fail to capture the variation in radiation dose between patients. Despite the feasibility of estimating patient-specific radiation doses and the potential for this practice to aid in protocol optimization, it is not currently standard procedure for hospitals to monitor radiation dose for all patients. To address this shortcoming, we have developed an institution-wide patient-specific radiation dose monitoring program for computed tomography, digital radiography, and nuclear medicine.
Item (Open Access): Development and application of enhanced, high-resolution physiological features in XCAT phantoms for use in virtual clinical trials (2023). Sauer, Thomas.

Virtual imaging trials (VITs) are a growing part of medical imaging research. VITs are a powerful alternative to the current gold standard for determining or verifying the efficacy of new healthcare technology: the clinical trial. Prohibitively high expenses, multi-site standardization of protocols, and risks to the health of the trial's patient population are all challenges associated with clinical trials; conversely, these challenges highlight the strengths of virtualization, particularly for evaluating medical imaging technologies. Virtual imaging requires a combination of virtual subjects, physics-based imaging simulation platforms, and virtual pathologies. Currently, most computational phantom organs and pathologies are segmented or generated from clinical CT images. With this approach, most computational organs and pathologies are necessarily static, comprising only a single instantaneous representation. Further, this static-anatomy, static-pathology approach does not address the underlying physiological constraints acting on the organs or their pathologies, making some imaging exams (e.g., perfusion, coronary angiography) difficult to simulate robustly. It also does not provide a clear path toward including anatomical and physiological (functional) detail at sub-CT resolution. This project aims to integrate high-resolution, dynamic features into computational human models, focusing primarily on an advanced model known as XCAT. These additions include healthy and progressive-disease anatomy and physiology, micron-resolution coronary artery lesions, and an array of pathologies.
In particular, we focus on the physiology needed for CT perfusion studies, dynamic lesions, and coronary artery disease (CAD), and on means to integrate each of these features into XCAT via custom software. A further outcome is to demonstrate the utility of each of these advances with representative simulated imaging. Chapter 1 presents a method that uses clinical information and physiological theory to develop a mathematical model producing the liver vasculature within a given XCAT. The model can be used to simulate contrast perfusion by taking into account contrast position and concentration at an initial time t and the spatial extent of the contrast in the liver vasculature at subsequent times. The mathematical method enables the simulation of hepatic contrast perfusion in the presence or absence of abnormalities (e.g., focal or diffuse disease) for arbitrary imaging protocols, contrast concentrations, and virtual patient body habitus. The vessel-growing method further generalizes to vascular models of other organs, as it is based on a parameterized approach that allows flexible repurposing of the developed tool. Chapter 2 presents a method for using cardiac plaque histology and morphology data acquired at micron-level resolution to generate novel plaques informed by a large, original patient cohort. A methodology for curating and validating anatomical and physiological realism was applied to the synthesized plaques. This method was integrated with the XCAT heart and coronary artery models to allow simulated imaging of a wide variety of coronary artery plaques in varied orientations and with unique material distribution and composition. Generation of 200 unique plaques has been optimized to take as little as 5 seconds with GPU acceleration. This work enables future studies to optimize current and emerging CT imaging methods used to detect, diagnose, and treat coronary artery disease.
Chapter 3 focuses on small-scale modeling of the internal structure of the bones of the chest. The internal structure of bone appears as a diffuse but recognizable texture under medical imaging and corresponds to a complex physical structure tuned to the bone's purpose (e.g., weight-bearing, protective structure). The project aimed to address the limitations of prior texture-based modeling by creating mathematically based fine bone structures. The method was used to generate realistic bone structures, defined as polygon meshes, with accurate morphological and topological detail for 45 chest bones for each XCAT phantom. This new method defines the spatial extent of the complementary bone-marrow structures that are the root cause of the characteristic image texture, and provides a transition from image-informed characteristic power-law textures to a ground-truth model with exact morphology, which we additionally paired with the DukeSim CT simulator and XCAT phantoms to produce radiography and CT images with physics-based bone textures. This work enables CT acquisition parameter optimization studies that can inform clinical image assessment of osteoporosis and bone fractures. Chapter 4 proposes a new model of lesion morphology and insertion, created with the intent to be informed and validated by, rather than constrained by, imaging data. It additionally incorporates biological data, intended to provide dynamic computational lung lesion models for use in CT simulation applications. Each chapter includes a section presenting an example application of the respective tools in virtual medical imaging. Chapter 5 concludes this work with a brief summary and is followed by Appendices A-D, which are organized by topic and contain a visual demonstration of the work in a series of high-resolution, full-page images.
Item (Open Access): Electromagnetic Forward Modeling and Inversion for Geophysical Exploration (2015). Jia, Yu.

Electromagnetic forward modeling and inversion methods have extensive applications in geophysical exploration, and large-scale controlled-source electromagnetic methods have recently drawn considerable attention. However, obtaining a rigorous and efficient forward solver for this large-scale three-dimensional problem is difficult, since it typically requires solving for a large number of unknowns from a system of equations describing the complicated scattering behavior of electromagnetic waves within inhomogeneous media. Developing an efficient inversion solver is also very challenging because of the nonlinear, non-unique, and ill-posed nature of the problem.
In the first part of this dissertation, a fast three-dimensional nonlinear reconstruction method is proposed for the controlled-source electromagnetic method. The borehole-to-surface and airborne electromagnetic survey methods are investigated using synthetic data. In this work, it is assumed that there is only electric contrast between the inhomogeneous object and the layered background medium, because the electric contrast is much more dominant than the magnetic contrast in most earth formations. Therefore, only the electric field integral equation (EFIE) needs to be solved. While the forward scattering problem is solved by the stabilized biconjugate-gradient FFT (BCGS-FFT) method for rigorous and efficient modeling, the Born iterative method together with a multiplicative regularization technique is used in the inversion through frequency hopping. In the inversion, to speed up the expensive computation of the sensitivity matrix relating every receiver station to every unknown element, a fast field evaluation (FFE) technique is proposed that uses the symmetry property of the layered medium Green's function (LMGF) combined with a database strategy. The conjugate-gradient method is then applied to minimize the cost function after each iteration. Owing to the benefits of 3D FFT acceleration, the proposed FFE technique, and the recursive matrix method combined with an interpolation technique to evaluate the LMGF, the developed inversion solver is highly efficient and requires very little computation time and memory. Numerical experiments for both 3D forward modeling and conductivity inversion are presented to show the accuracy and efficiency of the method.
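The conjugate-gradient minimization step used in such inversions can be sketched on regularized normal equations. The plain Tikhonov term below is a simple stand-in for the dissertation's multiplicative regularization, and the operator is a small dense matrix rather than a BCGS-FFT-accelerated one; sizes and values are illustrative.

```python
import numpy as np

def cg_normal(A, b, lam=1e-3, iters=100):
    """Conjugate gradient on the regularized normal equations
    (A^H A + lam I) x = A^H b, i.e. minimize ||A x - b||^2 + lam ||x||^2."""
    x = np.zeros(A.shape[1], dtype=complex)
    normal = lambda v: A.conj().T @ (A @ v) + lam * v
    r = A.conj().T @ b - normal(x)      # initial residual
    p = r.copy()
    rs = np.vdot(r, r).real
    for _ in range(iters):
        Ap = normal(p)
        alpha = rs / np.vdot(p, Ap).real
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r).real
        if rs_new < 1e-24:              # converged
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Demo on a small random complex system (illustrative sizes).
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 10)) + 1j * rng.standard_normal((20, 10))
b = rng.standard_normal(20) + 1j * rng.standard_normal(20)
x_cg = cg_normal(A, b)
x_direct = np.linalg.solve(A.conj().T @ A + 1e-3 * np.eye(10), A.conj().T @ b)
```

The appeal of CG here is that it needs only matrix-vector products with A and its adjoint, which is exactly what FFT-accelerated operators provide without ever forming the sensitivity matrix explicitly.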
Some recent research on artificial nanoparticles has demonstrated improved performance in geophysical imaging using magnetodielectric materials with enhanced electric and magnetic contrasts. This offers a promising perspective for future geophysical exploration: infusing well-designed artificial magnetodielectric materials into subsurface objects so that significantly improved imaging can be achieved. As preparation for this promising application, the second part of the dissertation presents an efficient method to solve the scattering problem of magnetodielectric materials with general anisotropy embedded in layered media. In this work, the volume integral equation is chosen as the target equation, since it solves for fields in inhomogeneous media with fewer unknowns than the finite element method. However, for complicated materials such as magnetodielectric materials with general anisotropy, this is a very challenging task, because it requires simultaneously solving the electric field integral equation (EFIE) and the magnetic field integral equation (MFIE). Beyond that, the numerous evaluations of the layered medium Green's function (LMGF) for the stratified background formation add to the difficulty and complexity of the problem. To my knowledge, no fast solver exists for a similar problem. In this dissertation, a fast forward modeling method is developed to solve this challenging problem using the mixed-order stabilized biconjugate-gradient fast Fourier transform (mixed-order BCGS-FFT) method. Several numerical examples validate the accuracy and efficiency of the proposed method.
Besides the two topics mentioned above, a one-dimensional inversion method is presented in the third part to determine the tilted triaxial conductivity tensor in a dipping layered formation using triaxial induction measurements. The tilted triaxial conductivity tensor is described by three conductivity components and three Euler angles. To my knowledge, owing to the highly nonlinear and ill-posed nature of the inverse problem, this study is the first to investigate the subject. There are six principal coordinate systems that can give the same conductivity tensor. Permutation is performed to eliminate the ambiguity of inversion results caused by the ambiguity of the principal coordinate system; the three new Euler angles after permutation for each layer can be found by solving a nonlinear equation. Numerical experiments are conducted on synthetic models to study the feasibility of determining a triaxially anisotropic conductivity tensor from triaxial induction data. This project was accomplished during my internship at the Houston Formation Evaluation Integration Center (HFE) at Schlumberger, a world-leading oilfield service company.
Item Open Access Evaluation of a Dedicated SPECT-CT Mammotomography System for Quantitative Hybrid Breast Imaging (2010) Cutler, Spencer Johnson
The overall goal of this dissertation is to optimize and evaluate the performance of the single photon emission computed tomography (SPECT) subsystem of a dedicated three-dimensional (3D) dual-modality breast imaging system for enhanced semi-automated, quantitative clinical imaging. This novel hybrid imaging system combines functional or molecular information obtained with a SPECT subsystem with high-resolution anatomical imaging obtained with a low-dose x-ray computed tomography (CT) subsystem. In this new breast imaging paradigm, coined "mammotomography," the subject is imaged lying prone while the individual subsystems sweep three-dimensionally about her uncompressed, pendant breast, providing patient comfort compared to traditional compression-based imaging modalities along with high-fidelity, information-rich images for the clinician.
System evaluation includes a direct comparison between dedicated 3D SPECT and dedicated 2D scintimammography imaging using the same high-performance, semiconductor gamma camera. Owing to the greater positioning flexibility of the SPECT system gantry, statistically significantly (p<0.05) more lesions, and smaller lesions, were detected with dedicated breast SPECT than with compressed-breast scintimammography under a wide range of measurement conditions. The importance of good energy resolution for uncompressed SPECT breast imaging was also investigated. Results clearly illustrate both visual and quantitative differences between the various energy windows, with windows slightly wider than the system resolution yielding the best image contrast and quality.
An observer-based contrast-detail study was performed to evaluate the limits of object detectability under various imaging conditions. The smallest object detail was observed using a 45-degree tilted trajectory acquisition. The complex 3D projected sine wave acquisition, however, had the most consistent combined intra- and inter-observer results, making it potentially the best approach for consistent clinical imaging.
Automatic radius-of-rotation (ROR) contouring is implemented using a dual-layer light curtain design, ensuring that an arbitrarily shaped breast is within ~1 cm of the camera face, but no closer than 0.5 cm, at every projection angle of a scan. Autocontouring enables simplified routine scanning using complex 3D trajectories and yields improved image quality. Absolute quantification capabilities are also integrated into the SPECT system, allowing the calculation of in vivo total lesion activity. Initial feasibility studies in controlled low-noise experiments show promising results, with total activity agreement within 10% of the dose calibrator values.
The SPECT system is integrated with a CT scanner for added diagnostic power. Initial human subject studies demonstrate the clinical potential of the hybrid SPECT-CT breast imaging system. The reconstructed SPECT-CT images illustrate the power of fusing functional SPECT information to localize lesions not easily seen in the anatomical CT images. Enhanced quantitative 3D SPECT-CT breast imaging, now with the ability to dynamically contour any sized breast, has high potential to improve detection, diagnosis, and characterization of breast cancer in upcoming larger-scale clinical testing.
Item Open Access Functional Spectral Domain Optical Coherence Tomography Imaging (2009) Bower, Bradley A.
Spectral Domain Optical Coherence Tomography (SDOCT) is a high-speed, high-resolution imaging modality capable of structural and functional resolution of tissue microstructure. SDOCT fills a niche between histology and ultrasound imaging, providing non-contact, non-invasive measurement of backscattering amplitude and phase from a sample. Because the tissue is translucent, ophthalmic imaging is an ideal application space for SDOCT.
Structural imaging of the retina has provided new insights into ophthalmic disease. The phase component of SDOCT images, though, remains largely underexplored. While Doppler SDOCT has been explored in a research setting, it has yet to catch on in the clinic. Other functional exploitations of the phase are possible and necessary to expand the utility of SDOCT. Spectral Domain Phase Microscopy (SDPM) is an extension of SDOCT capable of resolving sub-wavelength displacements within a focal volume. Applying sub-wavelength displacement measurement to ophthalmic imaging could provide a new method for imaging optophysiology.
This body of work encompasses both hardware and software design and development for implementation of SDOCT. Structural imaging was demonstrated in both the lab and the clinic. Coarse phase changes associated with Doppler flow frequency shifts were recorded, and a study was conducted to validate the Doppler measurements. Fine phase changes were explored through SDPM applications. Preliminary optophysiology data were acquired to study the potential of sub-wavelength measurements in the retina. To remove the complexity associated with in vivo human retinal imaging, a first-principles approach using isolated nerve samples was applied, using standard SDPM and a depth-encoded technique for measuring conduction velocity.
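The coarse (Doppler) phase processing mentioned above rests on the standard Doppler OCT relation v = λ₀·Δφ / (4π·n·T), which converts the phase shift Δφ between successive A-scans into axial flow speed. The sketch below illustrates this relation; the wavelength, refractive index, and line rate are illustrative values, not the system parameters used in this work.

```python
import numpy as np

def doppler_velocity(delta_phi, lam0=841e-9, n=1.38, t_line=1 / 17000):
    """Axial flow speed (m/s) from the phase shift between successive A-scans.

    v = lambda0 * delta_phi / (4 * pi * n * T), the standard Doppler OCT
    relation; lam0 (center wavelength), n (tissue index), and t_line
    (A-scan period) here are illustrative, assumed values.
    """
    return lam0 * delta_phi / (4 * np.pi * n * t_line)

# A pi/2 phase shift at these settings corresponds to roughly 1.3 mm/s.
v = doppler_velocity(np.pi / 2)
```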
Results from amplitude as well as both coarse and fine phase processing are presented. In vivo optophysiology using SDPM is a promising avenue for exploration, and projects furthering or extending this body of work are discussed.
Item Open Access High Resolution X-ray Microscopy Using Digital Subtraction Angiography for Small Animal Functional Imaging (2008-08-04) Lin, Ming De
Research using mice and rats has gained interest because they are robust test beds for clinical drug development and are used to elucidate disease etiologies. Blood vessel visualization and blood flow measurements are important anatomic and physiologic indicators of drug/disease stimuli or genetic modification. Cardio-pulmonary blood flow is an important indicator of heart and lung performance. Small animal functional imaging provides a way to measure physiologic changes minimally invasively while the animal is alive, thereby allowing multiple measurements in the same animal with little physiologic perturbation. Current methods of measuring cardio-pulmonary blood flow suffer from some or all of these limitations: they produce only relative measurements, are limited to global (whole-animal or whole-organ) regions, do not provide vasculature visualization, are limited to a few or even single samples per animal, cannot measure acute changes, or are very invasive or require animal sacrifice. The focus of this work was the development of a small animal x-ray imaging system capable of minimally invasive, real-time, high-resolution vascular visualization and cardio-pulmonary blood flow measurements in the live animal. The x-ray technique used was digital subtraction angiography (DSA). This technique is particularly appealing because it is easy to use, can capture rapid physiological changes on a beat-to-beat basis, and provides anatomical and functional vasculature information. This DSA system is special because it was designed and implemented from the ground up to be optimized for small animal imaging and functional measurements.
This system can perform: 1) minimally invasive in vivo blood flow measurements; 2) multiple measurements in the same animal in rapid succession (every 30 seconds, a substantial improvement over singular measurements that require minutes to acquire by the Fick method); 3) very high resolution (up to 46 µm) vascular visualization; 4) quantitative blood flow measurements in absolute units (mL/min instead of arbitrary units or velocity) and relative blood volume dynamics from discrete ROIs; and 5) relative mean transit time dynamics on a pixel-by-pixel basis (100 µm x 100 µm). The end results are 1) anatomical vessel time-course images showing the contrast agent flowing through the vasculature, 2) blood flow information of the live rat cardio-pulmonary system in absolute units and relative blood volume information at discrete ROIs of enhanced blood vessels, and 3) colormaps of relative transit time dynamics. This small-animal-optimized imaging system can be a useful tool in future studies to measure drug- or disease-modulated blood flow dynamics in the small animal.
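One common way to extract the relative transit-time dynamics described above is the first-moment estimate of mean transit time from a baseline-subtracted contrast time-density curve. The sketch below (a generic illustration, not the system's actual processing pipeline) applies it to a synthetic bolus curve.

```python
import numpy as np

def mean_transit_time(t, c):
    """First-moment estimate of mean transit time from a time-density curve.

    t : sample times (s); c : baseline-subtracted contrast enhancement
    for one ROI or pixel.  MTT = sum(t * c) / sum(c).
    """
    c = np.clip(c, 0, None)          # ignore negative noise excursions
    return np.sum(t * c) / np.sum(c)

# Gamma-variate-like bolus passage (illustrative numbers): the first
# moment of t^2 * exp(-t) is 3 s.
t = np.linspace(0, 10, 200)
c = t**2 * np.exp(-t / 1.0)
mtt = mean_transit_time(t, c)
```

Applied per pixel, the same computation yields the transit-time colormaps mentioned in the abstract.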
Item Open Access Hybrid Reference Datasets for Quantitative Computed Tomography Characterization and Conformance (2018) Robins, Marthony
X-ray computed tomography (CT) imaging is the second most commonly used clinical imaging modality, with an estimated 82 million clinical exams performed in the U.S. in 2016. Despite an average annual decline of 2% since a high of 85.3 million in 2011, it is highly sought for visualizing a host of medical conditions because of its clinical advantages in providing high spatial resolution and fast imaging time. Although limited, the high resolution of CT imaging enables small objects such as lesions to be resolved in good detail. Partly because of their size, and because CT is noise- and resolution-limited, the effects of system resolution and lesion characterization processes (i.e., segmentation and CAD algorithms) are challenging to quantify. For this reason, there is a significant need to account for system resolution and algorithm impact on lesion characterization in a quantitatively reproducible manner.
Cancer is the second leading cause of death in the U.S. A fundamental aspect of cancer diagnosis, treatment and management is effective use of medical imaging. In recent years, cancer screening has received significant attention. In fact, results of screening suggest that early cancer detection can result in higher survival rates.
Beyond visual inspection, extraction of quantitative lesion features could provide additional diagnostic and treatment benefits. Assessing the quantitative capabilities of CT systems is complicated by technical factors such as noise, blur, and motion artifacts. As such, traditional modulation transfer function (MTF) methods are insufficient for characterizing system resolution, especially when non-linearities are introduced by iterative reconstruction. These factors contribute a major component of lesion characterization uncertainty in that they limit access to lesion ground truth. That said, there is a wealth of quantifiable information that can be garnered from clinical images, since lesion size, morphology, and potentially texture (i.e., internal heterogeneities) are important quantitative biomarkers for effective clinical decision-making. Considering this, the imaging physics community is steadily progressing toward a quantitative paradigm in CT.
As such, the purpose of this doctoral project was to develop, validate, and disseminate a new phantom, image databases, and assessment tools appropriate for ground-truth lesion characterization in the context of modern CT systems. The project developed lesion assessment methods in the framework of two distinct modes: (a) anthropomorphic phantoms and (b) clinical images.
As an alternative to the MTF, the first aspect of this project aimed to validate the task transfer function (TTF), a quantitative measure of system resolution. The TTF was used to accurately model the low-contrast signal transfer properties of a non-linear imaging system. This study assessed the TTF as a CT system resolution model for lesion blur in the context of reconstruction algorithm, dose, and lesion size, shape, and contrast. TTF-blurred simulated lesions were compared with CT images of corresponding physical lesions using a series of comparative tools. Amidst the presence of confounding factors, in a four-alternative forced-choice (4AFC) reader study, readers detected simulated lesions at a rate of 37.9±3.1%, only a little better than random guessing (25%). The visual appearance, edge blur, size, and shape of simulated lesions were similar to those of the physical lesions, suggesting that the 3D TTF modeled the low-contrast signal transfer properties of this non-linear CT system reasonably well.
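Because the TTF is defined in spatial frequency, blurring a simulated lesion by it amounts to multiplying the lesion's spectrum by the TTF. The sketch below illustrates the idea with a radially symmetric Gaussian-shaped TTF; the Gaussian form, its 50% cutoff, and the pixel size are assumptions for illustration, not the measured TTFs used in the study.

```python
import numpy as np

def ttf_blur(lesion, f50, pixel_mm=0.5):
    """Blur a 2D lesion model by a radially symmetric TTF.

    The TTF here is modeled as a Gaussian in spatial frequency whose 50%
    cutoff is f50 (cycles/mm) -- an illustrative stand-in for a measured TTF.
    """
    ny, nx = lesion.shape
    fy = np.fft.fftfreq(ny, d=pixel_mm)
    fx = np.fft.fftfreq(nx, d=pixel_mm)
    fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    ttf = np.exp(-np.log(2) * (fr / f50) ** 2)   # TTF(f50) = 0.5, TTF(0) = 1
    return np.real(np.fft.ifft2(np.fft.fft2(lesion) * ttf))

# Sharp disc "lesion": blurring preserves total signal but softens the edge.
y, x = np.mgrid[:64, :64]
disc = ((x - 32) ** 2 + (y - 32) ** 2 <= 8 ** 2).astype(float)
blurred = ttf_blur(disc, f50=0.3)
```

The same multiplication in 3D frequency space is the essence of applying a volumetric TTF model to a CAD lesion before insertion.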
In the second study, the TTF was implemented in lesion simulation and virtual insertion. A TTF-based lesion simulation framework was developed to model lesion morphology in terms of size and shape. The Lungman phantom (Kyoto, Japan) was used in the implementation of two new virtual lesion insertion methods (i.e., projection-domain and image-domain virtual lesion insertion). A third method, previously developed by the U.S. Food and Drug Administration (FDA), was used as a benchmark. Using these TTF-based insertion methods, TTF-blurred computer-aided design (CAD) lesions were virtually inserted into phantom CT projections or reconstructed data. This study compared a series of virtually inserted, TTF-blurred CAD lesions against a corresponding series of CT-blurred physical lesions. Pair-wise comparisons yielded a 3% difference in volume and a 5% difference in shape between physical and simulated lesions, indicating that the proposed lesion modeling framework can produce quantitatively realistic surrogates for real lesions.
Third, a systematic assessment of bias and variability in lesion texture feature measurement was performed across a series of clinical image acquisition settings and reconstruction algorithms. A series of CT images was simulated using three computational phantoms with anatomically informed texture, representing four in-plane pixel sizes, three slice thicknesses, three dose levels, and 33 noise and resolution models characteristic of five commercial scanners (GE LightSpeed VCT, GE Discovery 750 HD, GE Revolution, Siemens Definition Flash, and Siemens Force). Twenty-one statistical texture features were calculated and compared between the ground-truth phantom (i.e., pre-imaging) and its corresponding post-imaging simulations. Each texture feature was also measured with four volume-of-interest (VOI) sizes. Across VOI sizes and imaging settings, the percent relative difference between the post-imaging simulations and the ground truth ranged over [-97%, 1230%], and the coefficient of variation ranged over [1.12%, 71.79%]. This dynamic range indicates that image acquisition and reconstruction conditions (i.e., in-plane pixel size, slice thickness, dose level, and reconstruction kernel), as well as segmentation, can lead to significant bias and variability in texture feature measurements, underscoring the need to appropriately account for system and segmentation effects on lesion characterization.
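The two summary statistics used above are standard: percent relative difference quantifies bias against ground truth, and the coefficient of variation quantifies variability across repeated measurements. A minimal sketch, with hypothetical feature values:

```python
import numpy as np

def percent_relative_difference(measured, truth):
    """Bias of a feature measurement relative to ground truth, in percent."""
    return 100.0 * (measured - truth) / truth

def coefficient_of_variation(values):
    """Variability of repeated feature measurements, in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical values of one texture feature from four simulated imaging
# conditions, against an assumed ground-truth value of 50.
measurements = [52.0, 47.5, 51.0, 49.5]
prd = [percent_relative_difference(m, 50.0) for m in measurements]
cov = coefficient_of_variation(measurements)
```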
Building on the TTF validation, virtual lesion insertion, and texture feature assessment studies, the next three studies focused on developing and validating hybrid datasets (i.e., simulated lesions inserted into phantom and patient CT images). The fourth study was intended to determine whether real and simulated lesions are interchangeable in the context of patient CT images. Virtual lesions were generated based on real patient lesions extracted from the Reference Image Database to Evaluate Therapy Response (RIDER) CT dataset and were compared with their real counterparts on the basis of lesion size. Thirty pathologically confirmed malignancies from thoracic patient CT images were modeled. Simulated lesions were re-inserted into the original CT images using the image-domain insertion program. Four readers performed volume measurements using three commercial segmentation tools, and the tools' volume estimation performance was compared between real lesions in actual patient CT images and simulated lesions virtually inserted into the same patient images (i.e., hybrid datasets). Direct volume comparison showed consistent trends between real and simulated lesions across all segmentation algorithms, readers, and lesion shapes, with an overall 5% volumetric difference between real and simulated lesions. These results support the potential of virtual lesions as surrogates for real clinical lesions, not just in appearance but also quantitatively.
In a fifth study, a new approach was designed to evaluate the potential for hybrid datasets with a priori known lesion volume to serve as a replacement for clinical images in the context of segmentation algorithm compliance with the Quantitative Imaging Biomarkers Alliance (QIBA) Profile. This study occurred in two phases, a phantom phase and a clinical phase. The phantom phase utilized the Lungman phantom, and the clinical phase utilized the same base patient images from the RIDER dataset. In the phantom, hybrid datasets were generated by virtually inserting 16 simulated lesions corresponding to physical lesions into the phantom images using the projection- and image-domain techniques from the second study (Methods 1 and 2), along with the FDA technique (Method 3). For the clinical data, only Method 2 was used to insert simulated lesions corresponding to real lesions. Across 16 participating groups, none of the virtual insertion methods was equivalent to the physical phantom based on a 5% bias margin of tolerance; however, the magnitude of the difference was small (across all groups, 2.4%, 5.4%, and 2% for Methods 1, 2, and 3, respectively).
The final aspect of this project aimed at developing hybrid datasets for use by the wider imaging community. These were composed of anthropomorphic lung and liver lesions embedded in thoracic and abdominal images, as a means to help assess lesion characterization directly from patient images. Each dataset was outfitted with a full complement of descriptive information for each inserted lesion, including lesion size, shape, texture, and contrast.
In conclusion, this dissertation provides the scientific community with a new phantom, analysis techniques, modeling tools, and datasets that can aid in appropriately evaluating lesion characterization in modern CT systems. The new techniques proposed by this dissertation offer a more clinically relevant approach to assessing the impact of CT systems and segmentation/CADx algorithms on lesion characterization.
Item Open Access Imaging Polarization in Budding Yeast. (Methods Mol Biol, 2016) McClure, Allison W; Wu, Chi-Fang; Johnson, Sam A; Lew, Daniel J
We describe methods for live-cell imaging of yeast cells that we have exploited to image yeast polarity establishment. As a rare event occurring on a fast time-scale, imaging polarization involves a trade-off between spatiotemporal resolution and long-term imaging without excessive phototoxicity. By synchronizing cells in a way that increases resistance to photodamage, we discovered unexpected aspects of polarization including transient intermediates with more than one polarity cluster, oscillatory clustering of polarity factors, and mobile "wandering" polarity sites.
Item Open Access Metamaterials for Computational Imaging (2013) Hunt, John
Metamaterials extend the design space, flexibility, and control of optical material systems and so yield fundamentally new computational imaging systems. A computational imaging system relies heavily on the design of measurement modes, and metamaterials provide a great deal of control over the generation of the measurement modes of an aperture. On the other side of the coin, computational imaging uses the data that can be measured by an imaging system, which may be limited, in an optimal way, thereby producing the best possible image within the physical constraints of the system. The synergy of these two technologies, metamaterials and computational imaging, allows for entirely novel imaging systems. These contributions are realized in the concept of a frequency-diverse metamaterial imaging system that will be presented in this thesis. This 'metaimager' uses the same electromagnetic flexibility that metamaterials have shown in many other contexts to construct an imaging aperture suitable for single-pixel operation that can measure arbitrary measurement modes, constrained only by the size of the aperture and the resonant elements.
It has no lenses or moving parts, has a small form factor, and is low-cost.
In this thesis we present an overview of work done by the author in the area of metamaterial imaging systems. We first discuss novel transformation-optical lenses enabled by metamaterials, which demonstrate the electromagnetic flexibility of metamaterials. We then introduce the theory of computational and compressed imaging in the language of Fourier optics, and derive the forward model needed to apply computational imaging to the metaimager system. We describe the details of the metamaterials used to construct the metaimager and their application to metamaterial antennas. The experimental tools needed to characterize the metaimager, including far-field and near-field antenna characterization, are described. We then describe the design, operation, and characterization of a one-dimensional metaimager capable of collecting two-dimensional images, and then a two-dimensional metaimager capable of collecting three-dimensional images. The imaging results for the one-dimensional metaimager are presented, including two-dimensional (azimuth and range) images of point scatterers and video-rate imaging. The imaging results for the two-dimensional metaimager are presented, including analysis of the system's resolution, signal-to-noise sensitivity, and acquisition rate, imaging of human targets, and integration of optical and structured-light sensors. Finally, we discuss explorations into methods of tuning metamaterial radiators that could significantly increase the capabilities of such a metaimaging system, and describe several systems designed to integrate tuning into metamaterial imaging systems.
Item Open Access Monte Carlo Simulation of Effective Dose in Fluoroscopy and Computed Tomography Procedures (2018) Fenoli, Jeffrey
The overarching goal of this project was to investigate organ dose assessment and variability using Monte Carlo methods in two areas of medical imaging: fluoroscopy and computed tomography. Namely, these studies were intended to (1) provide estimates of the dose incurred by fluoroscopy-guided spinal injection procedures, and (2) investigate dose heterogeneity in chest and abdominopelvic computed tomography (CT) scans for a range of patient sizes. Fluoroscopy dose estimates were calculated using GEANT4 by recreating the patient procedures of six lumbar-sacral epidural injections. Computed tomography dose was estimated with a GPU-accelerated Monte Carlo package, MCGPU. Both simulations used a library of digital human (XCAT) phantoms previously derived from real-patient CT scans. The fluoroscopy simulations suggest that smaller patients have a higher effective dose per dose-area product, and the overall results agreed with previous experimental measurements. Variation of absorbed dose within a given organ was calculated for chest and abdominopelvic CT protocols. It was found that the 95th percentile dose can be over 11 times the mean organ dose in pediatric and adult phantoms. Furthermore, the computed organ dose can change by a factor of 8 depending on whether it is calculated using only the voxels within the beam or all the voxels within the organ; this change was larger for organs with smaller fractions within the beam. Several models of tissue-weighted dose were also investigated, following methods similar to those used for effective dose. These tissue-weighted dose calculations can vary by up to 13% depending on whether the out-of-field dose is included. We also found that the results were not significantly affected by the pitch or the number of projections per rotation.
The results show that dose-volume details may be hidden by average dose estimates, and they suggest the need to consider intra-organ dose heterogeneity in CT dose calculations, particularly for sensitive tissues (e.g., bone marrow) and populations (e.g., pediatric patients).
Item Open Access Novel Methods of Optical Data Analysis to Assess Radiation Responses in the Tumor Microenvironment (2013) Fontanella, Andrew Nicholas
The vascular contribution to tumor radiation response is controversial but may have profound clinical implications. This is especially true of a new class of radiation therapies that employ spatial fractionation techniques: high radiation doses delivered in a spatially modulated pattern across the tumor. Window chamber tumor models may prove useful in investigating vascular parameters because they permit non-invasive, serial measurements of living tumors. At present, however, there are no automated and accurate algorithms capable of quantitatively analyzing window chamber data.
Here we attempt to address these two problems through (1) the generation of novel optical data processing techniques for the quantification of vascular structural and functional parameters, and (2) the application of these methods to the study of vascular radiation effects in window chamber models.
Results presented here demonstrate the versatility and functionality of the data processing methods we have developed. In the first part of Aim 1, we developed a vessel segmentation algorithm specifically designed for tumor vessels, which challenge existing algorithms because of their highly branching, tortuous structure. This provides useful information on vascular structural parameters. In the second part of Aim 1, we demonstrate a complementary vascular functional analysis algorithm, which generates quantitative maps of flow speed and direction. We prove the versatility of this method by applying it to a number of different studies, including hemodynamic analysis in the dorsal window chamber, in the pulmonary window, and after neural electro-stimulation. Both the structural and functional techniques are shown to generate accurate and unbiased vascular information, and their automated nature allows rapid and efficient processing of large data sets. Both are validated against existing techniques.
The application of these methods to the study of vascular radiation effects produced invaluable quantitative data which suggest startling tumor adaptations to radiation injury. Window chamber grown tumors were treated with either widefield, microbeam, or mock irradiation. After microbeam treatment, we observed a profound angiogenic effect within the radiation field, and no signs of vascular disruption. Upregulation of HIF-1, primarily in the tumor rim, suggested that this response may have been due to bystander mechanisms initiated by oxidative stress. This HIF-1 response may have also initiated an epithelial-mesenchymal transition in the cells of the tumor rim, as post-treatment observation revealed evidence of tumor cell mobilization and migration away from the primary tumor to form secondary satellite clusters. These data indicate the possibility of significant detrimental effects after microbeam treatment facilitated through a HIF-1 response.
Item Open Access Optimization of X-Ray Diffraction Imaging of Medical Specimens by Monte Carlo (2019) Japzon, Matthew
Our research group has previously described the development and testing of a coherent-scatter spectral imaging system for identification of cancer using surrogate phantoms, formalin-fixed pathology tissues, and, more recently, surgically resected breast tumors. Here we present the implementation of a Monte Carlo simulation tool for optimization of the imaging system.
MC-GPU, a GPU-enabled Monte Carlo software package, was modified and used to simulate x-ray diffraction experiments for combinations of x-ray spectrum (tungsten and molybdenum anode), kV (15-150), filtration (material and thickness), and phantom geometry and material (normal, adipose, fibroglandular, and cancerous breast tissue). For each combination, a simulated measurement of contrast-to-noise ratio (CNR), signal strength, and object detectability was assessed.
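The CNR figure of merit used above is conventionally the difference between the mean signal in a lesion region of interest and a background region, normalized by the background noise. A minimal sketch with hypothetical, illustrative count values (not simulation output from this work):

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: ROI mean difference over background noise."""
    s = np.asarray(signal_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return abs(s.mean() - b.mean()) / b.std(ddof=1)

# Illustrative diffraction-pattern counts in a lesion ROI vs. background.
signal = [110, 104, 108, 112, 106]
background = [100, 98, 102, 99, 101]
value = cnr(signal, background)
```

Computed for each spectrum/filtration/phantom combination, this single number lets the candidate acquisition settings be ranked directly.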
Examination of the Monte Carlo simulations revealed optimal spectrum characterization strategies that exploit spectral and filter characteristics to increase material identification probabilities via momentum transfer measurement. Detectability increased with molybdenum energy spectra, and higher CNR was associated with better pathological assessment and detection of cancer.
This work demonstrates the utility of Monte Carlo methods and MC-GPU in optimizing coherent scatter imaging systems, and can provide insight into the design of such systems for classification of breast tissue types.