Browsing by Author "Gehm, Michael Eric"
Item Open Access: An Information-Theoretic Analysis of X-Ray Architectures for Anomaly Detection (2018), Coccarelli, David Scott

X-ray scanning equipment currently establishes a first line of defense in the aviation security space. The efficacy of these scanners is crucial to preventing the harmful use of threatening objects and materials. In this dissertation, I introduce a principled approach to the analysis of these systems by exploring performance limits of system architectures and modalities. Moreover, I validate the use of simulation as a design tool against experimental data, and I extend the use of simulation to create high-fidelity realizations of real-world system measurements.
Conventional performance analysis of detection systems confounds the effects of the system architecture (sources, detectors, system geometry, etc.) with the effects of the detection algorithm. We disentangle the performance of the system hardware from that of the detection algorithm so as to focus on analyzing the system hardware alone. To accomplish this, we introduce an information-theoretic approach based on a metric derived from the Cauchy-Schwarz mutual information, analogous to the channel-capacity concept from communications engineering; a toy version of the metric is sketched below. We develop and utilize a framework that can produce thousands of system simulations representative of a notional baggage ensemble. These simulations and prior knowledge of the virtual baggage allow us to analyze the system as it relays information pertinent to a detection task.
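To make the metric concrete, here is a minimal kernel-based estimator of the Cauchy-Schwarz quadratic mutual information between simulated measurements and object labels. This is my illustration, not the dissertation's code; the Gaussian-kernel estimator form, the kernel widths, and the toy data are all assumptions.

```python
# A minimal sketch (my illustration, not the dissertation's code) of a
# kernel estimator for Cauchy-Schwarz quadratic mutual information (QMI)
# between simulated measurements X and object labels Y.
import numpy as np

def gaussian_gram(Z, sigma):
    """Gram matrix of a Gaussian kernel over the rows of Z."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (Z @ Z.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def cs_qmi(X, Y, sigma_x=1.0, sigma_y=1.0):
    """log(V_J * V_M / V_C^2); approximately 0 when X and Y are independent."""
    K = gaussian_gram(X, sigma_x)                 # measurement kernel
    L = gaussian_gram(Y, sigma_y)                 # label kernel
    v_joint = np.mean(K * L)                      # joint potential
    v_marg = np.mean(K) * np.mean(L)              # marginal potential
    v_cross = np.mean(K.mean(axis=1) * L.mean(axis=1))  # cross potential
    return np.log(v_joint * v_marg / v_cross ** 2)

# Toy ensemble: 500 simulated bags, 32-dim measurements, binary threat labels.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=(500, 1)).astype(float)
meas = rng.normal(size=(500, 32)) + 0.5 * labels  # label shifts the mean
print(cs_qmi(meas, labels))
```

A larger value means the simulated hardware passes more task-relevant information to any downstream detector, which is the sense in which such a metric can rank candidate architectures independently of the detection algorithm.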
In this dissertation, I discuss the application of this information-theoretic approach to variations of X-ray transmission architectures as well as to novel screening systems based on X-ray scatter and phase. The results show how effective use of this metric can inform design decisions for X-ray systems. Moreover, I introduce a database of experimentally acquired X-ray data, both as a means to validate the simulation approach and as a resource for further reconstruction and classification investigations. Next, I show the implementation of improvements to the ensemble representation in the information-theoretic material model. Finally, I extend the simulation tool toward high-fidelity representation of real-world deployed systems.
Item Open Access: Charged Particle Optics Simulations and Optimizations for Miniature Mass and Energy Spectrometers (2021), DiDona, Shane

Computer simulation and modeling is a powerful tool for the analysis of physical systems; in this work we consider the use of computer modeling and optimization to improve the focusing properties of a variety of charged-particle optics systems. The combined use of several software packages and custom computer code allows for modeling electrostatic and magnetostatic fields and the trajectories of particles through them (a sketch of such trajectory tracing appears below). Several applications of this functionality are shown. The pieces of code presented are the starting point of an integrated charged-particle simulation and optimization tool with a focus on optimization. The applications shown are mass spectrographs and electron energy spectrographs. Simulation allowed additional information about the systems in question to be determined.

In this work, coded apertures are shown to be compatible with sector instruments, though architectural challenges exist. Next, simulation allowed for the discovery of a new class of mass spectrograph that addresses these challenges and is compatible with computational sensing, allowing for both high resolution and high sensitivity, with a 1.8x improvement in spot size. Finally, a portion of this new spectrograph was adapted for use as an electron energy spectrograph, with a resolution 9.1x and an energy bandwidth 2.1x that of traditional instruments.
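As a rough illustration of the trajectory-tracing core referenced above: semi-implicit Euler integration of the Lorentz force through user-supplied static fields. The field models, step size, and the uniform-sector example are my assumptions; the dissertation couples such tracing to dedicated field solvers and an optimizer.

```python
# A rough sketch (field models, step size, and the uniform-sector example
# are assumptions) of a trajectory-tracing core: semi-implicit Euler
# integration of the Lorentz force through static E and B fields.
import numpy as np

Q_OVER_M = -1.759e11  # electron charge-to-mass ratio, C/kg

def trace(r0, v0, E, B, dt=1e-11, steps=2000):
    """Integrate dv/dt = (q/m) * (E + v x B); returns the sampled path."""
    r = np.asarray(r0, dtype=float)
    v = np.asarray(v0, dtype=float)
    path = [r.copy()]
    for _ in range(steps):
        a = Q_OVER_M * (E(r) + np.cross(v, B(r)))
        v = v + a * dt          # update velocity first (semi-implicit)
        r = r + v * dt          # then advance position with the new velocity
        path.append(r.copy())
    return np.array(path)

# A uniform magnetic sector: electrons follow circular arcs whose radius
# depends on momentum, which is the basis of magnetic-sector dispersion.
path = trace(r0=[0.0, 0.0, 0.0], v0=[1.0e6, 0.0, 0.0],
             E=lambda r: np.zeros(3),
             B=lambda r: np.array([0.0, 0.0, 1.0e-3]))
print(path[-1])                 # end of the arc after 20 ns
```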
Item Open Access: Coded Memory Effect Imaging (2019), Li, Xiaohan

Imaging through scattering media is of great interest and has important applications in many fields such as biological, medical, and astronomical imaging. However, performing imaging with scattered light is challenging because of the complex and random modulation imposed upon the light by the scatterer. In this dissertation, I introduce a new technique to reconstruct an object hidden behind a scattering medium. I demonstrate experimentally that the temporal or spectral information of the object can be recovered from a single measurement. More importantly, the reconstruction process does not rely on prior knowledge of, or access to, the scattering medium.
Conventional imaging methods such as wavefront shaping and adaptive optics have been developed to address the challenge of imaging through scattering media. However, those approaches usually require access to the scattering medium, which is impractical in many imaging scenarios. Meanwhile, memory-effect (ME) imaging is capable of recovering the object from a single random speckle measurement without access to the scattering medium, provided the size of the object is within the memory-effect range of the scatterer. However, memory-effect imaging techniques have been limited to static, grayscale imaging, so a tremendous amount of the information carried by the light is wasted. To overcome this disadvantage, I introduce coding and compressed sensing to realize snapshot imaging through scattering media.
In this dissertation I present the technical details of this single-shot, non-invasive method for imaging through scattering media; a sketch of the core memory-effect reconstruction appears below. Optical implementations and experimental demonstrations of various cases, such as a dynamic object behind a dynamic or static diffuser and multi-spectral (discrete and continuous) objects, are provided in separate chapters. The advantages of our technique, such as high performance in SNR-limited environments and high spectral resolution (compared with state-of-the-art methods), are also presented along with the experimental demonstrations.
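The sketch below illustrates the memory-effect core in its simplest static, grayscale form (the coded, compressive extensions are the dissertation's contribution and are not reproduced here). All names and parameters are mine, and real reconstructions typically need more robust phase retrieval than the bare error-reduction loop shown.

```python
# A toy sketch of the memory-effect recipe (my illustration, not the thesis
# code): within the ME range, the autocorrelation of one speckle frame
# approximates the object's autocorrelation, and phase retrieval recovers
# the object from it.
import numpy as np

def speckle_autocorrelation(frame):
    """Autocorrelation of a speckle image via the Wiener-Khinchin theorem."""
    f = frame - frame.mean()                      # remove the DC background
    ac = np.fft.ifft2(np.abs(np.fft.fft2(f)) ** 2).real
    return np.fft.fftshift(ac)

def retrieve_object(ac, iters=500, seed=1):
    """Basic error-reduction phase retrieval from an autocorrelation.

    Practical reconstructions usually add HIO updates, a support
    constraint, and random restarts; this is the bare iteration.
    """
    spectrum = np.fft.fft2(np.fft.ifftshift(ac)).real
    mag = np.sqrt(np.clip(spectrum, 0.0, None))   # Fourier magnitude |F|
    g = np.random.default_rng(seed).random(ac.shape)
    for _ in range(iters):
        G = np.fft.fft2(g)
        G = mag * np.exp(1j * np.angle(G))        # impose measured magnitude
        g = np.fft.ifft2(G).real
        g[g < 0] = 0.0                            # object-domain nonnegativity
    return g

# Toy usage: a small object blurred by a random stand-in "speckle" PSF.
rng = np.random.default_rng(2)
obj = np.zeros((64, 64)); obj[30:34, 28:36] = 1.0
psf = rng.random((64, 64))
frame = np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(psf)).real
est = retrieve_object(speckle_autocorrelation(frame))
```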
Item Open Access: Compressive Sensing in Transmission Electron Microscopy (2018), Stevens, Andrew

Electron microscopy is one of the most powerful tools available in observational science. Magnifications of 10,000,000x have been achieved with picometer precision; at this level of magnification, individual atoms are visible. This is possible because the wavelength of electrons is much smaller than that of visible light, which also means that the highly focused electron beams used to perform imaging contain significantly more energy than visible light. The beam energy is high enough that it can cause radiation damage to metal specimens. Reducing radiation dose while maintaining image quality has been a central research topic in electron microscopy for several decades. Without the ability to reduce the dose, most organic and biological specimens cannot be imaged at atomic resolution. Fundamental processes in materials science and biology arise at the atomic level; thus, understanding these processes is possible only if observational tools can capture information with atomic resolution.
The primary objective of this research is to develop new techniques for low-dose and high-resolution imaging in (scanning) transmission electron microscopy (S/TEM). This is achieved through the development of new machine-learning-based compressive sensing algorithms and microscope hardware for acquiring a subset of the pixels in an image. Compressive sensing allows recovery of a signal from significantly fewer measurements than the total signal size (under certain conditions); a minimal sketch of such a recovery appears below. The research objective is attained by demonstrating the application of compressive sensing to S/TEM in several simulations and real microscope experiments. The data types considered are images, videos, multispectral images, tomograms, and 4-dimensional ptychographic data. In the simulations, image-quality and error metrics are defined to verify that reducing dose is possible with only a small impact on image quality. In the microscope experiments, images are acquired with and without compressive sensing so that a qualitative verification can be performed.
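As a minimal stand-in for the recovery step, the sketch below reconstructs a toy image from a 25% random pixel scan by iterative soft thresholding (ISTA) in the 2-D DCT domain. The sampling rate, sparsity basis, and parameters are my assumptions; the dissertation develops machine-learning-based reconstructions rather than this fixed-basis one.

```python
# Fixed-basis ISTA stand-in for compressive STEM recovery (sampling rate,
# basis, and parameters are illustrative assumptions, not the thesis method).
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
truth = np.zeros((64, 64))
truth[16:48, 16:48] = 1.0                       # toy "specimen"
mask = rng.random(truth.shape) < 0.25           # scan only 25% of the pixels
y = mask * truth                                # subsampled, low-dose scan

c = np.zeros_like(truth)                        # 2-D DCT coefficients
step, lam = 1.0, 0.01
for _ in range(300):
    resid = mask * (y - idctn(c, norm='ortho')) # misfit on scanned pixels
    c = c + step * dctn(resid, norm='ortho')    # gradient step
    c = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)  # soft threshold
recon = idctn(c, norm='ortho')
print(float(np.mean((recon - truth) ** 2)))     # reconstruction error
```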
Compressive sensing is shown to be an effective approach to reducing dose in S/TEM without sacrificing image quality. Moreover, it offers increased acquisition speed and reduced data size. Research leading to this dissertation has been published in 25 articles or conference papers, and 5 patent applications have been submitted. The published papers include contributions to machine learning, physics, chemistry, and materials science. The newly developed pixel-sampling hardware is being productized so that other microscopists can use compressive sensing in their experiments. In the future, scientific imaging devices (e.g., scanning transmission X-ray microscopy (STXM) and secondary-ion mass spectrometry (SIMS)) could also benefit from the techniques presented in this dissertation.
Item Open Access: High-rate Modeling and Computing for Optical Systems---Gigapixel Image Formation and X-ray Imaging Physics (2017), Gong, Qian

With the rapid development of computational sensing technologies, the volume of available sensing data increases daily as sensor systems grow in scale; this is sometimes referred to as the "data deluge". Many physical-computing applications must expend great effort to meet the challenges of this environment, which has prompted a need for rapid and efficient processing of massive datasets. Fortunately, many of the algorithms used in these applications can be decomposed and partially or fully cast into a parallel computing framework. This dissertation discusses three sensing models---gigapixel image formation, X-ray transmission, and X-ray scattering---and proposes methods to formulate each task as a scalable, distributed problem adapted to the massively parallel architecture of Graphics Processing Units (GPUs).
For gigapixel images, this dissertation presents a scalable and flexible image-formation pipeline based on the MapReduce framework; a sketch of the map and reduce steps appears below. The implementation was developed to operate on the AWARE multiscale cameras, which consist of microcamera arrays imaging through a shared hemispherical objective. The microcamera fields of view slightly overlap, and the arrays are capable of generating high-resolution, high-dynamic-range panoramic images and videos. The proposed GPU implementation takes advantage of prior knowledge of the alignment between microcameras and exploits the multiscale nature of the AWARE image acquisition, enabling the rapid composition of panoramas ranging from display-scale views to gigapixel-scale full-resolution images. On a desktop computer, a 1.6-gigapixel color panorama captured by the AWARE-10 can be delivered in less than a minute, while 720p and 1080p panoramas can be stitched at video frame rate.
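The following is a schematic, CPU-only sketch of the map and reduce steps, assuming precalibrated integer offsets and simple averaging in overlap regions; the names and toy geometry are mine. The actual pipeline runs on GPUs and handles warping, blending, and the multiscale acquisition.

```python
# Schematic map/reduce composition of overlapping microcamera tiles
# (integer offsets and averaging are illustrative assumptions).
import numpy as np

def map_tile(tile, offset):
    """Map step: emit panorama slices for one microcamera frame."""
    r, c = offset
    h, w = tile.shape
    return (slice(r, r + h), slice(c, c + w)), tile

def reduce_tiles(shape, mapped):
    """Reduce step: accumulate overlapping tiles, normalize by coverage."""
    acc = np.zeros(shape)
    cov = np.zeros(shape)
    for (rows, cols), tile in mapped:
        acc[rows, cols] += tile
        cov[rows, cols] += 1.0
    return acc / np.maximum(cov, 1.0)   # average where tiles overlap

tiles = [np.full((100, 100), 1.0), np.full((100, 100), 2.0)]
offsets = [(0, 0), (0, 80)]             # precalibrated, 20-pixel overlap
pano = reduce_tiles((100, 180), map(map_tile, tiles, offsets))
print(pano[0, 75:85])                   # values blend across the seam
```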
We next present a pipeline that rapidly simulates X-ray transmission imaging via ray tracing on the GPU. This pipeline was initially designed for statistical analysis of X-ray threat detection in the context of aviation baggage inspection, but it could also be applied to the modeling of other non-destructive X-ray detection systems. X-ray transmission measurements are simulated based on Beer's law; the measurement model is sketched below. The highly optimized OptiX API is used to implement the ray tracing, greatly speeding code execution. Moreover, we use a hierarchical representation structure to determine the interaction path length of rays traversing heterogeneous media described by layered polygons. The validity of the pipeline was verified by comparing simulated data with experimental data collected using a Delrin phantom and a laboratory X-ray imaging system. On a single computer, 400 transmission projections (125 by 125 pixels per frame) of a bag packed with hundreds of everyday objects can be generated by our simulation tool in an hour, compared to the thousands of hours needed by CPU-based Monte Carlo approaches. Further speed improvements have been achieved by moving the computation to a cloud-based GPU computing platform.
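In minimal form, the Beer's-law measurement model is: the ray tracer supplies per-material chord lengths for each detector ray, and transmission follows from the attenuation coefficients. The material values and energies below are invented for illustration.

```python
# Beer's-law transmission for one ray (illustrative material values;
# the real pipeline gets per-material chord lengths from OptiX ray
# tracing through layered-polygon objects).
import numpy as np

def transmission(path_lengths, mu, I0=1.0):
    """I = I0 * exp(-sum_i mu_i * l_i), vectorized over energy bins.

    path_lengths: (n_materials,) chord lengths along one ray, in cm
    mu:           (n_materials, n_energies) attenuation coefficients, 1/cm
    returns:      (n_energies,) transmitted intensity at the detector
    """
    return I0 * np.exp(-(path_lengths @ mu))

lengths = np.array([2.0, 0.5])                  # e.g., 2 cm plastic, 0.5 cm Al
mu = np.array([[0.25, 0.18],                    # plastic at two energies
               [0.55, 0.35]])                   # aluminum at two energies
print(transmission(lengths, mu))
```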
Finally, we describe a high-throughput simulation algorithm for X-ray scatter based on a deterministic but sampled approach built upon the previously described GPU-centric ray-tracing framework. Compared to Monte Carlo and Monte Carlo-based hybrid methods, our approach is orders of magnitude faster while, in contrast to purely deterministic methods, allowing scatter radiation to be modeled in arbitrary imaging configurations and to any order; a first-order toy version is sketched below. Qualitative and semi-quantitative validation was conducted by comparing data obtained with the simulation pipeline against a laboratory X-ray scattering system. As for execution speed, on a single computer a scatter image (125 by 125 pixels) of a simple 3D shape collected in a pencil-beam geometry can be generated in minutes, while a realistic bag model collected in a fan-beam geometry takes about an hour.
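The toy sketch below gives the flavor of a deterministic, sampled scatter sum at first order only; it assumes a homogeneous object, a pencil beam, a single detector point, and an invented phase function, whereas the dissertation's algorithm generalizes to arbitrary geometries and scatter of any order on the GPU.

```python
# Toy first-order deterministic scatter sum (homogeneous object, single
# detector point, invented Thomson-like phase function; all assumptions).
import numpy as np

def first_order_scatter(sites, mu, source, det):
    """Deterministic sum over sampled interaction sites."""
    total = 0.0
    for p in sites:
        d_in = np.linalg.norm(p - source)        # source -> scatter site
        d_out = np.linalg.norm(det - p)          # scatter site -> detector
        u_in = (p - source) / d_in
        u_out = (det - p) / d_out
        cos_t = float(u_in @ u_out)              # scattering-angle cosine
        atten = np.exp(-mu * (d_in + d_out))     # Beer's-law attenuation
        phase = 1.0 + cos_t ** 2                 # toy phase function
        total += atten * phase / d_out ** 2      # solid-angle falloff
    return total / len(sites)

rng = np.random.default_rng(0)
sites = rng.uniform(-1.0, 1.0, size=(1000, 3))   # sampled sites in a cube
print(first_order_scatter(sites, mu=0.2,
                          source=np.array([0.0, 0.0, -10.0]),
                          det=np.array([3.0, 0.0, 10.0])))
```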