Publications by authors named "Eric Clarkson"

This paper is the second of two papers exploring tomographic reconstruction from a space platform. A simplified model of short-wave infrared emissions in the atmosphere is given. Simulations were performed to test the effectiveness of the reconstructions as a function of signal amplitude, frequency, signal-to-noise ratio, number of iterations run, and other parameters.

For imaging instruments in space looking toward the Earth, a variety of nuisance signals can interfere with certain imaging tasks, such as reflections from clouds, reflections from the ground, and emissions from the OH-airglow layer. One method for separating these signals is to perform tomographic reconstructions from the collected data. A persistent challenge for this method is resolution along the altitude axis, and several approaches for improving it are discussed.

There are two types of uncertainty in image reconstructions from list-mode data: statistical and deterministic. One source of statistical uncertainty is the finite number of detected particles, whose attributes are sampled from a probability distribution on the attribute space. A deterministic source of uncertainty is the effect that null functions of the imaging operator have on reconstructed pixel or voxel values.

Purpose: The goal is to provide a sufficient condition for the invertibility of a multi-energy (ME) X-ray transform. The energy-dependent X-ray attenuation profiles can be represented by a set of coefficients using the Alvarez-Macovski (AM) method. An ME X-ray transform is a mapping from the AM coefficients to noise-free energy-weighted measurements.
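
For background, the Alvarez-Macovski representation expresses the energy-dependent attenuation coefficient as a weighted sum of basis functions; a schematic form of the decomposition and of one energy-weighted measurement is

\[
\mu(E,\mathbf{r}) \approx a_1(\mathbf{r})\, f_{\mathrm{PE}}(E) + a_2(\mathbf{r})\, f_{\mathrm{KN}}(E),
\qquad
I_k = \int S_k(E)\, \exp\!\left[-\int_{L} \mu(E,\mathbf{r})\, d\ell\right] dE,
\]

where f_PE(E) captures the photoelectric energy dependence, f_KN(E) is the Klein-Nishina (Compton) function, a_1 and a_2 are the AM coefficients integrated along the ray L, and S_k(E) is the energy weighting of the k-th measurement. The paper's exact notation and number of basis functions may differ.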

An upper bound is derived for a figure of merit that quantifies the error in reconstructed pixel or voxel values induced by the presence of null functions for any list-mode system. It is shown that this upper bound decreases as the region in attribute space occupied by the allowable attribute vectors expands. This upper bound allows quantification of the reduction in this error when this type of expansion is implemented.

The potential to perform attenuation and scatter compensation (ASC) in single-photon emission computed tomography (SPECT) imaging without a separate transmission scan is highly significant. In this context, attenuation in SPECT is primarily due to Compton scattering, where the probability of Compton scatter is proportional to the attenuation coefficient of the tissue and the energy of the scattered photon is related to the scattering angle. Based on this premise, we investigated whether SPECT scattered-photon data acquired in list-mode (LM) format, including the energy information, can be used to estimate the attenuation map.
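
The energy-angle relation invoked here is the standard Compton formula: for an incident photon of energy E scattered through angle θ, the scattered photon carries energy

\[
E' = \frac{E}{1 + \dfrac{E}{m_e c^2}\,(1 - \cos\theta)},
\]

where m_e c^2 ≈ 511 keV is the electron rest energy, so recording E' in the list-mode data constrains the scattering angle of each event.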

List-mode data are increasingly being used in single photon emission computed tomography (SPECT) and positron emission tomography (PET) imaging, among other imaging modalities. However, there are still many imaging designs that effectively bin list-mode data before image reconstruction or other estimation tasks are performed. Intuitively, the binning operation should result in a loss of information.
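
As a minimal illustration of the binning operation discussed here (a generic sketch, not the paper's method), the following Python fragment histograms continuous list-mode attribute vectors onto a pixel grid; whatever sub-pixel information the attributes carried is discarded.

```python
import numpy as np

# Minimal sketch of the binning operation: list-mode events carry continuous
# attribute vectors (here 2-D positions); binning maps them onto a fixed
# pixel grid, discarding sub-pixel information.
rng = np.random.default_rng(1)
events = rng.uniform(0.0, 1.0, size=(10_000, 2))   # list-mode attributes (x, y)

n_pix = 64
image, _ = np.histogramdd(
    events,
    bins=(n_pix, n_pix),
    range=((0.0, 1.0), (0.0, 1.0)),
)  # binned (photon-counting) image of event counts per pixel
```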

The van Trees inequality relates the ensemble mean squared error of an estimator to a Bayesian version of the Fisher information. The Ziv-Zakai inequality relates the ensemble mean squared error of an estimator to the minimum probability of error for the task of detecting a change in the parameter. In this work we complete this circle by deriving an inequality that relates this minimum probability of error to the Bayesian version of the Fisher information.
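
For reference, the scalar-parameter form of the van Trees (Bayesian Cramér-Rao) inequality is

\[
E\!\left[(\hat{\theta}-\theta)^{2}\right] \;\ge\; \frac{1}{J_{B}},
\qquad
J_{B} = E_{\theta}\!\left[I(\theta)\right] + \int \frac{\left[p'(\theta)\right]^{2}}{p(\theta)}\, d\theta,
\]

where I(θ) is the Fisher information of the data and p(θ) is the prior density, so that J_B is the Bayesian Fisher information; the Ziv-Zakai bound instead integrates, over the shift δ, the minimum probability of error for deciding between θ and θ + δ. The paper's notation may differ.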

We derive a connection between the performance of statistical estimators and the performance of the ideal observer on related detection tasks. Specifically, we show how the task-specific Shannon information for the task of detecting a change in a parameter is related to the Fisher information and to the Bayesian Fisher information. We have previously shown that this Shannon information is related via an integral transform to the minimum probability of error on the same task.

Previously published work on joint estimation/detection tasks has focused on the area under the estimation receiver operating characteristic (EROC) curve as a figure of merit (FOM) for these tasks in imaging. Another FOM for these joint tasks is the Bayesian risk, where a cost is assigned to all detection outcomes and to the estimation errors, and then averaged over all sources of randomness in the object ensemble and the imaging system. Important elements of the cost function, which are not included in standard EROC analysis, are that the cost for a false positive depends on the estimate produced for the parameter vector, and the cost for a false negative depends on the true value of the parameter vector.
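
One generic way to write such a risk, consistent with the costs described above though not necessarily in the paper's notation, is

\[
R = \Pr(H_0)\, E\!\left[ C_{0}\!\left(\hat{D}, \hat{\boldsymbol{\theta}}\right) \mid H_0 \right]
  + \Pr(H_1)\, E\!\left[ C_{1}\!\left(\hat{D}, \hat{\boldsymbol{\theta}}, \boldsymbol{\theta}\right) \mid H_1 \right],
\]

so the cost of a false positive may depend on the reported estimate \(\hat{\boldsymbol{\theta}}\), while the cost of a false negative may depend on the true parameter vector \(\boldsymbol{\theta}\).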

Many different physiological processes affect the growth of malignant lesions and their response to therapy. Each of these processes is spatially and genetically heterogeneous, dynamically evolving in time, controlled by many other physiological processes, and intrinsically random and unpredictable. The objective of this paper is to show that all of these properties of cancer physiology can be treated in a unified, mathematically rigorous way via the theory of random processes.

A method for optimization of an adaptive Single Photon Emission Computed Tomography (SPECT) system is presented. Adaptive imaging systems can quickly change their hardware configuration in response to data being generated in order to improve image quality for a specific task. In this work we simulate an adaptive SPECT system and propose a method for finding the adaptation that maximizes the performance on a signal estimation task.
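
A minimal Python sketch of this general idea is given below; simulate_system and estimate_signal are hypothetical placeholders for the paper's simulated SPECT model and estimator, and the figure of merit shown (ensemble mean-squared error) is one natural choice for an estimation task.

```python
import numpy as np

def choose_configuration(scout_data, configurations, simulate_system,
                         estimate_signal, n_samples=200):
    """Hypothetical sketch: pick the hardware configuration that minimizes the
    ensemble mean-squared error (EMSE) of a signal-parameter estimate,
    approximated by Monte Carlo simulation conditioned on the scout data.
    `simulate_system` and `estimate_signal` are placeholders, not the paper's code."""
    best_cfg, best_emse = None, np.inf
    for cfg in configurations:
        sq_errors = []
        for _ in range(n_samples):
            theta_true, data = simulate_system(cfg, scout_data)  # sample object + noisy data
            theta_hat = estimate_signal(data, cfg)                # e.g. ML or posterior-mean estimate
            sq_errors.append(np.sum((np.asarray(theta_hat) - np.asarray(theta_true)) ** 2))
        emse = float(np.mean(sq_errors))
        if emse < best_emse:
            best_cfg, best_emse = cfg, emse
    return best_cfg, best_emse
```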

Characteristic functionals are one of the main analytical tools used to quantify the statistical properties of random fields and generalized random fields. The viewpoint taken here is that a random field is the correct model for the ensemble of objects being imaged by a given imaging system. In modern digital imaging systems, random fields are not used to model the reconstructed images themselves since these are necessarily finite dimensional.
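
For reference, under one common convention the characteristic functional of a random field f evaluated at a test function s is

\[
\Psi_{f}(s) = \Big\langle \exp\!\big[\, i\,(s, f)\,\big] \Big\rangle_{f},
\qquad
(s, f) = \int s(\mathbf{r})\, f(\mathbf{r})\, d\mathbf{r},
\]

which encodes the full statistics of the field; sign and normalization conventions vary between authors.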

We present a new method for computing optimized channels for estimation tasks that is feasible for high-dimensional image data. Maximum-likelihood (ML) parameter estimates are challenging to compute from high-dimensional likelihoods. The dimensionality reduction from M measurements to L channels is a critical advantage of channelized quadratic estimators (CQEs), since estimating likelihood moments from channelized data requires smaller sample sizes and inverting a smaller covariance matrix is easier.
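
The dimensionality argument can be made concrete with a small numpy sketch (illustrative sizes only): channelizing M-dimensional data through an M x L channel matrix leaves an L x L covariance matrix that is far cheaper to estimate and invert.

```python
import numpy as np

# Illustrative dimensions only: M measurements per image, L << M channels.
M, L, n_train = 4096, 10, 500
rng = np.random.default_rng(0)

T = rng.standard_normal((M, L))          # channel matrix (columns = channel templates)
g = rng.standard_normal((n_train, M))    # stand-in for a set of training measurements

v = g @ T                                # channelized data, shape (n_train, L)
cov_v = np.cov(v, rowvar=False)          # L x L covariance: cheap to estimate...
cov_inv = np.linalg.inv(cov_v)           # ...and easy to invert, unlike the M x M case
```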

We show how Shannon information is mathematically related to receiver operating characteristic (ROC) analysis for multiclass classification problems in imaging. In particular, the minimum probability of error for the ideal observer, as a function of the prior probabilities for each class, determines the Shannon information for the classification task, also considered as a function of the prior probabilities on the classes. In the process, we show how an ROC hypersurface that has been studied by other researchers is mathematically related to a Shannon information ROC (SIROC) hypersurface.
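
For context, the task-specific Shannon information referred to here is the mutual information between the class label C and the image data g, which depends explicitly on the prior probabilities:

\[
I(C;\mathbf{g}) = H(C) - H(C \mid \mathbf{g}),
\qquad
H(C) = -\sum_{c} \Pr(c)\, \log \Pr(c).
\]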

Shannon information is defined for imaging tasks where signal detection is combined with parameter estimation. The first task considered is when the parameters are associated with the signal and parameter estimates are only produced when the signal is present. The second task examined is when the parameters are associated with the object being imaged, and parameter estimates are produced whether the signal is present or not.

The Fano factor of an integer-valued random variable is defined as the ratio of its variance to its mean. Correlation between the outputs of two photomultiplier tubes on opposite faces of a scintillation crystal was used to estimate the Fano factor of photoelectrons and scintillation photons. Correlations between the integrals of the detector outputs were used to estimate the photoelectron and photon Fano factors for YAP:Ce, SrI2:Eu, and CsI:Na scintillator crystals.
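
In symbols, for an integer-valued random variable N the Fano factor is

\[
F = \frac{\operatorname{Var}(N)}{E(N)},
\]

with F = 1 for Poisson statistics, F < 1 for sub-Poisson (quieter than Poisson) behavior, and F > 1 for super-Poisson behavior.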

Article Synopsis
  • The Fano factor quantifies the noise of an integer-valued random variable as the ratio of its variance to its mean, and it affects how well scintillation crystals perform as light detectors.
  • Fano factors of scintillation crystals range from sub-Poisson (less noise than Poisson) to super-Poisson (more noise than Poisson), which can impact the spatial and energy resolution of gamma-camera systems.
  • The study found that while the Fano factor does not affect the spatial resolution when determining the position of gamma-ray photon interactions, a smaller Fano factor leads to improved energy resolution.
Shannon information (SI) and the ideal-observer receiver operating characteristic (ROC) curve are two different methods for analyzing the performance of an imaging system on a binary classification task, such as the detection of a variable signal embedded within a random background. In this work we describe a new ROC curve, the Shannon information ROC curve (SIROC), that is derived from the SI expression for a binary classification task. We then show that the ideal-observer ROC curve and the SIROC have many properties in common and are equivalent descriptions of the optimal performance of an observer on the task.

Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss.

During the past two decades, researchers at the University of Arizona's Center for Gamma-Ray Imaging (CGRI) have explored a variety of approaches to gamma-ray detection, including scintillation cameras, solid-state detectors, and hybrids such as the intensified Quantum Imaging Device (iQID) configuration where a scintillator is followed by optical gain and a fast CCD or CMOS camera. We have combined these detectors with a variety of collimation schemes, including single and multiple pinholes, parallel-hole collimators, synthetic apertures, and anamorphic crossed slits, to build a large number of preclinical molecular-imaging systems that perform Single-Photon Emission Computed Tomography (SPECT), Positron Emission Tomography (PET), and X-Ray Computed Tomography (CT). In this paper, we discuss the themes and methods we have developed over the years to record and fully use the information content carried by every detected gamma-ray photon.

To extend our understanding of tear film dynamics for the management of dry eye disease, we propose a method to optically sense the tear film and simultaneously estimate the thicknesses of the lipid and aqueous layers. The proposed method, SDT-OCT, combines ultra-high axial resolution optical coherence tomography (OCT) with a robust estimator based on statistical decision theory (SDT) to achieve thickness measurements at the nanometer scale. Unlike conventional Fourier-domain OCT, where peak detection of layers occurs in Fourier space, SDT-OCT estimates thickness using statistical decision theory directly on the raw spectra acquired with the OCT system.
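
A heavily simplified Python sketch of this kind of estimator, assuming additive Gaussian noise of equal variance across spectral samples (in which case maximum likelihood reduces to least squares) and a user-supplied forward model of the raw spectrum, is given below; the actual SDT-OCT estimator and noise model may differ.

```python
import numpy as np

def ml_thickness_estimate(measured_spectrum, forward_model, lipid_grid, aqueous_grid):
    """Simplified sketch (not the SDT-OCT implementation): under additive Gaussian
    noise with equal variance at every spectral sample, maximum-likelihood
    estimation reduces to minimizing the sum of squared residuals between the
    measured raw spectrum and the model spectrum over a grid of candidate
    (lipid, aqueous) thickness pairs."""
    best_pair, best_ssr = None, np.inf
    for d_lipid in lipid_grid:
        for d_aqueous in aqueous_grid:
            model = forward_model(d_lipid, d_aqueous)   # placeholder forward model
            ssr = float(np.sum((measured_spectrum - model) ** 2))
            if ssr < best_ssr:
                best_pair, best_ssr = (d_lipid, d_aqueous), ssr
    return best_pair
```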

Purpose: T2 mapping provides a quantitative approach for focal liver lesion characterization. For small lesions, a biexponential model should be used to account for partial volume effects (PVE). However, conventional biexponential fitting suffers from large uncertainty of the fitted parameters when noise is present.
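
For reference, the conventional biexponential fit being criticized can be sketched in Python as follows, with illustrative echo times and parameter values rather than the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(te, a1, t2_1, a2, t2_2):
    """Biexponential signal model for a voxel containing two compartments
    (e.g. a small lesion plus surrounding liver) because of partial volume effects."""
    return a1 * np.exp(-te / t2_1) + a2 * np.exp(-te / t2_2)

# Illustrative echo times (ms) and parameter values, not the paper's data.
te = np.linspace(10.0, 200.0, 16)
truth = (0.4, 120.0, 0.6, 40.0)
rng = np.random.default_rng(2)
signal = biexp(te, *truth) + 0.01 * rng.standard_normal(te.size)

# Conventional nonlinear least-squares fit; with realistic noise the fitted
# parameters can carry large uncertainty, which motivates the paper's approach.
params, pcov = curve_fit(biexp, te, signal, p0=(0.5, 100.0, 0.5, 50.0))
```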

With the emergence of diffuse optical tomography (DOT) as a non-invasive imaging modality, there is a need to evaluate the performance of the developed DOT systems on clinically relevant tasks. One such important task is the detection of high-absorption signals in the tissue. To investigate signal detectability in DOT systems for system optimization, an appropriate approach is to use the Bayesian ideal observer, but this observer is computationally very intensive.

Understanding tear film dynamics is a prerequisite for advancing the management of Dry Eye Disease (DED). In this paper, we discuss the use of optical coherence tomography (OCT) and statistical decision theory to analyze the tear film dynamics of a digital phantom. We implement a maximum-likelihood (ML) estimator to interpret OCT data based on mathematical models of Fourier-Domain OCT and the tear film.
