Publications by authors named "Clem Karl"

Sinograms are commonly used to represent the raw data from tomographic imaging experiments. Although it is already well known that sinograms possess some amount of redundancy, in this work we present novel theory suggesting that sinograms will often possess substantial additional redundancies that have not been explicitly exploited by previous methods. Specifically, we derive that sinograms will often satisfy multiple simple, data-dependent autoregression relationships.
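
As a rough illustration of such a relationship, the sketch below simulates the sinogram of a uniform disk and fits a single small filter that approximately annihilates every local patch, which is one way a data-dependent autoregression can manifest. The analytic phantom, the 3x3 patch size, and the SVD-based fit are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def radon_disk(n_views=90, n_bins=64, center=(0.1, -0.2), radius=0.4):
    """Analytic sinogram of a uniform disk (chord length through the disk)."""
    thetas = np.linspace(0, np.pi, n_views, endpoint=False)
    s = np.linspace(-1, 1, n_bins)
    sino = np.zeros((n_views, n_bins))
    cx, cy = center
    for i, th in enumerate(thetas):
        # Signed distance from each detector line to the disk center.
        d = s - (cx * np.cos(th) + cy * np.sin(th))
        sino[i] = 2.0 * np.sqrt(np.maximum(radius**2 - d**2, 0.0))
    return sino

sino = radon_disk()

# Stack every 3x3 sinogram patch as a row; an autoregression relationship
# means one filter h (nearly) annihilates all of them: patches @ h ~ 0.
p = 3
patches = np.lib.stride_tricks.sliding_window_view(sino, (p, p)).reshape(-1, p * p)

# The best annihilating filter is the right singular vector with the
# smallest singular value.
_, svals, vt = np.linalg.svd(patches, full_matrices=False)
h = vt[-1]
residual = np.linalg.norm(patches @ h) / np.linalg.norm(patches)
print(f"smallest singular value: {svals[-1]:.3e}, relative residual: {residual:.3e}")
```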

Label-free, visible light microscopy is an indispensable tool for studying biological nanoparticles (BNPs). However, conventional imaging techniques face two major challenges: (i) weak contrast, due to the low refractive-index difference with the surrounding medium and the exceptionally small particle size, and (ii) limited spatial resolution. Advances in interferometric microscopy have overcome the weak-contrast limitation and enabled direct detection of BNPs, yet lateral resolution remains a challenge in studying BNP morphology.

In recent years, baggage screening at airports has included the use of dual-energy X-ray computed tomography (DECT), an advanced technology for nondestructive evaluation. The main challenge remains reliably finding and identifying threat objects in a bag from DECT data. This task is particularly hard due to the wide variety of objects, the high clutter, and the presence of metal, which causes streaks and shading in the scanner images.

Resolution improvement through signal processing techniques is becoming more crucial for integrated circuit imaging as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher-order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework that couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis.
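
A minimal sketch of the general recipe, assuming a 1-D Gaussian blur as the physics-based forward model, a random overcomplete dictionary, and ISTA as the solver; none of these choices are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 64, 128                      # signal length, dictionary atoms (overcomplete)

# Forward model A: 1-D Gaussian blur, a stand-in for a confocal PSF.
t = np.arange(n)
A = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 1.5) ** 2)
A /= A.sum(axis=1, keepdims=True)

D = rng.standard_normal((n, k))
D /= np.linalg.norm(D, axis=0)      # unit-norm atoms

# Ground truth: a few active atoms; observe a blurred, noisy signal.
x_true = np.zeros(k)
x_true[rng.choice(k, 4, replace=False)] = rng.uniform(1, 2, 4)
y = A @ D @ x_true + 0.01 * rng.standard_normal(n)

# ISTA on  min_x 0.5*||y - A D x||^2 + lam*||x||_1
M = A @ D
L = np.linalg.norm(M, 2) ** 2       # Lipschitz constant of the gradient
lam, x = 0.05, np.zeros(k)
for _ in range(500):
    z = x - (M.T @ (M @ x - y)) / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

print("largest recovered atoms:", np.sort(np.argsort(np.abs(x))[-4:]),
      "true:", np.nonzero(x_true)[0])
```

Coupling the dictionary D with the forward model A means sparsity is enforced on the representation coefficients while data fidelity is measured in the blurred measurement domain.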

Efficient graph-cut methods have been used with great success for labeling and denoising problems occurring in computer vision. Unfortunately, the presence of linear image mappings has prevented the use of these techniques in most discrete-amplitude image reconstruction problems. In this paper, we develop a graph-cut based framework for the direct solution of discrete-amplitude linear image reconstruction problems cast as regularized energy function minimizations.
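
The sketch below shows the classical graph construction for the simplest case, a binary denoising energy whose data term depends on each pixel directly; the paper's contribution addresses the harder setting where a linear mapping sits between image and data. The toy image and weights are assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
img = np.zeros((8, 8)); img[2:6, 2:6] = 1            # true binary image
noisy = np.clip(img + 0.4 * rng.standard_normal(img.shape), 0, 1)

lam = 0.6                                            # smoothness weight
G = nx.DiGraph()
H, W = noisy.shape
for i in range(H):
    for j in range(W):
        p = (i, j)
        # t-links: cutting s->p costs the label-0 data term,
        # cutting p->t costs the label-1 data term.
        G.add_edge("s", p, capacity=(noisy[i, j] - 0.0) ** 2)
        G.add_edge(p, "t", capacity=(noisy[i, j] - 1.0) ** 2)
        # n-links: penalize label disagreement with 4-neighbors.
        for q in [(i + 1, j), (i, j + 1)]:
            if q[0] < H and q[1] < W:
                G.add_edge(p, q, capacity=lam)
                G.add_edge(q, p, capacity=lam)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
labels = np.zeros((H, W), dtype=int)
for node in source_side - {"s"}:
    labels[node] = 1     # source side of the minimum cut => label 1
print(labels)
```

Each pixel keeps label 1 exactly when it lands on the source side of the minimum cut, which globally minimizes this class of energies.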

The use of in vitro diagnostic devices is transitioning from the laboratory to the primary care setting to address early disease detection needs. Time-critical viral diagnoses are often made without test support because of the assay time required by today's standard tests. Available rapid point-of-care (POC) viral tests are less reliable, requiring a follow-on confirmatory test before conclusions can be drawn.

An image formation framework for ultrasound imaging from synthetic transducer arrays based on sparsity-driven regularization functionals using single-frequency Fourier domain data is proposed. The framework involves the use of a physics-based forward model of the ultrasound observation process, the formulation of image formation as the solution of an associated optimization problem, and the solution of that problem through efficient numerical algorithms. The sparsity-driven, model-based approach estimates a complex-valued reflectivity field and preserves physical features in the scene while suppressing spurious artifacts.
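
A hedged sketch of the pipeline under simplifying assumptions: a monostatic, single-frequency, far-field phase model stands in for the paper's physics-based forward model, and ISTA with a complex (magnitude) soft threshold stands in for its numerical algorithms. Geometry and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
c, f0 = 1540.0, 5e6                    # sound speed (m/s), frequency (Hz)
k_wave = 2 * np.pi * f0 / c            # wavenumber

elems = np.linspace(-5e-3, 5e-3, 32)   # array element x-positions (m)
pix = np.linspace(-5e-3, 5e-3, 64)     # image pixels at a fixed depth
depth = 30e-3

# Monostatic round-trip phase between element (xe, 0) and pixel (xp, depth).
dist = np.sqrt((elems[:, None] - pix[None, :]) ** 2 + depth ** 2)
A = np.exp(-2j * k_wave * dist)        # forward model (elements x pixels)

# Sparse complex reflectivity field and noisy single-frequency data.
g = np.zeros(len(pix), dtype=complex)
g[[20, 45]] = [1.0, 0.7j]
y = A @ g + 0.05 * (rng.standard_normal(32) + 1j * rng.standard_normal(32))

# ISTA with a magnitude soft threshold, which preserves the phase of the
# complex-valued reflectivity estimate.
L = np.linalg.norm(A, 2) ** 2
lam, ghat = 0.5, np.zeros_like(g)
for _ in range(300):
    z = ghat - (A.conj().T @ (A @ ghat - y)) / L
    mag = np.abs(z)
    ghat = z / np.maximum(mag, 1e-12) * np.maximum(mag - lam / L, 0.0)

print("brightest estimated pixels:", np.sort(np.argsort(np.abs(ghat))[-2:]))
```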

Cardiac computed tomography represents an important advancement in the ability to assess coronary vessels. The accuracy of these non-invasive imaging studies is limited, however, by the presence of calcium, since calcium blooming artifacts lead to an over-estimation of the degree of luminal narrowing. To address this problem, we have developed a unified decomposition-based iterative reconstruction formulation, where different penalty functions are imposed on dense objects (i.e., calcium) and on the remainder of the image.
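
One natural way to read such a decomposition-based formulation is as a single regularized objective over two additive components; the notation below is an illustrative reading of the abstract, not the paper's exact formulation:

```latex
\min_{f_c,\, f_b}\; \bigl\| y - A\,(f_c + f_b) \bigr\|_2^2
  \;+\; \lambda_c\, R_c(f_c) \;+\; \lambda_b\, R_b(f_b)
```

where y is the measured data, A the CT forward operator, f_c the dense (calcium) component with a penalty R_c chosen to counteract blooming, and f_b the remaining tissue component with its own penalty R_b.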

Perfusion imaging is a useful adjunct to anatomic imaging in numerous diagnostic and therapy-monitoring settings. One approach to perfusion imaging is to assume a convolution relationship between a local arterial input function and the tissue enhancement profile of the region of interest via a "residue function" and subsequently solve for this residue function. This ill-posed problem is generally solved using singular-value decomposition based approaches, and the hemodynamic parameters are estimated for each voxel independently.
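
A minimal sketch of the standard SVD-based route the abstract refers to: build a causal convolution matrix from the arterial input function and invert it with a truncated-SVD pseudo-inverse. The gamma-variate input, exponential residue function, and 10% truncation threshold are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz

dt = 1.0                                    # sampling interval (s)
t = np.arange(40) * dt
aif = (t ** 3) * np.exp(-t / 1.5); aif /= aif.max()     # toy arterial input
r_true = np.exp(-t / 4.0)                   # true (flow-scaled) residue function

# Tissue curve: convolution of AIF and residue function, plus noise.
tissue = dt * np.convolve(aif, r_true)[: len(t)]
tissue += 0.01 * np.random.default_rng(3).standard_normal(len(t))

# Lower-triangular (causal) convolution matrix built from the AIF.
A = dt * toeplitz(aif, np.zeros_like(aif))

# Truncated-SVD pseudo-inverse: discard singular values below 10% of the max
# to stabilize the ill-posed deconvolution.
U, s, Vt = np.linalg.svd(A)
keep = s > 0.1 * s[0]
r_est = Vt.T[:, keep] @ ((U[:, keep].T @ tissue) / s[keep])

# The peak of the deconvolved residue function is proportional to flow.
print("flow estimate (residue peak):", r_est.max(), "true:", r_true.max())
```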

On Jupiter's moon Io, volcanic plumes and evaporating lava flows provide hot gases to form an atmosphere that is subsequently ionized. Some of Io's plasma is captured by the planet's strong magnetic field to form a co-rotating torus at Io's distance; the remaining ions and electrons form Io's ionosphere. The torus and ionosphere are also depleted by three time-variable processes that produce a banana-shaped cloud orbiting with Io, a giant nebula extending out to about 500 Jupiter radii, and a jet close to Io.

This article applies a unified variational smoothing and segmentation approach to brain diffusion tensor image data, along user-selected attributes derived from the tensor, with the aim of extracting detailed brain structure information. The framework simultaneously segments and denoises, producing edges and smoothed regions within the white matter of the brain that are relatively homogeneous with respect to the chosen diffusion tensor attributes. This approach enables the visualization of a smoothed, scale-invariant representation of the tensor data field in a variety of forms.
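
As a rough sketch of joint smoothing and edge extraction on a tensor-derived attribute, the code below computes fractional anisotropy from a toy tensor field and minimizes an Ambrosio-Tortorelli-style energy. This is a generic stand-in for the paper's unified framework; the toy field, parameters, and the closed-form edge update (which drops the edge-field gradient term) are all simplifying assumptions.

```python
import numpy as np

def fa(D):
    """Fractional anisotropy of a symmetric 3x3 diffusion tensor."""
    w = np.linalg.eigvalsh(D)
    return np.sqrt(1.5 * ((w - w.mean()) ** 2).sum() / (w ** 2).sum())

rng = np.random.default_rng(4)
field = np.empty((32, 32, 3, 3))
for i in range(32):
    for j in range(32):
        base = [1.6, 0.3, 0.3] if 12 <= i < 20 else [0.8, 0.8, 0.8]
        field[i, j] = np.diag(base) + 0.05 * np.diag(rng.standard_normal(3))

# User-selected attribute: fractional anisotropy at every voxel.
g = np.array([[fa(field[i, j]) for j in range(32)] for i in range(32)])

def grad(u):
    gx, gy = np.zeros_like(u), np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    dx, dy = np.zeros_like(px), np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:, :] = px[1:, :] - px[:-1, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:] = py[:, 1:] - py[:, :-1]
    return dx + dy

# u: smoothed attribute; v: edge field (~0 on edges, ~1 in smooth regions).
alpha, beta, rho, eps, tau = 2.0, 1.0, 0.05, 0.05, 0.05
u, v = g.copy(), np.ones_like(g)
for _ in range(200):
    gx, gy = grad(u)
    v = 1.0 / (1.0 + (4 * eps * alpha / rho) * (gx ** 2 + gy ** 2))
    # Gradient step on data fidelity plus edge-gated smoothing.
    u -= tau * (2 * beta * (u - g) - 2 * alpha * div(v ** 2 * gx, v ** 2 * gy))

print("edge field range:", v.min(), v.max())
```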

Multi-detector computed tomography (MDCT) permits detection of coronary plaque. However, noise and blurring impair the accuracy and precision of plaque measurements. The aim of this study was to evaluate MDCT post-processing based on non-linear image deblurring and edge-preserving noise suppression for measurements of plaque size.
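
A hedged sketch of what such post-processing can look like, using Richardson-Lucy deconvolution for the non-linear deblurring and Perona-Malik diffusion for the edge-preserving noise suppression; these are generic stand-ins, and the PSF width and all parameters are assumptions rather than the study's actual pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)
truth = np.zeros((64, 64)); truth[24:40, 24:40] = 1.0    # toy "plaque"
sigma_psf = 2.0
observed = gaussian_filter(truth, sigma_psf)
observed = np.clip(observed + 0.01 * rng.standard_normal(observed.shape),
                   1e-6, None)                           # keep positive for RL

# Richardson-Lucy deblurring; the Gaussian PSF is symmetric, so the adjoint
# blur is the same blur.
est = np.full_like(observed, observed.mean())
for _ in range(30):
    ratio = observed / np.maximum(gaussian_filter(est, sigma_psf), 1e-6)
    est *= gaussian_filter(ratio, sigma_psf)

# Perona-Malik diffusion: smooth within flat regions, not across strong
# gradients (periodic boundaries via np.roll, for brevity).
kappa, tau = 0.1, 0.15
cond = lambda d: np.exp(-(d / kappa) ** 2)
for _ in range(20):
    gn = np.roll(est, -1, 0) - est
    gs = np.roll(est, 1, 0) - est
    ge = np.roll(est, -1, 1) - est
    gw = np.roll(est, 1, 1) - est
    est += tau * (cond(gn) * gn + cond(gs) * gs + cond(ge) * ge + cond(gw) * gw)

print("edge sharpness (max gradient):", np.abs(np.diff(est, axis=0)).max())
```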

A new approach to regularization methods for image processing is introduced and developed, using as a vehicle the problem of computing dense optical flow fields in an image sequence. The solution of the new problem formulation is computed with an efficient multiscale algorithm. Experiments on several image sequences demonstrate the substantial computational savings that can be achieved because the algorithm is noniterative and has a per-pixel computational complexity that is independent of image size.
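
A minimal sketch of the noniterative flavor of this formulation: dense flow from the brightness-constancy data term plus a quadratic smoothness penalty, obtained with a single direct sparse solve. The translating blob, derivative scheme, and smoothness weight are illustrative assumptions; the paper's multiscale algorithm, not this direct solve, is what achieves size-independent per-pixel cost.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 24
x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 12.0)
I1, I2 = blob(10, 10), blob(11, 10)               # frame 2 shifted by (1, 0)

# Spatial and temporal derivatives for the brightness-constancy term
# Ix*u + Iy*v + It = 0 at every pixel.
Ix = np.gradient(I1, axis=0).ravel()
Iy = np.gradient(I1, axis=1).ravel()
It = (I2 - I1).ravel()
N = n * n

# Data term as a sparse operator on the stacked flow w = [u; v].
D = sp.hstack([sp.diags(Ix), sp.diags(Iy)])       # (N x 2N)

# Smoothness term: discrete gradient penalty on each flow component.
e = np.ones(n)
d1 = sp.diags([-e[:-1], e[:-1]], [0, 1], shape=(n - 1, n))
G = sp.vstack([sp.kron(d1, sp.identity(n)), sp.kron(sp.identity(n), d1)])
L = sp.block_diag([G.T @ G, G.T @ G])             # (2N x 2N)

# One direct solve of the normal equations: (D'D + lam*L) w = -D' It.
lam = 0.1
flow = spsolve((D.T @ D + lam * L).tocsc(), -D.T @ It)
u, v = flow[:N].reshape(n, n), flow[N:].reshape(n, n)
print("mean estimated flow near the blob:",
      u[8:13, 8:13].mean(), v[8:13, 8:13].mean())
```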
