This paper examines the relationship between wavelet-based image processing algorithms and variational problems. Algorithms are derived as exact or approximate minimizers of variational problems; in particular, we show that wavelet shrinkage can be considered the exact minimizer of the following problem. Given an image $F$ defined on a square $I$, minimize over all $g$ in the Besov space $B^1_1(L_1(I))$ the functional $\|F-g\|^2_{L_2(I)} + \lambda\,\|g\|_{B^1_1(L_1(I))}$. We use the theory of nonlinear wavelet image compression in $L_2(I)$ to derive accurate error bounds for noise removal through wavelet shrinkage applied to images corrupted with i.i.d., mean zero, Gaussian noise. A new signal-to-noise ratio (SNR), which we claim more accurately reflects the visual perception of noise in images, arises in this derivation. We present extensive computations that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image $F$: the largest $\alpha$ for which $F \in B^\alpha_q(L_q(I))$, $1/q = \alpha/2 + 1/2$, and the norm $\|F\|_{B^\alpha_q(L_q(I))}$. Both theoretical and experimental results indicate that our choice of shrinkage parameters yields uniformly better results than Donoho and Johnstone's VisuShrink procedure; an example suggests, however, that Donoho and Johnstone's SureShrink method, which uses a different shrinkage parameter for each dyadic level, achieves a lower error than our procedure.
DOI: http://dx.doi.org/10.1109/83.661182
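As a rough illustration of the shrinkage procedure analyzed in the abstract above, the following sketch soft-thresholds the wavelet coefficients of a noisy image. It assumes the third-party PyWavelets package; the wavelet, decomposition level, and shrinkage parameter `lam` are illustrative choices, not the near-optimal parameters derived in the paper.

```python
# Sketch: soft wavelet shrinkage of a noisy image, assuming PyWavelets is installed.
# The wavelet, level, and shrinkage parameter `lam` are illustrative choices only.
import numpy as np
import pywt

def wavelet_shrink(noisy, wavelet="db4", level=4, lam=0.1):
    """Soft-threshold the wavelet coefficients of `noisy` by `lam` and reconstruct."""
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    # Keep the coarse approximation untouched; shrink only the detail bands.
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(band, lam, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)

# Usage: denoise an image corrupted with i.i.d., mean zero, Gaussian noise.
rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 256), np.linspace(0, 1, 256))  # stand-in image
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
denoised = wavelet_shrink(noisy)
```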
Nat Commun
January 2025
Institute for Quantum Inspired and Quantum Optimization, Hamburg University of Technology, Hamburg, Germany.
Estimation of the energy of quantum many-body systems is a paradigmatic task in various research fields. In particular, efficient energy estimation may be crucial in achieving a quantum advantage for a practically relevant problem. For instance, the measurement effort poses a critical bottleneck for variational quantum algorithms.
Rev Sci Instrum
January 2025
School of Artificial Intelligence, North China University of Science and Technology, 063210 Tangshan, China.
To address noise interference in the knock-detection signal received by the pickup during non-destructive knock testing of ceramic sheets, a noise-removal method is proposed that combines variational mode decomposition (VMD), with its parameters optimized by an improved secretary bird optimization algorithm (ISBOA), and wavelet thresholding. First, the secretary bird optimization algorithm is improved with a Newton-Raphson search rule and a smooth exploitation variation strategy. Second, the ISBOA is used to select the key parameters of the VMD.
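As a rough sketch of the denoising pipeline summarized above, the code below wavelet-thresholds each mode of a decomposed signal and recombines them. A real pipeline would obtain the modes from VMD with parameters selected by an optimizer such as ISBOA; here two synthetic band-limited components stand in for VMD modes, and PyWavelets is assumed for the thresholding step.

```python
# Sketch: wavelet-threshold each mode of a decomposed knock signal, then recombine.
# A real pipeline would obtain the modes from VMD, with its parameters (number of
# modes, bandwidth penalty) selected by an optimizer such as ISBOA; two synthetic
# components stand in for VMD modes here. PyWavelets is assumed for thresholding.
import numpy as np
import pywt

def denoise_modes(modes, wavelet="sym8", level=3, lam=0.05):
    """Soft-threshold the detail coefficients of each mode and sum the results."""
    cleaned = []
    for mode in modes:
        coeffs = pywt.wavedec(mode, wavelet, level=level)
        coeffs = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
        cleaned.append(pywt.waverec(coeffs, wavelet)[: len(mode)])
    return np.sum(cleaned, axis=0)

# Stand-in "modes": two band-limited components corrupted by noise.
t = np.linspace(0.0, 1.0, 1024)
rng = np.random.default_rng(0)
modes = [np.sin(2 * np.pi * 8 * t), 0.3 * np.sin(2 * np.pi * 90 * t)]
noisy_modes = [m + 0.1 * rng.standard_normal(t.size) for m in modes]
denoised = denoise_modes(noisy_modes)
```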
J Chem Phys
January 2025
Hylleraas Centre for Quantum Molecular Sciences, Department of Chemistry, University of Oslo, P.O. Box 1033 Blindern, N-0315 Oslo, Norway.
Traditionally, excitation energies in coupled-cluster (CC) theory have been calculated by solving the CC Jacobian eigenvalue equation. However, based on our recent work [Jørgensen et al., Sci.
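For orientation, the traditional route mentioned above amounts to an eigenvalue problem: excitation energies are eigenvalues of the (generally non-symmetric) CC Jacobian. The sketch below uses a random stand-in matrix, not an actual CC Jacobian.

```python
# Sketch: excitation energies as eigenvalues of a (non-symmetric) CC Jacobian.
# The matrix below is a random stand-in; a real Jacobian comes from a CC program.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
jacobian = rng.standard_normal((50, 50))    # stand-in for the CC Jacobian
omega, right_vecs = eig(jacobian)           # eigenvalues and right eigenvectors
excitation_energies = np.sort(omega.real)   # real parts, sorted ascending
print(excitation_energies[:5])
```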
Nat Commun
January 2025
Quantum Research Center, Technology Innovation Institute, Abu Dhabi, UAE.
Quantum computers hold the promise of more efficient combinatorial optimization solvers, which could be game-changing for a broad range of applications. However, a bottleneck for materializing such advantages is that, in order to challenge classical algorithms in practice, mainstream approaches require a number of qubits prohibitively large for near-term hardware. Here we introduce a variational solver for MaxCut problems over m = O(n^k) binary variables using only n qubits, with tunable k > 1.
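For context, the sketch below evaluates the classical MaxCut objective that such a solver targets: the total weight of edges cut by a bit assignment. The toy graph and brute-force search are illustrative only and unrelated to the article's qubit-efficient encoding.

```python
# Sketch: evaluate the MaxCut objective for a bit assignment on a weighted graph.
# The edge list and the brute-force search below are a toy illustration only.
import itertools

def cut_value(edges, assignment):
    """Sum the weights of edges whose endpoints lie on opposite sides of the cut."""
    return sum(w for (i, j, w) in edges if assignment[i] != assignment[j])

edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.0)]
n = 4
# Brute force over all 2^n assignments (feasible only for tiny n).
best = max(itertools.product([0, 1], repeat=n), key=lambda a: cut_value(edges, a))
print(best, cut_value(edges, best))
```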
Entropy (Basel)
November 2024
Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan.
An information bottleneck (IB) enables the acquisition of useful representations from data by retaining necessary information while reducing unnecessary information. In its objective function, the Lagrange multiplier β controls the trade-off between retention and reduction. This study analyzes the Variational Information Bottleneck (VIB), a standard IB method in deep learning, in the settings of regression problems and derives its optimal solution.
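As a rough numerical sketch of the VIB-style objective described above, the code below combines a regression reconstruction term with β times the KL divergence between a Gaussian encoder and a standard normal prior; the linear encoder/decoder and toy data are illustrative assumptions, not the article's setup.

```python
# Sketch: a VIB-style objective for regression with a Gaussian encoder.
# loss = reconstruction error + beta * KL(q(z|x) || N(0, I)).
# The linear encoder/decoder and toy data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((128, 5))                # toy inputs
y = x @ rng.standard_normal((5, 1))              # toy regression targets

W_mu = rng.standard_normal((5, 2))               # encoder mean weights
W_logvar = rng.standard_normal((5, 2))           # encoder log-variance weights
W_dec = rng.standard_normal((2, 1))              # decoder weights
beta = 1e-2                                      # Lagrange multiplier (trade-off)

mu, logvar = x @ W_mu, x @ W_logvar              # q(z|x) = N(mu, diag(exp(logvar)))
z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)  # reparameterization
y_hat = z @ W_dec                                # decoder prediction

recon = np.mean((y - y_hat) ** 2)                # stands in for -E[log p(y|z)]
kl = 0.5 * np.mean(np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar, axis=1))
loss = recon + beta * kl                         # VIB-style objective
print(f"recon={recon:.3f}  kl={kl:.3f}  loss={loss:.3f}")
```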