Virtual Experiments and Compressive Sensing for Subsurface Microwave Tomography
Published in C.H. Chen, Compressive Sensing of Earth Observations, 2017
Martina Bevacqua, Lorenzo Crocco, Loreto Di Donato, Tommaso Isernia, Roberta Palmeri
The problem (Equation 8.14) is commonly known as the basis pursuit denoising (BPDN) or least absolute shrinkage and selection operator (LASSO) problem [35]. In Equation 8.14, the minimization of the ℓ1 norm promotes the search for sparse solutions, while the constraint enforces consistency with the data. In other words, among all solutions that are consistent with the measured data to within a given error, the sparsest one is sought. Note that, while the optimization problem should ideally involve the so-called ℓ0 norm [18], the relaxation to the ℓ1 norm adopted in Equation 8.14 turns the problem into a convex programming one, and the two formulations are equivalent in a wide range of cases [17,36].
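A BPDN/LASSO problem of this kind can be solved with simple first-order methods. The sketch below uses the iterative soft-thresholding algorithm (ISTA) applied to the Lagrangian (penalized) form of the problem, which is equivalent to the constrained form in Equation 8.14 for a suitable choice of the regularization weight `lam`; the matrix `Phi`, the weight, and the iteration count are illustrative assumptions, not values from the chapter.

```python
import numpy as np

def ista(Phi, y, lam, n_iter=1000):
    """Iterative soft-thresholding for min_x 0.5*||y - Phi x||_2^2 + lam*||x||_1.

    This penalized form is equivalent to the constrained BPDN problem
    for an appropriate value of lam.
    """
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the data-term gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = x + Phi.T @ (y - Phi @ x) / L    # gradient step on the quadratic data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-thresholding step
    return x

# Small demo: recover a 3-sparse vector from noisy random measurements
rng = np.random.default_rng(0)
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -1.5, 1.0]
y = Phi @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(Phi, y, lam=0.02)
```

The soft-thresholding step is exactly what the ℓ1 penalty contributes: it shrinks every coefficient toward zero and annihilates the small ones, which is how sparsity is promoted in practice.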
Sparse Recovery
Published in Angshul Majumdar, Compressed Sensing for Engineers, 2018
Such an l1 minimization problem arises in several flavors. The one in (3.31) is called basis pursuit denoising (BPDN); in signal processing this formulation is usually preferred, because some information about the noise in the system is typically available.
Published in Moeness Amin, Compressive Sensing for Urban Radar, 2017
In the presence of noise, a variation of basis pursuit called basis pursuit denoising (BPDN) (Candès et al., 2006; Chen et al., 1998) is used. BPDN solves arg minx ‖x‖ℓ1 subject to ‖y − Φx‖ℓ2 ≤ ε, where ε > 0 is a tuning parameter chosen based on the noise level. The Dantzig selector (DS) (Candès and Tao, 2007), another recently introduced convex relaxation, solves the optimization problem arg minx ‖x‖ℓ1 subject to ‖ΦH(y − Φx)‖ℓ∞ ≤ μ, where μ > 0 is a tuning parameter. When the standard deviation σ of the additive noise is known, and the columns of the dictionary are normalized, the parameters ε and μ are chosen to be σ√(2 log N) (Chen et al., 1998). A second category of reconstruction methods performs a greedy iterative search for the solutions of P0 by making locally optimal choices. Orthogonal matching pursuit (OMP) (Tropp, 2004; Tropp and Gilbert, 2007) and compressive sampling matching pursuit (CoSaMP) (Needell and Tropp, 2009) fall under this category. The algorithm terminates when the squared error between the estimates at consecutive iterations falls below a predetermined threshold. A third category of solutions enforces a sparsity-based prior on the signal x and recovers the signal by solving for the MAP estimate. The most common methods in this category are Bayesian compressive sensing (BCS) (Ji et al., 2008) and sparse Bayesian learning (SBL) (Wipf and Rao, 2004). The choice of the sparsity-enforcing prior is problem dependent, and there are no universal priors that guarantee good performance for all models.
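The greedy category is easy to illustrate: a minimal numpy sketch of OMP is given below. For simplicity it stops after a fixed number of atoms `k` or when the residual norm is small, rather than using the consecutive-iterate criterion quoted above; the dictionary, seed, and sparsity level are illustrative assumptions.

```python
import numpy as np

def omp(Phi, y, k, tol=1e-8):
    """Orthogonal matching pursuit: greedily select up to k atoms of Phi.

    Each step picks the column most correlated with the current residual
    (the locally optimal choice), then re-fits the estimate on all
    selected columns by least squares.
    """
    residual = y.copy()
    support = []
    x = np.zeros(Phi.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef          # orthogonalized residual
        if np.linalg.norm(residual) < tol:             # simple stopping rule
            break
    x[support] = coef
    return x

# Noiseless demo with a normalized random dictionary
rng = np.random.default_rng(1)
Phi = rng.standard_normal((30, 80))
Phi /= np.linalg.norm(Phi, axis=0)                     # unit-norm columns
x_true = np.zeros(80)
x_true[[3, 50]] = [1.0, -2.0]
y = Phi @ x_true
x_hat = omp(Phi, y, k=2)
```

The least-squares re-fit over the whole selected support is what distinguishes OMP from plain matching pursuit: the residual stays orthogonal to every atom chosen so far, so no atom is selected twice.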
A Modified Tunable-Q Wavelet Transform Approach for Tamil Speech Enhancement
Published in IETE Journal of Research, 2022
J. Indra, R. Kiruba Shankar, N. Kasthuri, S. Geetha Manjuri
By solving the basis pursuit problem, a sparse representation of a signal can be obtained. A related approach can be applied for signal denoising, where the observed signal y has been corrupted by additive noise. The task is to estimate x from the observed signal y. If x has a sparse representation with respect to a transform, then it can be estimated via sparsity-based methods. One such approach is basis pursuit denoising (BPD) [38], which minimizes the sum of the l1 norm of the transform coefficients and the energy of the residual, w* = arg minw ‖y − TQWT−1(w)‖22 + λ‖w‖1, where w denotes the wavelet coefficients and λ > 0 balances sparsity against data fidelity. Then x can be easily estimated as TQWT−1(w*).
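For an *orthonormal* transform this BPD objective has a closed-form solution: soft-threshold the analysis coefficients. The sketch below uses an orthonormal DCT from scipy as a stand-in for the TQWT of the article (the TQWT is overcomplete and would need an iterative solver such as SALSA or ISTA instead); the test signal, noise level, and threshold λ are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

def bpd_orthonormal(y, lam):
    """BPD with an orthonormal transform T.

    min_w 0.5*||y - T^{-1}(w)||_2^2 + lam*||w||_1 is solved exactly by
    w = soft(T(y), lam) when T is orthonormal.  An orthonormal DCT plays
    the role of the transform here.
    """
    w = dct(y, norm="ortho")                                # analysis coefficients
    w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)       # soft-threshold
    x_hat = idct(w, norm="ortho")                           # synthesis: x = T^{-1}(w)
    return x_hat, w

# Denoise a signal that is 1-sparse in the DCT domain
rng = np.random.default_rng(2)
n = 256
t = np.arange(n)
x_clean = np.cos(np.pi * (2 * t + 1) * 8 / (2 * n))        # a single DCT-II basis vector
y = x_clean + 0.1 * rng.standard_normal(n)
x_hat, w = bpd_orthonormal(y, lam=0.3)
```

Because the transform is orthonormal, the white noise remains white in the coefficient domain; a threshold of a few noise standard deviations removes almost all noise coefficients while barely shrinking the large signal coefficient.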