Minimization of Transformed $L_1$ Penalty: Theory, Difference of Convex Function Algorithm, and Robust Application in Compressed Sensing.
We study the minimization problem of a non-convex sparsity-promoting penalty function, the transformed $l_1$ (TL1), and its application in compressed sensing (CS). The TL1 penalty interpolates the $l_0$ and $l_1$ norms through a positive parameter $a \in (0,+\infty)$, similar to $l_p$ with $p \in (0,1]$, and is known to satisfy unbiasedness, sparsity and Lipschitz continuity properties. We first consider the constrained minimization problem and discuss the exact recovery of the $l_0$ norm minimal solution based on the null space property (NSP). We then prove the stable recovery of the $l_0$ norm minimal solution if the sensing matrix $A$ satisfies a restricted isometry property (RIP). Next, we present difference of convex function algorithms for TL1 (DCATL1) for computing TL1-regularized constrained and unconstrained problems in CS. The inner loop concerns an $l_1$ minimization problem, which we solve by the Alternating Direction Method of Multipliers (ADMM). For the unconstrained problem, we prove convergence of DCATL1 to a stationary point satisfying the first order optimality condition. In numerical experiments, we identify the optimal value $a=1$, and compare DCATL1 with other CS algorithms on two classes of sensing matrices: Gaussian random matrices and over-sampled discrete cosine transform (DCT) matrices. For both classes of sensing matrices, the performance of the DCATL1 algorithm (initialized with $l_1$ minimization) always ranks near the top (if not at the top), and it is the most robust choice, insensitive to the conditioning of the sensing matrix $A$. DCATL1 is also competitive with DCA applied to other non-convex penalty functions commonly used in statistics, which involve two hyperparameters.
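To illustrate the interpolation property mentioned above, the following sketch evaluates the TL1 penalty in its commonly cited form $\rho_a(t) = (a+1)|t|/(a+|t|)$ applied entrywise and summed; the function name `tl1` and the demonstration values are illustrative choices, not from the abstract. As $a \to 0^+$ the penalty approaches the $l_0$ count of nonzeros, and as $a \to \infty$ it approaches the $l_1$ norm.

```python
import numpy as np

def tl1(x, a):
    """Transformed l_1 penalty: sum of (a+1)|x_i| / (a + |x_i|).

    Interpolates between the l_0 "norm" (a -> 0+) and the
    l_1 norm (a -> infinity) of the vector x.
    """
    ax = np.abs(x)
    return np.sum((a + 1.0) * ax / (a + ax))

x = np.array([0.0, 2.0, -0.5])

# Small a: value approaches the number of nonzeros (l_0 = 2 here).
small_a = tl1(x, a=1e-4)

# Large a: value approaches the l_1 norm (|2| + |0.5| = 2.5 here).
large_a = tl1(x, a=1e4)

print(small_a, large_a)
```

Note that $\rho_a$ is Lipschitz continuous in $t$ for any fixed $a > 0$, unlike the $l_p$ quasi-norms with $p \in (0,1)$, whose derivative blows up at the origin.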
Publisher URL: http://arxiv.org/abs/1411.5735