Necessary and Sufficient Conditions on Sparsity Pattern Recovery - Computer Science > Information Theory





Abstract: The problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements is of interest in many areas such as system identification, denoising, pattern recognition, and compressed sensing. This paper addresses the scaling of the number of measurements m, with signal dimension n and sparsity level (number of nonzeros) k, for asymptotically reliable detection. We show that a necessary condition for perfect recovery at any given SNR, for all algorithms regardless of complexity, is m = Omega(k log(n-k)) measurements. Conversely, it is shown that this scaling of Omega(k log(n-k)) measurements is sufficient for a remarkably simple "maximum correlation" estimator. Hence this scaling is optimal and does not require more sophisticated techniques such as lasso or matching pursuit. The constants for both the necessary and sufficient conditions are precisely defined in terms of the minimum-to-average ratio of the nonzero components and the SNR. The necessary condition improves upon previous results for maximum likelihood estimation. For lasso, it also provides a necessary condition at any SNR, and for low SNR it improves upon previous work. The sufficient condition provides the first asymptotically reliable detection guarantee at finite SNR.
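The "maximum correlation" estimator referenced in the abstract simply correlates the measurement vector with each column of the measurement matrix and keeps the k indices with the largest scores. Below is a minimal NumPy sketch of that idea; the measurement model y = A x + w, the Gaussian measurement matrix, the per-measurement SNR definition, the constant 4 in the m = Omega(k log(n-k)) scaling, and the function name max_correlation_estimate are illustrative assumptions, not the paper's exact setup or constants.

```python
import numpy as np

def max_correlation_estimate(A, y, k):
    """Sparsity pattern estimate by maximum correlation: keep the k
    columns of A whose normalized correlation with y is largest."""
    scores = np.abs(A.T @ y) / np.linalg.norm(A, axis=0)
    return np.sort(np.argsort(scores)[-k:])

# Toy experiment at the m = Omega(k log(n-k)) measurement scaling.
rng = np.random.default_rng(0)
n, k = 1000, 10
m = int(4 * k * np.log(n - k))          # constant 4 chosen only for illustration
A = rng.standard_normal((m, n))

x = np.zeros(n)
true_support = np.sort(rng.choice(n, size=k, replace=False))
x[true_support] = 1.0                   # equal nonzeros: minimum-to-average ratio = 1

snr = 10.0                              # assumed per-measurement SNR definition
noise_std = np.linalg.norm(A @ x) / np.sqrt(m * snr)
y = A @ x + noise_std * rng.standard_normal(m)

est_support = max_correlation_estimate(A, y, k)
print("exact pattern recovery:", np.array_equal(est_support, true_support))
```

The point of the sketch is only that support selection reduces to a single matrix-vector product followed by a top-k selection, which is the sense in which the estimator is "remarkably simple" compared with lasso or matching pursuit.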



Author: Alyson K. Fletcher, Sundeep Rangan, Vivek K. Goyal

Source: https://arxiv.org/






