## Restricted Eigenvalue Properties for Correlated Gaussian Designs

**Garvesh Raskutti, Martin J. Wainwright, Bin Yu**; 11(78):2241–2259, 2010.

### Abstract

Methods based on ℓ₁-relaxation, such as basis pursuit and the Lasso, are very popular for sparse regression in high dimensions. The conditions for success of these methods are now well understood: (1) exact recovery in the noiseless setting is possible if and only if the design matrix *X* satisfies the restricted nullspace property, and (2) the squared ℓ₂-error of a Lasso estimate decays at the minimax optimal rate *k* log *p* / *n*, where *k* is the sparsity of the *p*-dimensional regression problem with additive Gaussian noise, whenever the design satisfies a restricted eigenvalue condition. The key issue is thus to determine when the design matrix *X* satisfies these desirable properties. Thus far, there have been numerous results showing that the restricted isometry property, which implies both the restricted nullspace and eigenvalue conditions, is satisfied when all entries of *X* are independent and identically distributed (i.i.d.), or the rows are unitary. This paper proves directly that the restricted nullspace and eigenvalue conditions hold with high probability for quite general classes of Gaussian matrices for which the predictors may be highly dependent, and hence restricted isometry conditions can be violated with high probability. In this way, our results extend the attractive theoretical guarantees on ℓ₁-relaxations to a much broader class of problems than the case of completely independent or unitary designs.

© JMLR 2010.
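To make the setting concrete, the following is a minimal numerical sketch (not from the paper): sparse linear regression under a correlated Gaussian design, where the columns of *X* are highly dependent yet the Lasso still recovers the signal. The AR(1) covariance, the dimensions, the noise level, and the plain coordinate-descent Lasso solver are all illustrative assumptions, not constructions from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xw||^2 + lam*||w||_1 by coordinate descent."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # (1/n)||X_j||^2 for each column
    r = y.copy()                        # residual y - Xw (w starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * w[j]         # remove column j's contribution
            z = X[:, j] @ r / n         # correlation of X_j with partial residual
            w[j] = soft_threshold(z, lam) / col_sq[j]
            r -= X[:, j] * w[j]         # add the updated contribution back
    return w

rng = np.random.default_rng(0)
n, p, k, rho, sigma = 200, 500, 5, 0.5, 0.5  # illustrative sizes

# Correlated Gaussian design: rows drawn with AR(1) covariance
# Sigma_ij = rho^|i - j|, so neighboring predictors are dependent.
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(Sigma).T

beta = np.zeros(p)
beta[rng.choice(p, size=k, replace=False)] = 1.0  # k-sparse truth
y = X @ beta + sigma * rng.standard_normal(n)

# Regularization on the theoretical scale sigma * sqrt(log p / n)
lam = 2 * sigma * np.sqrt(np.log(p) / n)
beta_hat = lasso_cd(X, y, lam)
err = np.linalg.norm(beta_hat - beta)
print(err)
```

Under the restricted eigenvalue condition the squared ℓ₂-error of such an estimate scales like *k* log *p* / *n*, so with these dimensions one expects `err` to be small relative to the norm of the true coefficient vector.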