Nullspace property

In compressed sensing, the nullspace property gives necessary and sufficient conditions on the reconstruction of sparse signals using the techniques of $$\ell_1$$-relaxation. The term "nullspace property" originates from Cohen, Dahmen, and DeVore. The nullspace property is often difficult to check in practice, and the restricted isometry property, which implies the nullspace property, is a more modern and more commonly used condition in the field of compressed sensing.

The technique of $$\ell_1$$-relaxation
The non-convex $$\ell_0$$-minimization problem,

$$\min\limits_{x} \|x\|_0$$ subject to $$Ax = b$$,

is a standard problem in compressed sensing; here $$\|x\|_0$$ denotes the number of non-zero entries of $$x$$. However, $$\ell_0$$-minimization is known to be NP-hard in general. As such, the technique of $$\ell_1$$-relaxation is commonly employed to circumvent the intractability of signal reconstruction using the $$\ell_0$$-norm. In $$\ell_1$$-relaxation, the $$\ell_1$$ problem,

$$\min\limits_{x} \|x\|_1$$ subject to $$Ax = b$$,

is solved in place of the $$\ell_0$$ problem. Note that this relaxation is convex and can be recast as a linear program, a computationally desirable feature. Naturally, we wish to know when $$\ell_1$$-relaxation gives the same answer as the $$\ell_0$$ problem. The nullspace property is one way to guarantee agreement.
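To make the linear-programming reduction concrete, here is a minimal Python sketch of the standard reformulation $$\min \sum_i t_i$$ subject to $$-t \leq x \leq t$$, $$Ax = b$$, using a hypothetical helper name `l1_minimize`. It assumes a real matrix $$A$$; over the complex numbers the problem becomes a second-order cone program instead.

```python
# A minimal sketch of the linear-programming reduction, assuming a real
# matrix A (over the complex numbers the problem is a second-order cone
# program instead).  Standard reformulation:
#     min sum(t)  subject to  -t <= x <= t,  Ax = b,
# so that t_i = |x_i| at the optimum and sum(t) = ||x||_1.
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, b):
    """Solve min ||x||_1 subject to Ax = b, via linprog on variables [x, t]."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
    # Encode |x_i| <= t_i as the pair  x - t <= 0  and  -x - t <= 0.
    A_ub = np.block([[np.eye(n), -np.eye(n)],
                     [-np.eye(n), -np.eye(n)]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = b; t does not appear
    bounds = [(None, None)] * n + [(0, None)] * n   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b, bounds=bounds)
    return res.x[:n]
```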

Definition
An $$m \times n$$ complex matrix $$A$$ satisfies the nullspace property of order $$s$$ if, for every index set $$S \subseteq \{1, \ldots, n\}$$ with $$|S| \leq s$$, we have $$\|\eta_S\|_1 < \|\eta_{S^C}\|_1$$ for all $$\eta \in \ker{A} \setminus \left\{0\right\}$$. Here $$\eta_S$$ denotes the vector that agrees with $$\eta$$ on $$S$$ and is zero outside $$S$$, and $$S^C = \{1, \ldots, n\} \setminus S$$ is the complement of $$S$$.
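Because $$\|\eta_S\|_1$$ is maximized over all admissible $$S$$ by taking the $$s$$ largest-magnitude entries of $$\eta$$, a single vector $$\eta$$ satisfies the inequality for every such $$S$$ exactly when its $$s$$ largest magnitudes sum to less than $$\|\eta\|_1 / 2$$. The following Python sketch, with a hypothetical helper `violates_nsp` and assuming a real matrix, uses this observation to search for violating nullspace vectors by random sampling; such a search can refute the nullspace property but never certify it, consistent with the property being difficult to check.

```python
# A minimal sketch of a randomized NSP test, assuming a real matrix A.
# For a fixed eta, ||eta_S||_1 is largest when S picks the s biggest
# magnitudes, so eta passes the order-s condition for every S iff the
# s largest magnitudes sum to less than ||eta||_1 / 2.  Random sampling
# of the nullspace can only refute the NSP, never certify it.
import numpy as np
from scipy.linalg import null_space

def violates_nsp(A, s, trials=1000, seed=0):
    """Search for a nullspace vector violating the order-s condition."""
    rng = np.random.default_rng(seed)
    N = null_space(A)                 # orthonormal basis of ker(A)
    if N.shape[1] == 0:
        return False                  # trivial nullspace: condition holds vacuously
    for _ in range(trials):
        eta = N @ rng.standard_normal(N.shape[1])
        mags = np.sort(np.abs(eta))[::-1]
        if mags[:s].sum() >= mags[s:].sum():   # ||eta_S||_1 >= ||eta_{S^C}||_1
            return True               # witness found: A lacks NSP of order s
    return False                      # no violation found (not a proof of NSP)
```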

Recovery Condition
The following theorem gives a necessary and sufficient condition for the recovery of every $$s$$-sparse vector in $$\mathbb{C}^n$$. The proof of the theorem is a standard one, and the proof supplied here is summarized from Holger Rauhut.

$$\textbf{Theorem:}$$ Let $$A$$ be an $$m \times n$$ complex matrix. Then every $$s$$-sparse signal $$x \in \mathbb{C}^n$$ is the unique solution to the $$\ell_1$$-relaxation problem with $$b = Ax$$ if and only if $$A$$ satisfies the nullspace property of order $$s$$.

$$\textit{Proof:}$$ For the forward direction, suppose every $$s$$-sparse vector is the unique solution to the $$\ell_1$$-relaxation problem, let $$\eta \in \ker{A} \setminus \{0\}$$, and let $$S$$ be an index set with $$|S| \leq s$$. Since $$A\eta = 0$$, linearity gives $$A(\eta_S) = A(-\eta_{S^C})$$, and the two vectors are distinct because $$\eta_S + \eta_{S^C} = \eta \neq 0$$. The vector $$\eta_S$$ is $$s$$-sparse, so by assumption it is the unique solution to the $$\ell_1$$ problem with $$b = A\eta_S$$; since $$-\eta_{S^C}$$ is also feasible, uniqueness forces $$\|\eta_S\|_1 < \|-\eta_{S^C}\|_1 = \|\eta_{S^C}\|_1$$, as desired.

For the backward direction, let $$x$$ be $$s$$-sparse and let $$z \neq x$$ be another (not necessarily $$s$$-sparse) vector with $$Az = Ax$$. The non-zero vector $$\eta = x - z$$ lies in the nullspace of $$A$$. Let $$S$$ be the support of $$x$$, so that $$x_{S^C} = 0$$ and hence $$x - z_S = \eta_S$$ and $$\eta_{S^C} = -z_{S^C}$$. The triangle inequality and the nullspace property then give

$$\|x\|_1 \leq \|x - z_S\|_1 + \|z_S\|_1 = \|\eta_S\|_1 + \|z_S\|_1 < \|\eta_{S^C}\|_1 + \|z_S\|_1 = \|z_{S^C}\|_1 + \|z_S\|_1 = \|z\|_1,$$

establishing the strict minimality of $$x$$. $$\square$$
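As an illustration of the theorem, the following sketch reuses the hypothetical `l1_minimize` helper from above: it plants an $$s$$-sparse signal, measures it with a random Gaussian matrix (which satisfies the nullspace property of small order with high probability, though this is not verified here), and checks that $$\ell_1$$-relaxation returns the planted signal.

```python
# A small end-to-end illustration of the theorem, reusing the l1_minimize
# sketch from above.  A Gaussian matrix of this shape satisfies the
# nullspace property of small order with high probability (not verified
# here), so the l1 solution should coincide with the planted sparse signal.
import numpy as np

rng = np.random.default_rng(1)
m, n, s = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)

x = np.zeros(n)                                  # plant an s-sparse signal
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)

x_hat = l1_minimize(A, A @ x)                    # min ||z||_1  s.t.  Az = Ax
print("recovery error:", np.linalg.norm(x_hat - x))  # small when recovery succeeds
```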