### Abstract

We show that several quantities controlling compressed-sensing performance directly match parameters controlling algorithmic complexity. We first describe linearly convergent restart schemes for first-order methods solving a broad range of compressed-sensing problems, where sharpness at the optimum controls convergence speed. We show that, for sparse recovery problems, this sharpness can be written as a condition number given by the ratio between true signal sparsity and the largest signal size that can be recovered by the observation matrix. In a similar vein, Renegar's condition number is a data-driven complexity measure for convex programs, generalizing classical condition numbers for linear systems. We show that, for a broad class of compressed-sensing problems, the worst-case value of this algorithmic complexity measure over all signals matches the restricted singular value of the observation matrix, which controls robust recovery performance. Overall, this means that in both cases a single parameter directly controls both computational complexity and recovery performance in compressed-sensing problems. Numerical experiments illustrate these points using several classical algorithms.
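The restart schemes above can be sketched in a minimal form: run an accelerated first-order method for a fixed number of iterations, then reset its momentum and continue from the current point. Under a sharpness (quadratic growth) bound of the form f(x) - f* ≥ (γ/2) dist(x, X*)², scheduled restarts of this kind yield linear convergence. The problem, schedule length, and step size below are illustrative assumptions for a toy least-squares instance, not values from the paper.

```python
import numpy as np

def accelerated_gradient(grad, x0, step, iters):
    """Plain Nesterov acceleration for a smooth convex objective."""
    x, y = x0.copy(), x0.copy()
    t = 1.0
    for _ in range(iters):
        x_new = y - step * grad(y)
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        # Momentum extrapolation step
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def restarted_ag(grad, x0, step, inner_iters, restarts):
    """Scheduled restarts: momentum is reset every `inner_iters` iterations
    by calling the accelerated method afresh from the current iterate."""
    x = x0
    for _ in range(restarts):
        x = accelerated_gradient(grad, x, step, inner_iters)
    return x

# Toy sharp problem: consistent least squares with a tall random matrix,
# so the optimal value is 0 and quadratic growth holds.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 20))
b = A @ rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2          # smoothness constant of 0.5*||Ax-b||^2
grad = lambda x: A.T @ (A @ x - b)

x_hat = restarted_ag(grad, np.zeros(20), 1.0 / L, inner_iters=50, restarts=10)
print(np.linalg.norm(A @ x_hat - b))   # residual driven near zero
```

The restart length would ideally depend on the sharpness constant; here a fixed schedule suffices to illustrate the linear decrease of the residual across restart cycles.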

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1-32 |
| Number of pages | 32 |
| Journal | Information and Inference |
| Volume | 9 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1093/imaiai/iay020 |
| State | Published - Mar 1 2020 |

### All Science Journal Classification (ASJC) codes

- Computational Theory and Mathematics
- Analysis
- Applied Mathematics
- Statistics and Probability
- Numerical Analysis

### Keywords

- Error bounds
- Renegar's condition number
- Restart
- Sharpness
- Sparse recovery

## Cite this

*Computational complexity versus statistical performance on sparse recovery problems*. *Information and Inference*, *9*(1), 1-32. https://doi.org/10.1093/imaiai/iay020