## Abstract

We study the complexity of the entire regularization path for least squares regression with an ℓ₁-norm penalty, known as the Lasso. Each regression coefficient of the Lasso is a piecewise-linear function of the regularization value, and the number of linear pieces is regarded as the Lasso's complexity. Experimental results using exact path-following algorithms exhibit polynomial complexity of the Lasso in the problem size. Alas, the path complexity of the Lasso on artificially designed regression problems is exponential. We use smoothed analysis as a mechanism for bridging the gap between worst-case settings and the de facto low complexity. Our analysis assumes that the observed data has a tiny amount of intrinsic noise. We then prove that the Lasso's complexity is polynomial in the problem size.
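
To make the notion of path complexity concrete, here is a minimal sketch (not part of the paper) that follows the exact Lasso path with scikit-learn's `lars_path` homotopy solver and counts its linear pieces; the problem size, noise level, and random seed are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Synthetic least-squares problem with a small amount of additive noise,
# mirroring the "tiny intrinsic noise" assumption of the smoothed analysis.
rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.01 * rng.standard_normal(n)

# method="lasso" makes lars_path follow the exact piecewise-linear Lasso
# path; each value in `alphas` is a breakpoint of the regularization value
# where the active set, and hence the slope of the coefficients, changes.
alphas, active, coefs = lars_path(X, y, method="lasso")

# Path complexity: the number of linear pieces between breakpoints.
print(f"breakpoints: {len(alphas)}, linear pieces: {len(alphas) - 1}")
```

On instances like this, the segment count stays small relative to the problem size, consistent with the polynomial behavior the abstract reports for noisy data.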

Original language | English (US)
---|---
Title of host publication | 35th International Conference on Machine Learning, ICML 2018
Editors | Jennifer Dy, Andreas Krause
Publisher | International Machine Learning Society (IMLS)
Pages | 4720-4728
Number of pages | 9
Volume | 7
ISBN (Electronic) | 9781510867963
State | Published - Jan 1 2018
Externally published | Yes
Event | 35th International Conference on Machine Learning, ICML 2018, Stockholm, Sweden. Duration: Jul 10 2018 → Jul 15 2018

### Other

Other | 35th International Conference on Machine Learning, ICML 2018
---|---
Country | Sweden
City | Stockholm
Period | 7/10/18 → 7/15/18

## All Science Journal Classification (ASJC) codes

- Computational Theory and Mathematics
- Human-Computer Interaction
- Software