## Abstract

We study the complexity of learning and approximating self-bounding functions over the uniform distribution on the Boolean hypercube {0,1}^{n}. Informally, a function f:{0,1}^{n}→R is self-bounding if, for every x∈{0,1}^{n}, f(x) upper bounds the sum of all n marginal decreases in the value of the function at x. Self-bounding functions include such well-known classes as submodular and fractionally subadditive (XOS) functions. They were introduced by Boucheron et al. (2010) in the context of concentration-of-measure inequalities. Our main result is a nearly tight ℓ_{1}-approximation of self-bounding functions by low-degree juntas: every self-bounding function can be ϵ-approximated in ℓ_{1} by a polynomial of degree O˜(1/ϵ) over 2^{O˜(1/ϵ)} variables. We show that both the degree and the junta size are optimal up to logarithmic factors. Previous techniques considered the stronger ℓ_{2} approximation and proved nearly tight bounds of Θ(1/ϵ^{2}) on the degree and 2^{Θ(1/ϵ^{2})} on the number of variables. Our bounds rely on an analysis of the noise stability of self-bounding functions together with a stronger connection between noise stability and ℓ_{1} approximation by low-degree polynomials. This technique also yields tighter bounds on ℓ_{1} approximation by low-degree polynomials and a faster learning algorithm for halfspaces. These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of self-bounding functions relative to the uniform distribution. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity n^{Θ˜(1/ϵ)}.
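The defining condition from the abstract (each marginal decrease lies in [0,1] and their sum is bounded by f(x)) can be checked by brute force on small cubes. The following sketch is illustrative only and not from the paper; the helper name `is_self_bounding` is our own.

```python
from itertools import product

def is_self_bounding(f, n, tol=1e-9):
    """Brute-force check on {0,1}^n: f is self-bounding if, for every x,
    each marginal decrease is in [0,1] and their sum is at most f(x).
    (Illustrative helper; exponential in n, so only for tiny examples.)"""
    for x in product((0, 1), repeat=n):
        fx = f(x)
        total = 0.0
        for i in range(n):
            y = list(x)
            y[i] = 1 - y[i]                      # flip coordinate i
            dec = max(0.0, fx - f(tuple(y)))     # marginal decrease at i
            if dec > 1 + tol:                    # each decrease must be <= 1
                return False
            total += dec
        if total > fx + tol:                     # sum of decreases <= f(x)
            return False
    return True

# f(x) = number of ones is self-bounding: the marginal decreases
# at x sum to exactly f(x).
assert is_self_bounding(lambda x: sum(x), 4)

# f(x) = 2*|x| is not: each marginal decrease equals 2 > 1.
assert not is_self_bounding(lambda x: 2 * sum(x), 4)
```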

| Original language | English (US) |
|---|---|
| Pages (from-to) | 86-98 |
| Number of pages | 13 |
| Journal | Theoretical Computer Science |
| Volume | 808 |
| DOIs | |
| State | Published - Feb 12 2020 |
| Externally published | Yes |

## All Science Journal Classification (ASJC) codes

- Theoretical Computer Science
- General Computer Science

## Keywords

- Fourier analysis
- Noise stability
- PAC learning
- Polynomial approximation
- Submodular function
- XOS function

## Fingerprint

Research topics of 'Tight bounds on ℓ_{1} approximation and learning of self-bounding functions'.