TY - JOUR
T1 - Representation formulas and pointwise properties for Barron functions
AU - E, Weinan
AU - Wojtowytsch, Stephan
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
PY - 2022/4
Y1 - 2022/4
N2 - We study the natural function space for infinitely wide two-layer neural networks with ReLU activation (Barron space) and establish different representation formulae. In two cases, we describe the space explicitly up to isomorphism. Using a convenient representation, we study the pointwise properties of two-layer networks and show that functions whose singular set is fractal or curved (for example, distance functions from smooth submanifolds) cannot be represented by infinitely wide two-layer networks with finite path-norm. We use this structure theorem to show that the only C^1-diffeomorphisms which preserve Barron space are affine. Furthermore, we show that every Barron function can be decomposed as the sum of a bounded and a positively one-homogeneous function and that there exist Barron functions which decay rapidly at infinity and are globally Lebesgue-integrable. This result suggests that two-layer neural networks may be able to approximate a greater variety of functions than commonly believed.
AB - We study the natural function space for infinitely wide two-layer neural networks with ReLU activation (Barron space) and establish different representation formulae. In two cases, we describe the space explicitly up to isomorphism. Using a convenient representation, we study the pointwise properties of two-layer networks and show that functions whose singular set is fractal or curved (for example, distance functions from smooth submanifolds) cannot be represented by infinitely wide two-layer networks with finite path-norm. We use this structure theorem to show that the only C^1-diffeomorphisms which preserve Barron space are affine. Furthermore, we show that every Barron function can be decomposed as the sum of a bounded and a positively one-homogeneous function and that there exist Barron functions which decay rapidly at infinity and are globally Lebesgue-integrable. This result suggests that two-layer neural networks may be able to approximate a greater variety of functions than commonly believed.
UR - http://www.scopus.com/inward/record.url?scp=85124417346&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124417346&partnerID=8YFLogxK
U2 - 10.1007/s00526-021-02156-6
DO - 10.1007/s00526-021-02156-6
M3 - Article
AN - SCOPUS:85124417346
SN - 0944-2669
VL - 61
JO - Calculus of Variations and Partial Differential Equations
JF - Calculus of Variations and Partial Differential Equations
IS - 2
M1 - 46
ER -