TY - GEN
T1 - Evolving transferable neural pruning functions
AU - Liu, Yuchen
AU - Kung, S. Y.
AU - Wentzlaff, David
N1 - Publisher Copyright:
© 2022 Owner/Author.
PY - 2022/7/8
Y1 - 2022/7/8
N2 - Structural design of neural networks is crucial for the success of deep learning. While most prior works in evolutionary learning aim at directly searching the structure of a network, few attempts have been made on another promising track, channel pruning, which has recently made major headway in designing efficient deep learning models. In fact, prior pruning methods adopt human-made pruning functions to score a channel's importance for channel pruning, which requires domain knowledge and could be sub-optimal. To this end, we pioneer the use of genetic programming (GP) to discover strong pruning metrics automatically. Specifically, we craft a novel design space to express high-quality and transferable pruning functions, which ensures an end-to-end evolution process where no manual modification of the evolved functions is needed for transferability after evolution. Unlike prior methods, our approach provides both compact pruned networks for efficient inference and novel closed-form pruning metrics that are mathematically explainable and thus generalizable to different pruning tasks. While the evolution is conducted on small datasets, our functions show promising results when applied to more challenging datasets different from those used in the evolution process. For example, on ILSVRC-2012, an evolved function achieves state-of-the-art pruning results.
AB - Structural design of neural networks is crucial for the success of deep learning. While most prior works in evolutionary learning aim at directly searching the structure of a network, few attempts have been made on another promising track, channel pruning, which has recently made major headway in designing efficient deep learning models. In fact, prior pruning methods adopt human-made pruning functions to score a channel's importance for channel pruning, which requires domain knowledge and could be sub-optimal. To this end, we pioneer the use of genetic programming (GP) to discover strong pruning metrics automatically. Specifically, we craft a novel design space to express high-quality and transferable pruning functions, which ensures an end-to-end evolution process where no manual modification of the evolved functions is needed for transferability after evolution. Unlike prior methods, our approach provides both compact pruned networks for efficient inference and novel closed-form pruning metrics that are mathematically explainable and thus generalizable to different pruning tasks. While the evolution is conducted on small datasets, our functions show promising results when applied to more challenging datasets different from those used in the evolution process. For example, on ILSVRC-2012, an evolved function achieves state-of-the-art pruning results.
KW - Channel Pruning
KW - Convolutional Neural Network
KW - Efficient Deep Learning
KW - Genetic Programming
UR - http://www.scopus.com/inward/record.url?scp=85135204355&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85135204355&partnerID=8YFLogxK
U2 - 10.1145/3512290.3528694
DO - 10.1145/3512290.3528694
M3 - Conference contribution
AN - SCOPUS:85135204355
T3 - GECCO 2022 - Proceedings of the 2022 Genetic and Evolutionary Computation Conference
SP - 385
EP - 394
BT - GECCO 2022 - Proceedings of the 2022 Genetic and Evolutionary Computation Conference
PB - Association for Computing Machinery, Inc
T2 - 2022 Genetic and Evolutionary Computation Conference, GECCO 2022
Y2 - 9 July 2022 through 13 July 2022
ER -