TY - GEN
T1 - Speeding up Computational Morphogenesis with Online Neural Synthetic Gradients
AU - Zhang, Yuyu
AU - Chi, Heng
AU - Chen, Binghong
AU - Tang, Tsz Ling Elaine
AU - Mirabella, Lucia
AU - Song, Le
AU - Paulino, Glaucio H.
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints. These PDE-constrained optimization problems are typically solved with a standard discretize-then-optimize approach. In many industrial applications that require high-resolution solutions, the discretized constraints can easily have millions or even billions of variables, making it very slow for the standard iterative optimizer to compute the exact gradients. In this work, we propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme. We successfully apply our ONSG framework to computational morphogenesis, a representative and challenging class of PDE-constrained optimization problems. Extensive experiments have demonstrated that our method can significantly speed up computational morphogenesis (also known as topology optimization) while maintaining the quality of the final solution compared to the standard optimizer. On a large-scale 3D optimal design problem with around 1,400,000 design variables, our method achieves up to 7.5x speedup while producing optimized designs with comparable objectives.
AB - A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints. These PDE-constrained optimization problems are typically solved with a standard discretize-then-optimize approach. In many industrial applications that require high-resolution solutions, the discretized constraints can easily have millions or even billions of variables, making it very slow for the standard iterative optimizer to compute the exact gradients. In this work, we propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme. We successfully apply our ONSG framework to computational morphogenesis, a representative and challenging class of PDE-constrained optimization problems. Extensive experiments have demonstrated that our method can significantly speed up computational morphogenesis (also known as topology optimization) while maintaining the quality of the final solution compared to the standard optimizer. On a large-scale 3D optimal design problem with around 1,400,000 design variables, our method achieves up to 7.5x speedup while producing optimized designs with comparable objectives.
KW - PDE-constrained optimization
KW - computational morphogenesis
KW - deep learning
KW - neural synthetic gradients
UR - http://www.scopus.com/inward/record.url?scp=85116408132&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85116408132&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9533789
DO - 10.1109/IJCNN52387.2021.9533789
M3 - Conference contribution
AN - SCOPUS:85116408132
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -