Methodical Design and Trimming of Deep Learning Networks: Enhancing External BP Learning with Internal Omnipresent-supervision Training Paradigm

S. Y. Kung, Zejiang Hou, Yuchen Liu

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract

Back-propagation (BP) is now a classic learning paradigm whose source of supervision comes exclusively from the external (input/output) nodes. Consequently, BP is vulnerable to the curse of depth in (very) Deep Learning Networks (DLNs). This prompts us to advocate Internal Neuron's Learnability (INL), with (1) internal teacher labels (ITL) and (2) internal optimization metrics (IOM) for evaluating hidden layers/nodes. Conceptually, INL is a step beyond the notion of Internal Neuron's Explainability (INE), championed by DARPA's XAI (or AI 3.0). Practically, INL facilitates a structure/parameter NP-iterative learning scheme for (supervised) deep compression/quantization: simultaneously trimming hidden nodes and raising accuracy. Pursuant to our simulations, the NP-iteration appears to outperform several prominent pruning methods in the literature.
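The abstract does not spell out the NP-iteration itself, but the node-trimming half of such a prune-and-retrain loop can be illustrated with a generic magnitude-based pruning sketch. The scoring rule below (product of incoming and outgoing weight norms) and the `keep_ratio` parameter are illustrative assumptions, not the paper's internal optimization metrics:

```python
import numpy as np

def prune_hidden_nodes(W_in, W_out, keep_ratio):
    """Remove the weakest hidden nodes of a one-hidden-layer network.

    W_in  : (d_in, h) weights into the hidden layer
    W_out : (h, d_out) weights out of the hidden layer
    keep_ratio : fraction of hidden nodes to retain

    Nodes are scored by the product of the L2 norms of their incoming
    and outgoing weights -- a generic importance proxy, not the IOM
    proposed in the paper.
    """
    scores = np.linalg.norm(W_in, axis=0) * np.linalg.norm(W_out, axis=1)
    h = W_in.shape[1]
    k = max(1, int(round(keep_ratio * h)))
    keep = np.sort(np.argsort(scores)[-k:])   # indices of the k strongest nodes
    return W_in[:, keep], W_out[keep, :]

# Toy example: 4 inputs, 6 hidden nodes, 3 outputs; keep half the nodes.
rng = np.random.default_rng(0)
W_in = rng.normal(size=(4, 6))
W_out = rng.normal(size=(6, 3))
W_in_p, W_out_p = prune_hidden_nodes(W_in, W_out, keep_ratio=0.5)
print(W_in_p.shape, W_out_p.shape)  # (4, 3) (3, 3)
```

In an iterative scheme of the kind the abstract describes, a pruning step like this would alternate with parameter retraining, so that accuracy can recover (or improve) after each structural trim.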

Original language: English (US)
Title of host publication: 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 8058-8062
Number of pages: 5
ISBN (Electronic): 9781479981311
DOI: https://doi.org/10.1109/ICASSP.2019.8682208
State: Published - May 2019
Event: 44th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Brighton, United Kingdom
Duration: May 12, 2019 - May 17, 2019

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2019-May
ISSN (Print): 1520-6149

Conference

Conference: 44th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019
Country: United Kingdom
City: Brighton
Period: 5/12/19 - 5/17/19

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

Keywords

  • (supervised) deep compression/quantization
  • BPOS NP-iteration
  • Internal Learning
  • Internal Optimization Metrics (IOM)
  • structural-parameter learning


Cite this

Kung, S. Y., Hou, Z., & Liu, Y. (2019). Methodical Design and Trimming of Deep Learning Networks: Enhancing External BP Learning with Internal Omnipresent-supervision Training Paradigm. In 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Proceedings (pp. 8058-8062). [8682208] (ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings; Vol. 2019-May). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICASSP.2019.8682208