Silicon Photonics for Training Deep Neural Networks

Bhavin J. Shastri, Matthew J. Filipovich, Zhimu Guo, Paul R. Prucnal, Sudip Shekhar, Volker J. Sorger

Research output: Contribution to journal › Conference article › peer-review


Analog photonic networks used as deep learning hardware accelerators are currently trained on standard digital electronics. We propose on-chip training of neural networks enabled by a silicon photonic architecture for parallel, efficient, and fast data operations.
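The abstract's idea of training directly on analog hardware can be illustrated numerically. The sketch below is a minimal, hypothetical model (not the paper's architecture): `photonic_matvec` stands in for an analog photonic matrix-vector product, with additive Gaussian noise as a stand-in for optical noise, and the weights are updated in situ with a simple LMS-style gradient rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def photonic_matvec(W, x, noise_std=0.01):
    """Toy model of an analog photonic matrix-vector product.
    Additive Gaussian noise is an illustrative stand-in for
    shot/thermal noise in the optical hardware (assumption)."""
    return W @ x + noise_std * rng.standard_normal(W.shape[0])

# Toy task: learn a 2x2 linear map using only noisy analog products,
# mimicking in-situ (on-chip) training rather than offline digital training.
W_true = np.array([[1.0, -0.5],
                   [0.3,  0.8]])
W = np.zeros((2, 2))
lr = 0.1  # learning rate

for _ in range(500):
    x = rng.standard_normal(2)
    y_target = W_true @ x              # desired output
    y = photonic_matvec(W, x)          # noisy analog forward pass
    err = y - y_target
    W -= lr * np.outer(err, x)         # in-situ gradient (LMS) update
```

Despite the analog noise in every forward pass, the weights converge close to the target map, which is the basic appeal of training in the analog domain itself.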

Original language: English (US)
Journal: Optics InfoBase Conference Papers
State: Published - 2022
Event: 2022 Conference on Lasers and Electro-Optics Pacific Rim, CLEO/PR 2022 - Sapporo, Japan
Duration: Aug 31 2022 - Sep 5 2022

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Mechanics of Materials
