
Docket #: S22-140

Efficient Analog Backpropagation Training Architecture for Photonic Neural Networks

Stanford researchers design and demonstrate a novel in situ backpropagation training algorithm for photonic implementations of neural networks. Backpropagation is the standard algorithm for training neural networks, but its implementation in optical devices had not previously been established, even though photonic accelerators for hybrid neural networks address the growing energy demands of ML inference tasks. The inventors develop an optical setup and experimental platform that tests gradients in a 6x6 bidirectional network of photonic Mach-Zehnder interferometers and can measure power at all intermediate points in the photonic circuit via an IR camera. They demonstrate proof of concept by training the computationally intensive linear portion of the photonic neural network in situ, while a computer performs all remaining, computationally inexpensive differentiation.

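To illustrate the core idea, below is a minimal numerical sketch of in situ backpropagation on a simulated interferometer mesh: the gradient with respect to every phase shifter is read off from the interference of the forward-propagating field with a backward-propagated error ("adjoint") field, which on chip would correspond to the intermediate power measurements captured by the IR camera. The mesh model, layer count, and variable names here are illustrative assumptions, not the actual device layout.

```python
# Hypothetical simulation of in situ backpropagation on a phase-shifter mesh.
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    # Fixed "coupler" layer: a random unitary via QR decomposition.
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(a)
    return q

n_modes, n_layers = 6, 3          # 6 modes, echoing the 6x6 demonstration
couplers = [random_unitary(n_modes) for _ in range(n_layers + 1)]
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_layers, n_modes))

def forward(x, phases):
    """Propagate x through the mesh, recording the field after each shifter bank."""
    after_shifters = []
    f = couplers[0] @ x
    for j in range(n_layers):
        f = np.exp(1j * phases[j]) * f        # diagonal phase-shifter bank j
        after_shifters.append(f)
        f = couplers[j + 1] @ f
    return f, after_shifters

x = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)
target = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)

y, fwd = forward(x, phases)
err = y - target                              # loss L = ||y - target||^2

# "In situ" gradient: send the error backward through the mesh (adjoint field)
# and read the interference of forward and adjoint fields at every shifter.
# On chip, this interference term is recovered from power measurements.
grad = np.zeros_like(phases)
g = err
for j in reversed(range(n_layers)):
    g = couplers[j + 1].conj().T @ g          # adjoint pass through coupler j+1
    grad[j] = 2.0 * np.imag(g * fwd[j].conj())
    g = np.exp(-1j * phases[j]) * g           # adjoint pass through shifter bank j

# Finite-difference check that the interference expression gives dL/dphi.
eps = 1e-6
p = phases.copy()
p[1, 2] += eps
y_eps, _ = forward(x, p)
fd = (np.linalg.norm(y_eps - target) ** 2 - np.linalg.norm(err) ** 2) / eps
print(grad[1, 2], fd)                         # should agree to several decimal places
```

The finite-difference check at the end confirms that the field-interference expression reproduces the true gradient, without any digital backpropagation through the linear mesh itself.
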
Stage of Development
Prototype: the inventors demonstrate the protocol on a foundry-manufactured photonic integrated circuit using a single-layer photonic network

Applications

  • Photonic matrix multiply accelerator devices for machine learning/AI-based data centers
  • Adaptive optics and photonics for LIDAR and sensing
  • Photonic circuit sensitivity analysis
  • Error correction of photonic circuits

Advantages

  • Analog gradient updates improve energy efficiency by avoiding analog-to-digital conversion in training neural networks
  • Only linear optical devices require analog gradient measurement
  • Can be flexibly incorporated into popular automatic-differentiation packages (see the sketch after this list)

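Because the analog measurement only replaces the gradient of the linear optical layer, it can be exposed to an automatic-differentiation package as a custom backward rule, while the inexpensive digital parts (nonlinearities, loss) are differentiated as usual. Below is a hedged PyTorch sketch of this integration pattern; the on-chip forward and backward passes are simulated in software here, and the two marked tensor expressions stand in for hypothetical hardware driver calls.

```python
# Illustrative sketch: plugging an analog photonic layer into PyTorch autograd.
import torch

class PhotonicLinear(torch.autograd.Function):
    """Linear layer whose forward/backward would run on the photonic chip."""

    @staticmethod
    def forward(ctx, x, w):
        ctx.save_for_backward(x, w)
        return x @ w                  # stand-in for the analog matrix multiply

    @staticmethod
    def backward(ctx, grad_out):
        x, w = ctx.saved_tensors
        # On hardware, these gradients would come from the in situ
        # interference measurement; the expressions are software stand-ins.
        return grad_out @ w.T, x.T @ grad_out

x = torch.randn(8, 6)
target = torch.randn(8, 6)
w = torch.randn(6, 6, requires_grad=True)

for step in range(200):
    y = torch.tanh(PhotonicLinear.apply(x, w))   # digital nonlinearity: autodiffed
    loss = ((y - target) ** 2).mean()
    loss.backward()                              # analog gradient lands in w.grad
    with torch.no_grad():
        w -= 0.1 * w.grad
        w.grad.zero_()
```

In this pattern only the linear layer's gradient is sourced from the analog measurement, so a real integration would substitute chip driver calls for the two expressions in backward while leaving the rest of the training loop unchanged.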