Photonic Integrated Multi-Beam Light Engine for Augmented Reality Displays and LiDAR Systems


Sensors, Devices and Components

Ref. No.: 1401-6617-WT

This invention introduces a compact, high-performance light engine based on visible-light photonic integrated circuits (PICs), designed for augmented reality (AR) glasses and LiDAR systems. The technology integrates RGB and near-infrared (NIR) lasers on a chip, guiding light through waveguides to emit highly controlled beams via edge couplers. These beams are then collimated and scanned by a microelectromechanical systems (MEMS) mirror in a double-pass configuration. This chip-scale approach enables multi-beam, high-resolution laser scanning with significantly reduced size, power consumption, and complexity. Because the PIC platform is transparent over a broad wavelength range, it can operate at visible and NIR wavelengths simultaneously. The same photonic platform supports both display and depth-sensing functionality, marking a crucial step toward lightweight, integrated AR systems with real-time environmental awareness.

Background

Conventional AR glasses and LiDAR systems rely on discrete optical components (lasers, mirrors, lenses) arranged in complex, bulky assemblies. This setup limits miniaturization, increases manufacturing cost, and imposes performance constraints such as a restricted field of view (FOV), slow scanning speeds, and limited resolution. Laser-scanning systems offer advantages in brightness and contrast but are hindered by trade-offs between mirror size, scanning frequency, and laser modulation rates. Multi-beam scanning improves resolution but typically requires more lasers, further complicating integration. In LiDAR, conventional designs struggle with packaging and optical losses, especially in compact mobile or wearable form factors. A fully integrated photonic solution is therefore highly desirable for both AR and sensing applications.
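As a rough illustration of these trade-offs (using assumed example numbers, not parameters of this invention), the number of resolvable spots per scan line of a mirror scanner is approximately the optical scan angle divided by the beam's diffraction angle, so achievable resolution is tied directly to mirror aperture; the sketch below makes that coupling, and the resulting single-beam modulation rate, explicit.

```python
import math

# Illustrative scanning-display trade-off (all values assumed, not taken from
# the invention): a mirror of aperture D scanning an optical angle theta can
# resolve roughly N ~ theta * D / lambda spots per line.
wavelength = 532e-9                     # green laser wavelength in meters
mirror_diameter = 1.0e-3                # MEMS mirror aperture in meters (assumed)
optical_scan_angle = math.radians(40)   # full optical scan angle (assumed)

resolvable_spots = optical_scan_angle * mirror_diameter / wavelength
print(f"Diffraction-limited spots per scan line: {resolvable_spots:,.0f}")

# Required laser modulation rate for a single-beam raster at 60 Hz with 1200 lines
lines, frame_rate = 1200, 60
pixel_rate = resolvable_spots * lines * frame_rate
print(f"Single-beam pixel rate: {pixel_rate / 1e6:,.0f} MHz")
```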

Technology

The invention centers on a photonic integrated circuit (PIC) that integrates red, green, blue (RGB), and near-infrared (NIR) lasers directly onto a silicon-based chip. These lasers are coupled to waveguides fabricated from materials such as silicon nitride (SiN) or aluminum oxide, which guide the light with low loss across the chip. The waveguides terminate at precision-engineered edge couplers, located on a thin, suspended bridge region of the chip. These couplers emit multiple collimated beams into free space, each beam addressing a unique portion of the field of view. A two-axis MEMS mirror then scans these beams, and after passing twice through a lens system (double-pass configuration), the beams are directed into an optical waveguide combiner for display or reflected back for detection in LiDAR mode.
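For intuition, a simplified single-pass model (with assumed values, not the actual double-pass optical design) treats each edge coupler as sitting in the focal plane of the collimating lens, so a coupler at lateral offset x emits a collimated beam at an angle of roughly arctan(x/f) to the optical axis; the sketch below estimates the resulting beam fan for a hypothetical row of couplers.

```python
import math

# Simplified single-pass model (assumed values): edge couplers on the chip facet
# sit in the focal plane of a collimating lens, so a coupler at lateral offset x
# emits a collimated beam at angle arctan(x / f) relative to the optical axis.
focal_length_mm = 5.0     # collimating lens focal length (assumed)
coupler_pitch_um = 50.0   # spacing between adjacent edge couplers (assumed)
num_beams = 8             # number of parallel output beams (assumed)

for i in range(num_beams):
    offset_mm = (i - (num_beams - 1) / 2) * coupler_pitch_um * 1e-3
    angle_deg = math.degrees(math.atan2(offset_mm, focal_length_mm))
    print(f"beam {i}: offset {offset_mm:+.3f} mm -> angle {angle_deg:+.2f} deg")
```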

Figure 1: Schematic of a laser-scanning AR display: Modulated RGB laser beams are combined and scanned by a 2-axis MEMS mirror, then directed via relay optics into a waveguide combiner. The waveguide projects collimated light to the eye, forming an image based on beam angles.

Figure 2: Front view of a waveguide combiner: RGB light enters via the input coupler, propagates by total internal reflection, and is replicated by pupil elements before exiting through the output coupler toward the eye.

This configuration supports the emission of multiple beams in parallel, significantly boosting resolution and scan rate without increasing the number of laser sources. Additional on-chip elements include optical modulators for high-speed intensity modulation and optical switches for beam multiplexing. The architecture is optimized for low crosstalk, minimal optical aberration, and alignment-free packaging. Furthermore, the platform enables dual functionality, combining simultaneous AR image projection and 3D environmental sensing over shared optical paths with chip-level integration, which significantly reduces size, weight, and system complexity.
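To see how parallel beams relax the scanning and modulation requirements, consider an illustrative raster budget (all numbers assumed, not specifications of the system): if N beams share the frame, each beam paints only 1/N of the lines, so both the per-laser pixel clock and the MEMS fast-axis line rate drop by roughly a factor of N.

```python
# Illustrative resolution/rate budget (all numbers assumed, not system specs):
# N beams share the raster, so each laser and the MEMS fast axis only need to
# cover 1/N of the lines per frame.
h_pixels, v_lines, frame_rate = 1920, 1200, 60

def raster_budget(num_beams: int) -> tuple[float, float]:
    """Return (per-beam pixel clock in MHz, MEMS fast-axis line rate in kHz)."""
    lines_per_beam = v_lines / num_beams
    pixel_clock_mhz = h_pixels * lines_per_beam * frame_rate / 1e6
    # Bidirectional scan: one mirror period draws two lines per beam.
    line_rate_khz = lines_per_beam * frame_rate / 2 / 1e3
    return pixel_clock_mhz, line_rate_khz

for n in (1, 4, 8):
    clock, rate = raster_budget(n)
    print(f"{n} beam(s): pixel clock {clock:6.1f} MHz, fast-axis {rate:5.1f} kHz")
```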

Advantages

  • Compact Integration: Combines lasers, optics, and waveguides on a single chip, minimizing system volume.
  • High Resolution: Multi-beam scanning increases image sharpness and FOV without requiring ultra-fast mirrors or modulators.
  • Dual Functionality: Supports both AR display and LiDAR sensing in one platform, reducing hardware redundancy.
  • Efficient Optics: Double-pass lens design simplifies alignment and reduces the number of components.
  • Scalable Manufacturing: Compatible with wafer-scale processes for cost-effective mass production.

Potential Applications

  • AR/MR Glasses: Compact light engines for high-resolution visual overlays.
  • Mobile Devices: Depth sensing and 3D mapping in smartphones or tablets.
  • Automotive LiDAR: Miniaturized modules for navigation and object detection.
  • Robotics: Integrated vision and distance measurement for smart automation.
  • Medical Devices: AR-assisted diagnostics and surgical guidance tools.


Contact Person

Senior Patent & Licensing Manager

PD Dr. Wolfgang Tröger

Physicist (Diplom-Physiker)

Phone: 089 / 29 09 19-27
E-mail: troeger@max-planck-innovation.de