This "Virtual & Augmented Reality" collection showcases groundbreaking innovations from Stanford investigators creating immersive, interactive digital experiences. These technologies are poised to transform entertainment, healthcare, education, and industrial applications by improving user interaction, realism, and accessibility.

The portfolio encompasses advanced display technologies, such as metasurface displays for AR/VR headsets and resonant scanning designs for fast spatial sampling. It also highlights haptic interfaces, including wearable devices that simulate weight, stiffness, and grasping sensations for realistic interaction with virtual objects. Novel input systems add high-dimensional virtual keyboards, touch-free gesture control, and improved eye tracking for seamless user interfaces. Software innovations round out the collection with rapid modal sound synthesis, virtual-to-real alignment for AR applications, and brain-computer interfaces for immersive control.

Collectively, these innovations enable more realistic, accessible, and practical virtual and augmented reality systems for gaming, medical rehabilitation, remote collaboration, industrial training, and consumer applications. Key investigators driving these advances include Gordon Wetzstein, Mark Brongersma, Sean Follmer, Doug James, Krishna Shenoy, Olav Solgaard, and Sebastian Thrun.