Augmented/Virtual Reality Technology Portfolio

Enhanced Eyebox Expansion for Maxwellian View AR Displays (Case No. 2024-163)

Augmented reality (AR) waveguide displays have transformed the way users interact with virtual content overlaid on the real world. High-quality, responsive AR systems have diverse applications across industries ranging from education and healthcare to industrial training and gaming. AR system development has focused on the quality and adaptability of the "eyebox," the region in which users can observe virtual images with high clarity and brightness. Maxwellian view displays, a standard optical platform for AR systems, create images that remain in focus at all viewing distances. They are also lightweight, making them well suited to head-mounted AR systems. However, Maxwellian view displays have a limited eyebox, restricting their widespread adoption. There is an unmet need to expand the eyebox of these displays to realize the potential of AR platforms across numerous industries.

Researchers led by Professor Chee Wei Wong in the Department of Electrical & Computer Engineering have developed a metasurface glass system that enables precise modulation of the eyebox in a Maxwellian view display. The system's metasurface can be produced with standard silicon foundry fabrication technology, enabling scalable manufacturing. The metasurface steers beams of light directly toward the user's eyes, allowing the size of the eyebox to be dynamically expanded with high resolution. By increasing eyebox size without compromising image quality, this technology has the potential to be transformative across AR applications including immersive entertainment, medical training, industrial and aerospace settings, and surgical procedures.
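To make the beam-steering principle concrete, the sketch below estimates the deflection angle and first-order phase-gradient period needed to follow a laterally shifted pupil. It is a minimal illustration assuming an idealized phase-gradient metasurface; the wavelength, eye relief, and function names are assumptions, not details of the inventors' design.

```python
import math

# Illustrative sketch only: first-order beam steering for eyebox expansion,
# assuming an idealized phase-gradient metasurface (not the inventors' design).

WAVELENGTH_UM = 0.532   # green light, assumed for illustration
EYE_RELIEF_MM = 20.0    # assumed distance from metasurface to pupil plane

def steering_angle_deg(pupil_offset_mm: float) -> float:
    """Angle needed to redirect the converging beam to a laterally shifted pupil."""
    return math.degrees(math.atan2(pupil_offset_mm, EYE_RELIEF_MM))

def grating_period_um(theta_deg: float) -> float:
    """Phase-gradient period giving first-order deflection: sin(theta) = lambda / period."""
    return WAVELENGTH_UM / math.sin(math.radians(theta_deg))

for offset in (1.0, 3.0, 5.0):  # pupil shifts in mm across the eyebox
    theta = steering_angle_deg(offset)
    print(f"pupil offset {offset} mm -> steer {theta:.1f} deg, "
          f"period {grating_period_um(theta):.2f} um")
```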

Holographic Metasurface Grating Elements for Augmented and Virtual Reality (UCLA Case No. 2022-320)

Chee Wei Wong and his team have developed curved holographic aligned nonlinear grating elements (CHANGE) for use in high-resolution augmented and virtual reality. The one-dimensional shape of the optical element allows high-efficiency diffraction, while the curved design of the device permits high-resolution, off-plane diffraction. Because CHANGE is modular, the device can be customized to the properties of the light source, ensuring that various wavelengths are diffracted with the desired resolution and in the correct direction. These advances in grating element design are critical for the development of next-generation, high-resolution, high-performance AR and VR devices. Potential applications include AR/VR devices, holography, medical imaging, and color balance correction.

CMOS-Compatible Single-Layer Waveguide Display for Augmented and Virtual Reality (Case No. 2019-025)

With reality-augmenting devices becoming ubiquitous, optical focusing components are essential to the modern gaming and electronics industries. These components are built from optical elements and layered glass waveguides, with each layer interacting with a specific wavelength of light. While effective, such components are difficult to manufacture and suffer from suboptimal image resolution because waveguide diffraction patterns are not selective to a single wavelength. New methods are needed to fabricate optical focusing components more efficiently and at higher resolution to meet the growing demand for reality-augmenting devices. UCLA researchers have developed a high-resolution optical focusing component that is easy to produce. The component consists of a diffractive optical element with a single-layer waveguide and can be fabricated using CMOS-compatible deposition, lithography, and etching techniques. In addition, using a single-layer waveguide eliminates the potential for diffraction interference between layers, increasing resolution; several prototype devices have demonstrated input and output at 1080p.
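As an illustration of why a single layer can carry multiple colors, the following sketch checks the total internal reflection (TIR) condition for grating-coupled red, green, and blue light in one glass layer. The refractive index, grating period, and helper names are assumed for illustration and do not describe the UCLA component itself.

```python
import math

# Minimal sketch (not the UCLA design): check which wavelengths a single
# surface-relief grating traps by total internal reflection (TIR) in one
# glass layer. All parameter values are assumed for illustration.

N_GLASS = 1.8          # assumed high-index waveguide substrate
PERIOD_NM = 380.0      # assumed grating period
ORDER = 1              # first diffraction order

def guided_angle_deg(wavelength_nm: float) -> float | None:
    """In-guide angle for normal incidence; None if the order is evanescent."""
    s = ORDER * wavelength_nm / (N_GLASS * PERIOD_NM)
    if s >= 1.0:
        return None  # order does not propagate inside the glass
    return math.degrees(math.asin(s))

critical = math.degrees(math.asin(1.0 / N_GLASS))  # TIR critical angle
for wl in (460, 530, 620):  # blue, green, red wavelengths in nm
    ang = guided_angle_deg(wl)
    if ang is None:
        print(f"{wl} nm: first order is evanescent in the glass")
    else:
        print(f"{wl} nm: in-guide angle {ang:.1f} deg, "
              f"guided by TIR: {ang > critical} (critical {critical:.1f} deg)")
```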

Using Virtual Reality to Diagnose and Treat Neurological and Neuropsychiatric Disorders (Case No. 2020-498)

Renowned UCLA professor Mayank Mehta has developed a method that uses virtual reality (VR) not only to diagnose and treat neurological disorders but also to aid the development of new, more effective pharmaceutical treatments. As a treatment, these VR-based methods are completely noninvasive and less expensive than most conventional therapies, and they can be tailored to the severity of each patient's disorder. The method can also improve the development of new pharmaceutical treatments by more accurately representing human neurological activity. The invention provides new opportunities for patients living with neurological disorders, as it enables novel, safer, and more effective treatments. The inventor also hypothesizes that this platform may be used to effectively test Alzheimer's drugs that were previously shelved in preclinical trials. The proposed method can thus leverage advances in VR and AI to quickly screen drugs for efficacy in human trials.

Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions (Case No. 2022-071)

Researchers at UCLA have developed software for a freehand AR/VR interface that uses the hands themselves as objects within the virtual environment. The user mimics an object by changing the orientation of their fingers, and the software interprets the pose so that the hands act as that object, enabling intuitive interaction with the virtual environment. The software eliminates the need for physical controllers and cumbersome gloves, reducing cost and lowering barriers to widespread adoption.
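A minimal sketch of the underlying idea follows: classify a tracked hand pose as the object it imitates by matching per-finger flexion angles against stored templates. The templates, threshold, and function names are hypothetical illustrations, not UCLA's actual software.

```python
import numpy as np

# Hypothetical sketch of the core idea (not UCLA's actual software): classify
# a tracked hand pose as the object it imitates via nearest-template matching
# on per-finger flexion angles (thumb..pinky, in degrees).

POSE_TEMPLATES = {            # assumed, illustrative templates
    "scissors": np.array([160.0, 10.0, 10.0, 150.0, 150.0]),
    "gun":      np.array([20.0, 10.0, 150.0, 160.0, 160.0]),
    "phone":    np.array([10.0, 150.0, 150.0, 150.0, 10.0]),
}

def interpret_pose(flexion_deg: np.ndarray, max_dist: float = 80.0) -> str | None:
    """Return the imitated object, or None if no template is close enough."""
    best, best_d = None, float("inf")
    for name, template in POSE_TEMPLATES.items():
        d = float(np.linalg.norm(flexion_deg - template))
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_dist else None

# Example: index and middle extended, others curled -> "scissors"
print(interpret_pose(np.array([150.0, 15.0, 20.0, 145.0, 155.0])))
```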

Demonstration Videos: Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions

Wireless and Programmable Recording and Stimulation of Deep Brain Activity in Freely Moving Humans Immersed in Virtual, Augmented and Real-World Environments (Case No. 2020-431)

Neuroprosthetics wirelessly record and stimulate deep brain activity in humans to treat epilepsy, movement disorders (e.g., Parkinson's disease), and other neuropsychiatric disorders. These systems, however, are designed for treatment rather than research, limiting their use to existing treatment and therapy schemes. Developing new and improved therapies requires neuroprosthetics with programmable control and integration with external biosensors and virtual and augmented reality peripherals. UCLA researchers have developed a first-of-its-kind platform for studying deep brain mechanisms and testing deep brain stimulation therapies. This lightweight platform (~9 lbs) wirelessly records and stimulates brain activity in freely moving humans while integrating with wearables and virtual/augmented reality (VR/AR) technologies. Combining VR/AR with external measurements (e.g., heart rate, skin conductance, respiration, eye tracking, and scalp EEG) provides a more accurate environment for understanding the neural mechanisms underlying human behavior.

A Scalable and Tunable Haptic Array Based on Dielectric Elastomer Actuators in a Patch-Like Form Factor (Case No. 2023-203)

Virtual reality (VR) is advancing rapidly and is projected to grow into an industry worth over $200 billion by 2029. The success of virtual and augmented reality (AR) depends on creating a truly immersive environment for the user. Realistic visual stimuli are now achievable, but the other virtual senses lag behind. Haptic feedback, for example, is a key component of immersion, providing the tactile sensations that let users truly interact with a virtual environment. However, advances in haptic devices have not kept pace with VR/AR development. Conventional haptics use mechanically complex actuators that are limited in their force and displacement output. A device with limited output may not accurately reproduce the sensation a user expects, diminishing the realism of the VR experience. Current devices also tend to rely on auxiliary equipment, hindering the seamless integration of haptic feedback into VR/AR systems. A wearable haptic device that can fully replicate tactile feedback would bring virtual reality systems to the next level of realism.

Qibing Pei and his team have developed a haptic feedback array in a patch-like form factor that delivers higher force output and displacement than traditional haptics without the need for secondary operational equipment, enhancing the overall user experience and enabling more realistic sensations. The haptic devices can be fabricated in different shapes and sizes to create different feedback systems without any increase in manufacturing complexity, while remaining lightweight and wearable. The underlying novel elastomeric material enables large, tunable ranges of critical tactile metrics such as device deformability and force output, both of which reach significantly higher values than in current haptic devices.
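For context on how dielectric elastomer actuators generate force, the sketch below evaluates the standard Maxwell-pressure relation p = ε0·εr·(V/d)². All material and drive values are assumed for illustration and are not the team's measured data.

```python
# Illustrative estimate (assumed values, not the team's material data):
# electrostatic (Maxwell) pressure driving a dielectric elastomer actuator,
# p = eps0 * eps_r * (V / d)^2.

EPS0 = 8.854e-12        # vacuum permittivity, F/m
eps_r = 4.0             # assumed relative permittivity of the elastomer
voltage = 3000.0        # assumed drive voltage, V
thickness = 50e-6       # assumed film thickness, m

field = voltage / thickness                    # electric field, V/m
pressure_kpa = EPS0 * eps_r * field**2 / 1e3   # actuation pressure, kPa
print(f"Maxwell pressure: {pressure_kpa:.0f} kPa at {field:.1e} V/m")
```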

A Wearable On-Eyelid Sensor Network for Vestibular-Ocular Reflex Assessment (Case No. 2023-166)

Professor Jun Chen and his colleagues have developed a non-invasive sensor for vestibular-ocular reflex (VOR) assessment. It is the first comprehensive, wearable eye tracking system providing high-fidelity, multi-indicator monitoring. The sensor has an ultrathin structure (~80 μm) and human skin-like mechanical softness (tens of kPa to ~1 MPa), and it is self-powered, waterproof, biocompatible, and highly sensitive. This meticulously designed sensor array demonstrates a groundbreaking approach to diagnosing vestibular disorders by continuously tracking eye movements. It offers quantitative analysis of spatiotemporal eye movement data, including velocity, frequency, and intensity. This innovation effectively addresses the limitations inherent in existing VOR assessment systems. Furthermore, the envisioned on-eyelid sensor network will pave the way for in-home VOR testing, data-driven diagnosis, telehealth monitoring, research into VR/AR devices, and various related domains reliant on the real-time capture of eye-generated signals.
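As an illustration of the kind of quantitative analysis described above, the sketch below extracts velocity, dominant frequency, and intensity from a sampled eyelid-sensor trace. The sampling rate, metric definitions, and synthetic signal are assumptions, not the published analysis pipeline.

```python
import numpy as np

# Illustrative sketch (not the published pipeline): compute the three
# spatiotemporal indicators named above from a sampled eyelid-sensor trace.

FS_HZ = 200.0  # assumed sampling rate

def eye_movement_metrics(signal: np.ndarray) -> dict:
    """Peak velocity (from dV/dt), dominant frequency (FFT peak), intensity (RMS)."""
    velocity = np.gradient(signal) * FS_HZ                    # units per second
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))    # drop DC offset
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS_HZ)
    return {
        "peak_velocity": float(np.max(np.abs(velocity))),
        "dominant_freq_hz": float(freqs[np.argmax(spectrum)]),
        "intensity_rms": float(np.sqrt(np.mean(signal**2))),
    }

# Synthetic 2 Hz oscillation standing in for VOR-driven eye movement
t = np.arange(0, 5, 1 / FS_HZ)
print(eye_movement_metrics(np.sin(2 * np.pi * 2.0 * t)))
```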

For More Information:
Nikolaus Traitler
Business Development Officer (BDO)
nick.traitler@tdg.ucla.edu