Waterloo Reality Labs
Fall 2024 - Present
Architected by @Kenny Na
Designed and built by @Isaac Chiu
We attempt to provide a mechanical varifocal solution to the vergence-accommodation conflict present in today's consumer virtual and mixed reality headsets.
The varifocal display system consists of a single biconvex lens set at a fixed position, with each eye's display module placed between the lens and its focal point. We achieve dynamic focal distances by moving the displays closer to and farther from the lens via motor control.
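Under the thin-lens approximation, the required lens-to-display distance for a desired virtual image distance follows directly from the lens equation. The sketch below is illustrative, not the project's actual control code; the focal length and target distances in the usage example are assumed values.

```python
def display_offset(focal_length_m: float, virtual_distance_m: float) -> float:
    """Distance from the lens at which to place the display so that the
    virtual image appears at `virtual_distance_m`.

    Thin-lens equation 1/d_o + 1/d_i = 1/f, with a virtual image on the
    same side as the display (d_i = -D), gives:
        d_o = 1 / (1/f + 1/D)
    The result is always less than f, so the display stays between the
    lens and its focal point, as the design requires.
    """
    return 1.0 / (1.0 / focal_length_m + 1.0 / virtual_distance_m)


# Example with an assumed 50 mm lens: a nearer virtual image needs the
# display closer to the lens, and an infinitely far image needs d_o = f.
near = display_offset(0.05, 0.25)   # ~0.0417 m
far = display_offset(0.05, 2.0)     # ~0.0488 m
```

The motor travel needed is just the difference between two such offsets, which is why a small linear actuation range can cover the full focal sweep.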
To quickly and accurately approximate where in the virtual scene the user is looking, we implement real-time gaze tracking with XIAO ESP32S3 Sense cameras and software courtesy of the open-source project EyeTrackVR. We cast a ray from the user's eyes in the Unity game engine to find the object in focus and calculate its distance from the user. This serves as the basis for the display control system.
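The system raycasts in Unity to find the fixated object, but the underlying geometry of estimating fixation distance from eye rotation can be sketched independently. The helper below is a hypothetical illustration (the function name, symmetric-convergence assumption, and IPD value are ours, not from the project):

```python
import math


def vergence_distance(ipd_m: float, left_yaw_rad: float, right_yaw_rad: float) -> float:
    """Estimate fixation distance from each eye's inward rotation (yaw),
    assuming symmetric convergence in the horizontal plane.
    Positive yaw means rotation toward the nose.

    Geometry: each eye sits ipd/2 from the midline, so
        tan(theta) = (ipd / 2) / d  =>  d = (ipd / 2) / tan(theta).
    """
    # Average the two half-angles to tolerate slight tracking asymmetry.
    half_angle = (left_yaw_rad + right_yaw_rad) / 2.0
    if half_angle <= 0.0:
        # Parallel or diverging gaze: treat fixation as at infinity.
        return float("inf")
    return (ipd_m / 2.0) / math.tan(half_angle)


# A point 1 m ahead with a 64 mm IPD implies a half-angle of
# atan(0.032 / 1.0) per eye; the estimate recovers ~1.0 m.
theta = math.atan2(0.032, 1.0)
d = vergence_distance(0.064, theta, theta)
```

In practice the Unity raycast is more robust, since it snaps the distance to actual scene geometry rather than relying on noisy angular estimates alone.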
Using the prototype and an example Unity scene, we empirically demonstrate the benefits of varifocal technology with an internal study. Lastly, we discuss the limitations of moving the display modules versus other methods of achieving a variable focal distance.
Virtual reality (VR) headsets today contain a simple, static display system, usually consisting of one refractive lens (or, in recent years, a folded optical design) and a pair of displays to be magnified and viewed by the user. The stereoscopic output of the displays is what enables the simulation of depth for the user. However, due to the fixed nature of the display system, VR headsets can only deliver one fixed distance of focus, often set to a moderate distance of 1 to 2 meters. This means that as VR users attempt to focus on objects closer to or farther from that distance, the accommodation of the VR display system fails to match the vergence of the user's eyes. This issue, named the vergence-accommodation conflict (VAC), can cause blurriness and discomfort in the visual experience of VR users, as well as eye strain.
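The mismatch is easiest to quantify in diopters (inverse meters), the unit the eye's accommodation actually works in. The following is a minimal sketch with assumed example distances, not measurements from the project:

```python
def vac_diopters(fixed_focus_m: float, fixation_m: float) -> float:
    """Magnitude of the vergence-accommodation conflict in diopters:
    the gap between where the eyes converge (fixation distance) and
    where the fixed display optics force them to accommodate."""
    return abs(1.0 / fixation_m - 1.0 / fixed_focus_m)


# With a headset focused at 1.5 m, looking at a virtual hand 0.3 m away
# produces a mismatch of |1/0.3 - 1/1.5| ~= 2.67 D, while looking at a
# distant mountain (100 m) produces only ~0.66 D.
near_conflict = vac_diopters(1.5, 0.3)
far_conflict = vac_diopters(1.5, 100.0)
```

This asymmetry is why near-field interactions (hands, held objects) are where the conflict is felt most strongly.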
VR users are always shifting their gaze and focus, as they would in real life. This makes it immersion-breaking for VR users to, for example, bring their virtual hands near their faces, and have them look extremely blurry, or enter a 3D environment with far-away buildings or mountains, and feel as if they lack real depth or clarity.
This may trigger VR-induced motion sickness in extremely sensitive users, but many more often experience eye strain due to the fixed focus distance and the aforementioned vergence-accommodation conflict.
The benefits of solving the vergence-accommodation conflict are therefore clear, especially in the realm of user comfort, but solving it also becomes crucial as the resolution of display panels gets higher and higher. The perceived sharpness of a next-generation, high-resolution screen could be completely bottlenecked at close distances of focus by the perceived defocus blur from the vergence-accommodation conflict.
This makes display systems with a variable focus distance an absolute requirement for future VR technology.