[Figure 1 panel labels: "Displayed Image"; "Perceived" Image (Close-Up Photo)]
Figure 1: Enabling thin, lightweight near-eye displays using light field displays. (Left) Our binocular near-eye display prototype comprises a
pair of OLED panels covered with microlens arrays. This design enables a thin head-mounted display, since the black box containing driver
electronics could be waist-mounted with longer OLED ribbon cables. (Right) Due to the limited range of human accommodation, a severely
defocused image is perceived when a bare microdisplay is held close to the eye (here simulated as a close-up photograph of an OLED).
Conventional near-eye displays require bulky magnifying optics to facilitate accommodation. We propose near-eye light field displays as
thin, lightweight alternatives, achieving comfortable viewing by synthesizing a light field corresponding to a virtual scene located within the
accommodation range (here implemented by viewing a microdisplay, depicting interlaced perspectives, through a microlens array).
Abstract

We propose near-eye light field displays that enable thin, lightweight head-mounted displays (HMDs) capable of presenting accommodation, convergence, and binocular disparity depth cues. Sharp images are depicted by out-of-focus elements by synthesizing light fields corresponding to virtual objects within a viewer's natural accommodation range. Our primary contribution is to evaluate the capability of microlens arrays to achieve practical near-eye light field displays. Building on concepts shared with existing integral imaging displays and microlens-based light field cameras, we optimize performance in the context of near-eye viewing. As with light field cameras, our design supports continuous accommodation of the eye throughout a finite depth of field; as a result, binocular configurations provide a means to address the accommodation-convergence conflict occurring with existing stereoscopic displays. We construct a complete prototype display system, comprising: a custom-fabricated HMD using modified off-the-shelf parts and GPU-accelerated light field renderers (including ray tracing and a "backward compatible" method for existing stereoscopic content).

1 Overview

To be of practical utility, a near-eye display should provide high-resolution, wide-field-of-view imagery with compact, comfortable magnifying optics. However, current magnifier designs typically require multiple optical elements to minimize aberrations, leading to bulky eyewear with limited fields of view. We consider a simple alternative: placing a light field display directly in front of a user's eye (or a pair of such displays for binocular viewing). As shown in Figure 1, sharp imagery is depicted by synthesizing a light field for a virtual display (or a general 3D scene) within the viewer's unaided accommodation range. We demonstrate this design enables thin, lightweight head-mounted displays (HMDs) with wide fields of view and addresses accommodation-convergence conflict; however, these benefits come at a high cost: spatial resolution is reduced with microlens-based designs, although with commensurate gains in depth of field and in accurate rendering of retinal defocus. Through this work, we demonstrate how to mitigate resolution loss.

2 Hardware

OLED-based HMD Prototype: As shown in Figure 1, a binocular prototype was constructed using a pair of Sony ECX332A OLED microdisplays. Each 15.36×8.64 mm microdisplay has 1280×720 24-bit color pixels (i.e., 83.3 pixels per millimeter). Microlens arrays were affixed to the displays, weighing 0.7 grams and having a 1.0 mm lens pitch and 3.3 mm focal length. Each assembled eyepiece is 1.0 cm thick and achieves a spatial resolution of 146×78 pixels and a field of view of 29×16 degrees.

LVT-based Film Prototype: Practical applications will require two refinements in semiconductor manufacturing: higher-resolution and larger-format microdisplays, increasing image sharpness and the field of view, respectively. We emulate such high-resolution microdisplays using backlit 3.75×3.75 cm color films, developed using a light valve technology (LVT) film recorder at 120 pixels per millimeter. For these film-based prototypes, we estimate a spatial resolution of 534×534 pixels and a field of view of 67×67 degrees.

3 Software

Light Field Ray Tracing: The LVT and OLED prototypes contain microlens arrays with 35×35 and 14×8 lenses, respectively. Directly extending conventional rasterization would require rendering one projection of the 3D scene for each lens. As an alternative, we modified the NVIDIA OptiX GPU-accelerated ray tracing engine to support quad buffering in OpenGL, providing the HDMI 1.4a frame-packed 3D format required by the OLED driver electronics.

Supporting Stereoscopic Content: To implement a complete display system, a "backward compatibility" option is required for existing stereoscopic sources, including movies and video games. We propose the following solution: emulating the appearance of a conventional, planar autostereoscopic display. For our OpenGL-based implementation, each stereoscopic view is rendered to a texture attached to a frame buffer object (FBO). A GLSL fragment shader then generates the projections for each lens by sampling the stereoscopic view textures, as mapped onto the virtual display plane.
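As a sanity check on the OLED prototype geometry from the Hardware section, the reported field of view follows from the panel dimensions once an eye relief is assumed; the ~30 mm eye relief below is an assumption for illustration, not a value given in this abstract:

```python
import math

# Sony ECX332A panel parameters quoted in the Hardware section
panel_w_mm, panel_h_mm = 15.36, 8.64
pixels_w, pixels_h = 1280, 720
lens_pitch_mm = 1.0

# pixel density: 1280 px / 15.36 mm ~= 83.3 px/mm, matching the text
px_per_mm = pixels_w / panel_w_mm

# pixels available under each 1.0 mm microlens (angular samples per lens)
px_per_lens = lens_pitch_mm * px_per_mm

# field of view, assuming the eye sits ~30 mm from the lens array
# (eye relief is an assumption; the abstract does not state it)
eye_relief_mm = 30.0
fov_x = math.degrees(2 * math.atan(panel_w_mm / 2 / eye_relief_mm))
fov_y = math.degrees(2 * math.atan(panel_h_mm / 2 / eye_relief_mm))

print(f"{px_per_mm:.1f} px/mm, {px_per_lens:.0f} px per lens")
print(f"FOV ~= {fov_x:.0f} x {fov_y:.0f} degrees")  # close to the reported 29 x 16
```

With this assumed eye relief the computed field of view lands within a degree of the reported 29×16 degrees, suggesting the prototype places the eye roughly 3 cm from the microlens array.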
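The per-lens projections in the Software section reduce, under a pinhole approximation of each microlens, to generating one ray per display pixel: the ray leaves the pixel and passes through the center of the lens covering it. A minimal sketch of that ray setup follows; the function and parameter names are illustrative, not taken from the OptiX implementation:

```python
import math

def pixel_ray(px, py, pixel_pitch=0.012, lens_pitch=1.0, focal=3.3):
    """Ray for one display pixel, treating its microlens as a pinhole.

    Units are millimeters; defaults approximate the OLED prototype
    (1/83.3 mm pixels, 1.0 mm lens pitch, 3.3 mm focal length).
    """
    # display-plane position of the pixel center, origin at the panel corner
    x = (px + 0.5) * pixel_pitch
    y = (py + 0.5) * pixel_pitch
    # center of the microlens covering this pixel
    lx = (math.floor(x / lens_pitch) + 0.5) * lens_pitch
    ly = (math.floor(y / lens_pitch) + 0.5) * lens_pitch
    # the ray passes through the lens center, one focal length
    # in front of the display plane
    direction = (lx - x, ly - y, focal)
    norm = math.sqrt(sum(c * c for c in direction))
    origin = (lx, ly, focal)  # ray starts at the lens plane
    return origin, tuple(c / norm for c in direction)
```

Each such ray is then traced into the 3D scene, so every elemental image under a lens becomes one small off-axis perspective of the scene, which is exactly what rasterizing one projection per lens would produce.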
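The stereoscopic backward-compatibility shader can be sketched on the CPU: each display fragment is projected through its covering lens onto the virtual display plane, and the stereoscopic view texture is sampled at the intersection. The 250 mm virtual display distance and the 128×72 mm virtual screen size below are illustrative assumptions, not values from the abstract:

```python
def shade_fragment(px, py, stereo_view, pixel_pitch=0.012, lens_pitch=1.0,
                   focal=3.3, d_virtual=250.0, virt_w=128.0, virt_h=72.0):
    """Sample one eye's view texture for one display pixel (units: mm)."""
    # display-plane position of the fragment center
    x = (px + 0.5) * pixel_pitch
    y = (py + 0.5) * pixel_pitch
    # center of the covering microlens
    lx = (x // lens_pitch + 0.5) * lens_pitch
    ly = (y // lens_pitch + 0.5) * lens_pitch
    # project through the lens center onto the virtual display plane,
    # d_virtual in front of the lens array (note the image inversion)
    t = d_virtual / focal
    vx = lx + (lx - x) * t
    vy = ly + (ly - y) * t
    # map plane coordinates (centered on the panel axis; 7.68 and 4.32 mm
    # are half the OLED panel width and height) to texture space
    u = 0.5 + (vx - 7.68) / virt_w
    v = 0.5 + (vy - 4.32) / virt_h
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        h, w = len(stereo_view), len(stereo_view[0])
        return stereo_view[min(int(v * h), h - 1)][min(int(u * w), w - 1)]
    return (0, 0, 0)  # fragment maps outside the virtual screen
```

In the actual implementation this logic runs per fragment in GLSL, with the left and right view textures attached to an FBO and sampled with hardware filtering rather than the nearest-neighbor lookup used in this sketch.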