ThinVR

ThinVR is a novel approach to head-mounted displays (HMDs) that offers a >180 degree field of view while being much slimmer than anything on the market.

This is a research project at Intel Labs.

We presented it at IEEE Virtual Reality 2020 and published it as a paper in the IEEE TVCG journal.

This was a (virtual) demo at SIGGRAPH 2020 in the Emerging Technologies venue.

It won a Best Paper award at IEEE VR 2020.

HMDs have not changed significantly in size for a long time. This is due to physical constraints related to the lens and its focal length: if you try to make a wide field of view HMD, you need a bigger lens, and the problems get worse. With ThinVR we achieved a wide FOV in a compact form factor by applying a computational display approach that uses a curved display and a custom-made heterogeneous lenslet array. We proved that the core approach works by building two working prototypes, one with a static display and one with a dynamic display.
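To see why a conventional single-lens headset resists shrinking, it helps to sketch the standard magnifier model. This is a simplification for intuition only, not the actual ThinVR optics:

```latex
% Simple magnifier model (illustrative, not the exact ThinVR design):
% the display sits near the lens's focal plane, so the lens-to-display
% distance, and hence the headset depth, is roughly the focal length f.
% For a display of width w, the full field of view is approximately
\[
\mathrm{FOV} \approx 2\arctan\!\left(\frac{w}{2f}\right)
\]
```

Under this model, widening the FOV means a larger display width w, and therefore a larger lens, at a similar focal length f, so the headset stays thick. Replacing the single lens with an array of small lenslets, each with a much shorter focal length, is what lets the depth collapse while the curved display preserves the wide FOV.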

30 sec. Video Overview

This video will give you a quick taste of the project.

Static Prototype

This is the final static prototype. We needed it to show how our approach would look with a display of very high resolution: we used a lithographic print at 2032 PPI, far more than any actual display could have provided.

Static Image

This is an image taken with a DSLR camera located where your eye would be when wearing the prototype.

Dynamic Prototype

We built a dynamic prototype to prove that this is a viable approach, comparable to existing HMDs on the market.

Dynamic View

This is an image taken with a DSLR camera. We moved the camera along an optical track to show how the image looks as the camera slowly enters the "eye box", the volume where your eye must be when wearing the prototype in order to see a coherent image.

IEEE Presentation

This is the 15-minute presentation our teammate Josh Ratcliff gave at IEEE VR.

Supplemental Materials

Watch this 3-minute video with more details and views inside the device.

My specific contributions

The original idea came from Joshua Ratcliff. I was asked to build the prototype and to ensure that all the human factors and UX were well covered from the beginning.

I gave a live, though virtual, presentation of this project and my contributions at AWE 2020 (Augmented World Expo); please check it out.

Human Factors vs. Physical constraints

My first challenge was to balance the physical constraints of the materials (the size of the lenses, their focal length, and the weight to be worn on the face) with the human factors of a device that had to work as soon as it was put on, for as many people as possible.

First Prototype

This was the most challenging prototype. I knew that once I gave the manufacturer the specifications for the lenses, they would come back with a margin of error. My 3D printed and milled parts were also within a margin of error that would prove too large for the optical device we wanted to build. So I had to make a device that allowed minute adjustments to accommodate both the human and the mechanical variations.

A Flexible Display

The hardest thing about the dynamic prototype was finding a flexible display. I had to take apart several phones and carefully separate the LCD screen from its protective glass. This gave us the display, but it also meant there was no way to drive it without the actual phone, so I had to hack the connection and build a rig that held the phone's electronics and battery.

Size Comparison

In the end I built a device that is 50% smaller than the ones on the market that offer a similar FOV. Here I took apart a Pimax 5K and stripped it down to only the optical elements in order to make a fair comparison. I made a CAD model of both devices to compare the actual volume.

Supplemental materials

To see two full-resolution images captured from the static prototype, along with an image and CAD file of the holders that mount both the display and the lenslets, please download the zip file.