Apple Vision Pro Teardown Reveals The Secrets Of EyeSight

iFixit tore down Apple Vision Pro, giving a fascinating look at its intricate design and revealing the secrets of EyeSight.

When it revealed Vision Pro at WWDC 2023 in June, Apple described the EyeSight feature as using a “curved OLED panel with a lenticular lens” to show a rendered view of your eyes, with parallax, to others in the room, so the components involved were already confirmed. But iFixit’s teardown gives the best look yet at exactly how this works, why it looks so low resolution, and what it looks like without the lens layer.

Vision Pro with the glass front removed.

As well as rendering the two first-person views on the inward-facing OLED microdisplays, one for each eye, Apple Vision Pro renders multiple third-person views of your upper face for the outward-facing curved regular OLED display, with each view tailored to a specific viewing angle.

In front of the display is a lenticular lens, a thin plastic lens made up of vertical ridges, each of which directs the light from beneath it so that it is visible only from a narrow, specific angle.

These ridges are grouped, with each ridge in a group directing light toward a viewing angle that matches one of the rendered views, and Vision Pro displays that rendered view on the vertical strip beneath it.

The result is an image that appears to have depth, known as positive parallax. Your upper face looks as if it’s behind the display, not on it like a flat image.
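To make the strip-per-view idea concrete, here is a minimal sketch of how multiple rendered views could be interleaved column by column for a lenticular display. The one-column-per-view mapping, the view count, and the resolutions are all assumptions for illustration; Apple has not published the actual lens pitch, calibration, or rendering pipeline.

```python
import numpy as np

def interlace_views(views):
    """Interleave N rendered views into one lenticular panel image.

    views: list of N arrays shaped (H, W, 3), one per viewing angle.
    Column x of the output is taken from view (x % N), so each view
    is only visible through the strip of ridges matching its angle.
    This is a simplified sketch, not Apple's actual mapping.
    """
    n = len(views)
    h, w, c = views[0].shape
    out = np.empty((h, w, c), dtype=views[0].dtype)
    for x in range(w):
        out[:, x, :] = views[x % n][:, x, :]
    return out

# Example with made-up numbers: 6 hypothetical views for a 1920-wide panel.
views = [np.random.rand(400, 1920, 3) for _ in range(6)]
panel_image = interlace_views(views)  # each view contributes 320 of the 1920 columns
```

In a real lenticular system the column-to-view mapping also has to account for the lens pitch not being an exact multiple of the pixel pitch, but the principle of dedicating interleaved vertical strips to each view is the same.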

Apple Vision Pro with the magnifying layer removed.

There are three problems with this approach, however, and all of them are evident in Vision Pro.

Firstly, because multiple views are rendered but each is only visible through the strips that match its viewing angle, the effective resolution is much lower. It isn’t yet known how many views there are, but for the sake of example, if it were a 1920-pixel-wide display with 6 views, each view would have an effective width of only 320 pixels.
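The arithmetic is simply the panel width divided by the number of views; both numbers below are unconfirmed placeholders.

```python
# Hypothetical numbers: neither the panel width nor the view count is confirmed.
panel_width_px = 1920
num_views = 6
effective_width_per_view = panel_width_px // num_views
print(effective_width_per_view)  # 320 pixels of horizontal resolution per view
```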

The second problem is brightness and blur. The ridges scatter, reflect, and absorb light, so much of it never reaches the viewer. Further, iFixit found that Vision Pro has another layer that stretches the image to cover a wider area of the headset’s front than it otherwise would, which further reduces brightness and adds blur.

The third problem is that lenticular lenses only work along one axis. If the Vision Pro wearer rotates their head left or right, the illusion is maintained, but if they tilt it up or down, it breaks.

This is why Vision Pro’s EyeSight has impressive parallax horizontally, but is disappointingly dim and blurry, and the effect breaks vertically.

EyeSight on Apple Vision Pro, from Scott Stein.

We highly recommend watching iFixit’s full teardown video to see just how many components are packed into such a relatively small space.

iFixit also took apart Meta Quest 3 recently, revealing just how much of the internal space its battery takes up, and we recommend you watch that too.
