A distinguishing characteristic of Apple’s “mixed-reality headset with video passthrough” is its external display. When in use, it shows swirls of light, but its main feature is something Apple calls EyeSight, which offers a way for others to connect with you by seeing your eyes, so you don’t seem isolated from the rest of the world when you wear the headset.
Apple’s marketing pitch is compelling: thanks to its tremendous passthrough, Apple Vision Pro helps you remain connected to those around you. EyeSight reveals your eyes and lets those nearby know when you’re using apps or fully immersed in an experience. When someone approaches, Apple Vision Pro simultaneously lets you see the person and reveals your eyes to them.
There’s just one problem: it doesn’t work well at all. The lenticular lenses that are meant to provide a 3D viewing-angle effect create a narrow field of view, the glossy exterior glass covers everything in glare, the display is too dim, and no matter what’s on it, others can’t really tell whether you can see them or not.
It was meant to be a technological solution to the social disconnection of wearing a headset around others, but in practice it’s just “the bluish glow” that makes Apple’s headset look different from the others. It’s about as useful as the glowing blue ball of light on the top of a HomePod (which you can’t really see unless you’re right on top of the thing anyway). It adds complexity and expense with little benefit beyond what a simple glowing LED would provide.
The idea behind EyeSight
The idea is simple enough. When you wear a VR headset, half of your face is obscured. Even if the headset passes through video so you can see the outside world, the people around you have no idea whether you can see them or not.
So Apple put a display on the outside that shows your eyes (or rather, a digital reconstruction of them, similar to the Persona feature) whenever you can see the world around you. The eyes are animated, turning and blinking in sync with your real eyes, thanks to the eye-tracking sensors inside. When you’re in a “fully immersive” experience (total VR), the display shows a blue/purple cloudy glow.
As people approach you, they fade through your immersive experience or the floating windows you’ve placed, depending on your settings, and your eyes fade through the glow on the external display. Apple even went so far as to put a lenticular lens in front of the EyeSight display, so it shows your eyes from the correct perspective depending on the viewer’s angle.
It’s a neat idea, and until we have actual transparent displays that can render super-sharp graphics over the real world while occluding light as needed, something like EyeSight is probably the only way for people on the outside to “see you” in the headset.
If only it, you know, worked.
It just doesn’t work
The EyeSight display just has too many problems. The rendering of your eyes is low-res and blurry, thanks in part to the front display quality and in part to the lenticular lens effect. The actual rendering is distorted so it looks “correct” through the lenticular film.
The display itself is a relatively narrow strip, not even half the size of the full front of the headset. It’s not terribly bright even before the coverings and coatings cut brightness down further. Then there’s the headset itself, which is so fantastically glossy that you see bright highlights all over in nearly all lighting. If you want to actually see someone’s eyes clearly, the room needs to be fairly dimly lit, at which point the passthrough video becomes a grainy mess.
The image at the head of this article compares Apple’s promotion of the feature to the best possible view I could capture, after multiple attempts. It’s glowy, fuzzy, misaligned (my eyebrows are often the center of attention), and in no way suitable for making that “human connection.” EyeSight, at its best, does not facilitate connecting you to the people around you. To everyone else, you’re still “in VR”: they don’t know where you’re looking or what you’re doing. Most of the time, it just appears to others as an ethereal bluish glow, no matter what you’re actually doing inside.
The most useful thing EyeSight does today is display general status information, like the progress bar that appears when you install a software update.
EyeSight in the future
The EyeSight concept is sound, but it will take more than a few software updates to make it work well. It’s going to need new hardware.
Given that this first product is called Apple Vision Pro, it’s reasonable to expect that future models will come in both “Pro” and “non-Pro” variants. I would bet that Apple will double down on EyeSight for the Pro version and eliminate it on the non-Pro model to save cost.
After all, there’s little that EyeSight does today that couldn’t be communicated just as effectively with a simple multi-color LED: green could mean the user can see you, amber that they can’t, and red that the headset is recording.
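To make the point concrete, here’s a minimal sketch in Swift of that three-state mapping. Everything in it is hypothetical: Apple exposes no such status API, and the type and function names are mine.

// Hypothetical: a three-state indicator mapping, not a real visionOS API.
enum HeadsetStatus {
    case passthrough   // the wearer can see the people around them
    case immersed      // the wearer is in a fully immersive experience
    case recording     // the headset is capturing video
}

enum IndicatorColor {
    case green, amber, red
}

func indicatorColor(for status: HeadsetStatus) -> IndicatorColor {
    switch status {
    case .passthrough: return .green  // “the user can see you”
    case .immersed:    return .amber  // “they can’t”
    case .recording:   return .red    // “the headset is recording”
    }
}

That’s the whole job EyeSight currently does, expressed in under twenty lines.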
For EyeSight to really work in a future model, it needs to be physically larger, covering more of the occluded part of your face. The brightness has to be much better, along with an anti-reflective plastic coating on the outside of the Vision Pro’s glass. That might make the “eyes” part work, but Apple could also make the display convey other useful information. The current “recording” animation is not intuitive to most people; the universally recognized signal for a recording camera is a red light, and Apple should lean into that with a clear, blinking red dot. A quick tap of the top button while the Vision Pro isn’t being worn could briefly show the battery’s state of charge on the EyeSight display. And perhaps an external icon of some sort could let people know when the user can’t hear them (say, when wearing headphones that aren’t in transparency mode).
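Taken together, those ideas amount to a small set of display states. Here’s a hedged sketch of what that state machine might look like, again in Swift; the EyeSightState type is entirely hypothetical, since Apple offers no public API for driving the external display.

// Hypothetical: the richer state set proposed above. Apple exposes no
// public API for the EyeSight display; this is illustration only.
enum EyeSightState {
    case eyesVisible                 // rendered eyes tracking the wearer’s gaze
    case immersedGlow                // the blue/purple cloud for full immersion
    case recordingDot                // a blinking red dot while capturing video
    case batteryLevel(percent: Int)  // shown briefly after a tap of the top button
    case cannotHear                  // wearer’s audio isn’t in transparency mode
}

func meaning(of state: EyeSightState) -> String {
    switch state {
    case .eyesVisible:           return "The wearer can see you"
    case .immersedGlow:          return "The wearer is fully immersed"
    case .recordingDot:          return "The headset is recording"
    case .batteryLevel(let pct): return "Battery at \(pct)%"
    case .cannotHear:            return "The wearer can’t hear you"
    }
}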
Some of this could be accomplished with software updates today, which would at least give EyeSight more of a reason to exist.