Apple Vision Pro demo impressions
Yesterday I went to browse MacBooks at my local Apple Store. I knew Apple offers Vision Pro demos that you can book appointments for, but I assumed they would only be available at certain locations or times.
To my surprise I saw Vision Pro units on display, and an employee (Kevin) informed me that demo slots were available that afternoon.
Within a few minutes, I was seated at a table for the demo. To start, I scanned my head with an iPhone to size the faceplate, a process similar to Face ID setup (tilt your head in each direction). Meanwhile, Kevin placed my glasses in a machine resembling a drip coffee maker, which scanned the prescription to pick the correct lens inserts for me. As a glasses wearer, I was very impressed by how seamless this setup was.
Kevin asked me what I knew about the Vision Pro, and gave me a brief rundown of its basic eye and hand navigation gestures. I appreciated how much effort Apple has put into communicating spatial computing concepts to people in an approachable way. I came into this demo with a fair bit of prior knowledge about immersive technology, but I imagine my parents wouldn’t have had much trouble following along either.
Kevin connected the headset to an iPad, which he explained allowed him to spectate my view throughout the experience. I wonder if this was a bespoke employee tool or something end users can access to spectate friends and family.
Finally, I got to put the device on! The strap expands or contracts with a dial towards the back, so you can easily adjust the tightness with one hand while holding the front of the device with the other. The Quest 3, on the other hand, feels unwieldy to adjust, requiring a two-handed “pull apart” motion to tighten.
The weight distribution felt fine. Perhaps it simply had good padding, but I didn’t feel as much pressure on my face as I do wearing the Quest 3. Of course, the Vision Pro benefits from not having battery weight in the front; Kevin placed the external battery pack under my thigh while I sat through the demo.
To start the demo, I pressed the digital crown to open my home screen. Immediately I noticed hover effects on the icons as I looked around at them. This initially felt intrusive, since I knew Kevin could see exactly where my eyes were focused. I imagine I’d be more comfortable using the device independently, without spectators. I wonder if the device has a way to hide gaze effects from external viewers for those who wish to record or stream content to an audience.
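For the developer-curious: as I understand it, those hover highlights are drawn by the system rather than by each app, and SwiftUI exposes modifiers for opting custom controls into them. A minimal sketch of what that might look like (the specific views and modifier choices here are my own illustration, not anything shown in the demo):

```swift
import SwiftUI

// Minimal illustration: a row of "icons" that pick up the system's
// gaze-driven hover highlight. Standard controls get this for free on
// visionOS; .hoverEffect() opts views in explicitly.
struct IconRow: View {
    let titles = ["Photos", "Safari", "Mail"]

    var body: some View {
        HStack(spacing: 24) {
            ForEach(titles, id: \.self) { title in
                Button(title) {
                    // Launch or focus the corresponding window here.
                }
                .padding()
                .glassBackgroundEffect()  // translucent "glass" backing, as on visionOS windows
                .hoverEffect(.highlight)  // system-rendered highlight when gaze lands on the view
            }
        }
    }
}
```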
Kevin instructed me to open the Photos app and view some photos. First I expanded a 2D photo, then I “swiped” to the next photo with a pinch motion. Next I swiped to a panorama, which I could expand to wrap 180 degrees around me. There’s a slight parallax effect on the “window” of the panorama, giving it a greater sense of depth. What’s cool is that I could continue swiping through panoramas while they were still expanded around me.

The next item I scrolled to was a “spatial video” captured on Vision Pro. This is ultimately the kind of stereo video anyone who’s used VR is accustomed to, but I was impressed by how accurately the scale was represented. It was a video of a girl blowing out birthday candles, and both the girl and the cake matched the scale of the real world. I wonder if they are doing some reprojection to account for IPD differences. There is also a nice softening effect on the edges of the video frame, so the content melts into the world around you. The following video was another spatial video, this time shot on iPhone. This one had weird scaling, likely due to the narrower camera positioning on the iPhone.
Next I practiced moving the Photos app window around. I simply looked at the bar under the window, pinched, and moved my hand. Apple did a great job tuning this to feel natural. I was initially skeptical of how the relative motion of my hand after the pinch would allow me to reliably position a distant element, but I had no trouble moving it. Even forward and backward movement was simple, and the window dynamically changed size to account for distance.
I opened another app, Safari, to try multitasking. When two windows overlap, the one you’re focusing on (i.e. the one you last interacted with) pokes out from underneath any windows that obscure it, with an edge-melting effect similar to what I saw in the expanded Photos content. I opened the Mail app for a triple-window configuration and practiced resizing windows before closing them all.
At this point, Kevin instructed me to twist the digital crown to enter an immersive environment of Mt. Hood. He then swapped the environment for one of the Moon, and then for another scenic landmark whose name escapes me. It looked pretty photorealistic, but was ultimately a 3D model.
After I twisted the digital crown to return to the real world, I opened Apple TV and loaded a trailer for the Mario movie (coincidentally, it was March 10, i.e. Mario Day). Pretty nice, but nothing I hadn’t seen before in VR. The next clip was a montage of immersive videos (i.e. 180-degree stereo video), but again, the “wow” factor had already worn off several VR headsets ago for me.
It was fun to geek out over immersive tech with this employee for a bit, especially given how out of the blue the demo felt for me. I was very pleased with the headset’s passthrough AR implementation and overall user experience. It pains me to return home to my Quest 3, whose lack of eye tracking and terribly inaccurate hand tracking make basic UI navigation feel like a competitive sport. Although Mark Zuckerberg claims that Quest 3 is the better product, I have my doubts. I can’t bring myself to buy another headset as difficult to use as the Quest 3, so while I appreciate Mark keeping prices low, I hope he’s taking design notes.