Is the Apple Vision Pro Vestibular Friendly?

A woman with pastel pink hair, wearing a Vision Pro headset, with her hand over her mouth

In the world of emerging technology, there has been a lot of buzz about Apple’s newest hardware and venture into the world of AR/VR and mixed-reality devices. There has been overwhelming feedback about the device, its cost, its battery life, the immersive content, its game-changing potential and more.

The public opinion around the Apple Vision Pro seems to be quite diverse. Some users praise its advanced accessibility features and seamless integration with other Apple products, excited about its potential to revolutionize the way visually impaired individuals interact with technology. Others, however, express concerns about the high price point and the exclusivity of the device, noting that affordability and widespread availability are crucial for meaningful impact.

Overall, it appears that while many are optimistic about the potential of the Apple Vision Pro to enhance accessibility, there are valid discussions about its affordability and inclusivity within the visually impaired community.

What I don’t find people talking about is what the experience of this device is like for people with vestibular disabilities. You might say that seems very niche; however, it’s not. According to the Vestibular Disorders Association, approximately 69 million Americans experience some form of vestibular dysfunction at some point in their lives. This encompasses a wide range of conditions, including labyrinthitis, vestibular neuritis, and Meniere's disease, among others. These issues can significantly impact an individual's quality of life, leading to symptoms such as dizziness, vertigo, and imbalance.

So with that setup, I’m here… your vestibular bestie, to tell you all about my experience with the Apple Vision Pro!

It was a Saturday morning and I had prescheduled a demo appointment at my local Apple Store. While the mall was a ghost town, the Apple Store was buzzing. It always amazes me how busy the Apple Store is: there were people returning Vision Pro devices they had received on pre-order, people there for the demo experience, along with your typical Apple Store customers.

When making my demo appointment I was asked to indicate whether I wore glasses (I do). There was a quick questionnaire about my glasses prior to the visit, with instructions to bring them to the demo appointment. Upon arrival they took my glasses and placed them in a small device on the store floor that detected my prescription so they could customize the headset accordingly. They also took some facial and head measurements with the associate’s iPhone. This was all the prep work to ensure that when they brought out the device it was ready for my use.

They explained how to navigate the device, a few hand gestures, and how to put it on and adjust it. I let the store associate know that I had a disability that impacted my balance and could cause me to get dizzy or nauseous. This was really important to communicate because I was going to be taken through a guided demo, and it allowed the associate to warn me in advance when the content might be uncomfortable. He was also sure to remind me how to exit each section should it become uncomfortable.

So my first review for the Apple Vision Pro demo is not about the device, but the store associate. He was really empathetic and mindful to ensure I had a safe and positive experience. So 5 stars for that.

I won’t go into every single element of the demo, but rather I will focus on the parts of the experience that were noteworthy.

Navigating a world with double vision: When I first put on the device I had to calibrate it. Because I had the lens inserts in, the first thing I saw was double. I immediately felt this was not going to be good, as I have double vision normally and my brain was struggling to adjust. However, part of this was also due to the positioning of the device on my head; with a few tweaks I was able to get things centered and my vision to resolve into a single image. Once that was sorted, I had no issues with double vision again.

Calibrating eye tracking for a person with gaze stabilization dysfunction and nystagmus: A quick overview for those who aren’t familiar with these terms. Gaze stabilization is the function that keeps your eyes locked on a target: when you track a moving object, your eyes need to follow it smoothly without jumping around. Nystagmus is when the eyes jump around; it can be constant or a triggered behavior, and to an outsider it looks like your eyes are dancing or wiggling. They can dance and wiggle together in unison or independently. This is super common for people with vision impairment and blindness, and also very common for people with vestibular disorders.

Fun fact: This is also why someone with vestibular issues should never take a roadside sobriety test. We will always fail, because this is exactly what they are testing for when they ask you to follow the pen light.

I was apprehensive about the eye tracking calibration: did they factor in these abnormalities? I can’t say whether they do, but I can say that my experience with eye tracking for navigation was a little off, and I attribute this, at least in part, to my vision disability. It wasn’t a blocker, but it needed a little refinement.

Panoramic photos get a new perspective: It’s important to clarify that this specific feedback relates to panoramic images taken from an iPhone or similar device using that slow horizontal scanning motion. These are legacy image assets taken in a 2D format. This was the part of the experience that was most problematic for me. I exited out of this image about as quickly as I entered it. My immediate sensation when opening the panoramic image was “Nope”: I felt disoriented and generally uncomfortable. The experience did not feel natural or real; there was a warped sensation, much like the “vertigo effect” or dolly zoom. I didn’t like it, and I was immediately nervous that this entire experience was going to be unpleasant for me. The intention of this panoramic experience is to let you turn your head and see the image from the perspective of the person who shot it. For me, it felt like the dreaded rides at Disney and Universal with those wraparound 180-degree screens.

Movie theatre mode: Now this is where things got interesting. This experience is meant to simulate sitting in a darkened theatre. I really liked the concept of being able to remove all distraction from the “scene” and provide a movie-like experience. This is also helpful for those of us who are easily distracted: neurodivergent folks, people with ADHD, etc.

Now, just like an actual movie theatre, you have the option to decide where in the theatre you are sitting: up close in the front row, in the middle rows, or up in the balcony. This was where my faith in this device was somewhat restored. Much as I don’t enjoy the front-row experience IRL, I can confirm that it was also not for me in this context, but once I was able to watch from the middle row or balcony I was a happy clam.

Movies also offered a 3D option for films that have 3D versions. This was actually pretty cool for me, because the typical 3D-glasses theatre experience isn’t great for me, likely related to my double vision issues. But this 3D experience was great! I really could enjoy the dimension it offered.

3D home videos change the game: Another clarification for this section, because there are two types of 3D/immersive home videos: those shot on an iPhone 15 Pro using all three lenses, and those shot on the Apple Vision Pro using its six cameras. There is a difference.

First, let’s talk about home videos shot using the Vision Pro mode on the iPhone 15 Pro. This was cool! There was more dimension to the video; the best way I can explain it is that you could perceive depth into the video. The other element I enjoyed was the feathered edges of the video content: there was no harsh border, so the content blended with the background.

This created a hologram-type experience that I found really cool; it felt new and different. It’s also worth noting that you have control over your background: you can view your real environment at full strength, dim it down a bit, or use a virtual environment to really just go someplace else. Something my Apple demo associate mentioned that really stood out to me was the idea of watching a 3D video that was shot in the same place you are physically sitting. This was when my mind really unlocked how this might change how we experience home videos. I can envision an emotional experience where you truly feel like you are there, either to relive a tender moment or to experience something you perhaps missed.

Now, I said 3D home videos change the game… but it wasn’t until I experienced the home videos taken on the Vision Pro itself that I was really impressed. Vision Pro capture uses more lenses, so in comparison to the iPhone-captured content, where you had depth into the video, with this version you had dimension coming out of the image.

What do I mean? Imagine a scene where someone blows out birthday candles (this was the actual demo): when the candles are blown out and the smoke comes off them, the smoke actually looks like it’s being blown in your face. You feel surrounded by the video, truly like you are there. It’s WILD.

Now back to the vestibular part of this. I thought this might be really problematic and triggering, but because it’s captured from the same POV you experience it from, it just feels real. Since someone blowing out candles wouldn’t be an issue for me in the real world, it presented no issue for me in the Vision Pro world.

However, if someone wearing a Vision Pro recorded a point-of-view video in the passenger seat of a race car zooming around at faster-than-natural speeds, I am certain that type of content would be very episode-inducing for me. So the conclusion here is: if it’s OK IRL, it’s OK in Vision Pro; but if it would be uncomfortable in the real world, assume the same caution in Vision Pro.

And finally the true immersive content experience…

The grand finale of the Vision Pro demo was the immersive content experience. And this… was incredible and tricky.

I was directed by the Apple Store associate to navigate to a content selection that Apple compiled for the demo. The video starts with Alicia Keys in the studio. It’s shot in HD; you can see the texture of her skin, it’s so detailed and realistic. You are in the POV of the person who shot the content, PLUS you can look around the room. So not only do you experience depth like you did in the Vision Pro home video content, but rather than a feathered-edge content area, you are in the room. This footage is shot completely in 360. If you look down you see the floor; if you turn around completely you see the people in the room behind you. There is none of the distortion I experienced in the panoramic footage. You are just there. I will say, sometimes I felt a little too close to the subject (sorry, Alicia), but that is likely more about my desire for personal space.

This part of the demo allowed me to visit the forest and wade in the water as a bear walked by. I also want to note that the spatial sound in this experience was fantastic; even in the noisy store I could perceive the spatial orientation of sound. The associate mentioned that this is also compatible with AirPods, which likely would have let me hear the sounds even more clearly.

We ventured to the deep sea, swimming with sharks. This is definitely the only way I will ever experience the deep sea, because scuba diving is not in the cards for me. We also climbed mountains and walked high-wires across mountain tops. Once again, I should note that some of that extreme-heights immersive content wasn’t great for me, because it would not be great for me in real life either. But I was definitely able to experience things there is zero chance I could experience for real. We also watched sporting events from the perspective of being right on the field, standing on the sidelines. I can really see how this will be a game changer for sports.

Sure we browsed the internet and did some more typical tasks, but it is clear to me that the Vision Pro is an entertainment and immersive device, not a utility device (for now).

Overall, I found this to be the most vestibular-friendly device I have tried to date. It’s not foolproof, and some things do trigger me, but being able to perceive my actual environment to varying degrees is a must-have feature for people with vestibular difficulties.

I also want to note that this entire demo was done seated; Apple requires this for all customers coming in for a demo. That should tell you something. Sure, we have all seen the videos of the guy popping out of his self-driving Tesla wearing a Vision Pro headset, walking around interacting with his virtual dimension. I would say that maybe, with lots of practice, I could walk around wearing this headset, but there would be crashes and tumbles along the way. Something you might want to be careful about when wearing a $3,700 set of goggles on your head.

I really see the potential for this type of device for entertainment and content consumption. I also see a huge opportunity for designers of physical spaces to experience their work in a more hyperrealistic format. While I’m not opening my wallet just yet for this device, I appreciate that it gave me my first AR/VR experience that didn’t immediately end with my head in the toilet.

So, that’s progress forward in my book!

Dana Randall

Dana has extensive experience in helping organizations create accessible and inclusive experiences. She has worked with companies of all sizes, from startups to Fortune 500s, across different industries and fields, so her expertise is truly unparalleled.

With Dana's guidance, you can be sure that your websites, apps and digital products are compliant with the latest accessibility standards and guidelines. She knows how to design and develop products that are accessible, understandable, usable and enjoyable for everyone, regardless of ability.

A11y Design Newsletter & Workshops: www.design-a11y.com

https://www.danarandall.com