What follows are questions a few developer friends have been asking me via text message this weekend and my (slightly edited) replies.
(I’m mostly writing this so Future Me™️ can look back at my first thoughts in ten years.)
So, what are your first impressions?
Off the top of my head…
When I was buying this at the Apple Store on Friday afternoon, a woman who had been next to the Vision Pro displays asking questions about them came up to me with wide eyes and said, “Are you buying that?!?”
It felt very much like when I was in the lobby of a movie theater in June 2007 and someone asked “Is that the iPhone?!?”
And I don’t just mean that their question felt the same; my answer felt the same, too. I lied and sheepishly said, “Uhh, I’m a developer. I need this for work.” I wasn’t brave enough to be honest and say, “I’m excited and want to try it and learn how it works.”
There are so many missing iOS / iPadOS table-stakes features (ones you’d think would just “come for free,” since visionOS is a derivative of iPadOS) that I can only imagine Apple was 100% focused on the OS infrastructure side of things until last year’s WWDC announcement. Only after that did they bring more engineers onto the project to build out the application-level software.
I’ve gasped and laughed more times than I can count.
I could hit a baseball wearing this; the pass-through responsiveness is so high. But cameras are cameras, and unless you’re in excellent, balanced lighting conditions, the real world looks like motion-blur through a very fine screen door with sharp, crystal-clear floating windows on top.
I’ve never used a VR headset before – ever. I feel eye strain after an hour of using this.
The fully immersive environments are staggeringly good.
Window management sucks. Real bad.
Typing is equally awful. Either use a Bluetooth keyboard or be prepared to dictate everything you want to type. To be fair to Siri, the dictation has worked great so far. I never speak out loud to Siri when using my Mac, and I don’t feel comfortable doing that with my family nearby (or in the next room) while using the headset, either.
I have to do “half immersion” (black out my front 180 degrees) to use it at my desk since I face a wall, and there’s “no room” to put windows in front of me if I leave the pass-through completely open.
I have Tourette’s, and I constantly find myself triggering accidental inputs from the twitching my head and arms occasionally do. I’m very curious to see what Accessibility accommodations look like as the operating system matures. iOS and macOS have settings for adjusting tap / click sensitivity – visionOS will need those, too. (Maybe they already exist; I still need to look.)
The dual head strap is not comfortable for me. The velcro is too finicky to adjust. The single back strap is way easier to use.
Let me send you a screen recording I did this morning using the MLB app.
Do you think you’ll use it much during work?
Maaaaaaaaybe? I can only imagine doing non-dev work at the moment. Email, Slack, meetings (not on camera). And only with a paired keyboard. It’s just too clunky otherwise. But who knows? It may get easier after I acclimate.
It’s like trying to do work on the first iPad. I mean, yeah, you could probably kludge your way through it. But there are too many rough edges. Same with this. It’ll be a while before tech work is possible.
I haven’t tried many third-party apps. I think anything video will be killer. It’s already the best TV in my house. And the open-ear speakers firing at your ears are crazy. I don’t understand how they work so well.
Sports will be huge in a few years when the on-field cameras catch up to provide more appropriate content.
The thing that makes me sad is it’s a very solitary feeling. Only one person can watch that “best TV in the house” at a time. I can’t share any of this with Liz without her going through an arduous guest setup mode every single time.
My sister made this TikTok from our first FaceTime call Friday night.
Oh god, that looks awful. I mean, it’s cool, yeah, but I can see why it’s beta 😂
One reason, which we eventually figured out, was that my eyes always looked “up” on the call, right? That’s because her FaceTime window was positioned above my eye level in my field of view. Once I “lowered” her window, the eye contact was more natural.
Damn, the things you’d never really have to deal with. Woulda been incredible to be there when the AVP engineers were learning all this for the first time.
Like, her position felt totally natural to me but looked bizarre as hell on her end.
Overall, how are you liking it? Does the battery life seem low compared to every other Apple product?
I’ve never used any VR headset before, and I haven’t been able to wear this long enough to run down the battery before the eye strain made me take a break. But I also still need to finish setting everything up and get more 3rd party apps to try – and it’s been the weekend. My time wearing it may go up during the work week, without kids around, when I experiment with doing actual work on it.
This seems in line with what others have told me who got it or tried the in-store demo. Maybe it is good we’re waiting.
I heard a reviewer say, “This is the worst Vision headset Apple will ever ship,” which is accurate and also a hopeful statement.
The hardware might be good enough for mass market use, but the software (and I mean the OS-provided feature set from Apple, not even talking about third-party developers) is nowhere near ready for more than the earliest tech enthusiasts.
Like, iPhone OS 1.0 nailed the interaction model from the start – and it’s been a steady march of iteration and polish ever since. iPadOS is still trying to figure out how to manage multiple windows.
visionOS is just throwing stuff at the wall (literally and figuratively) and hoping.
Granted, it’s very well-considered stuff on a wall, but they don’t know what works yet. And I can’t blame Apple for that. This is going to take a while.