It’s been a couple of months since the world caught a glimpse of Apple Vision Pro, which Apple says marks the beginning of spatial computing. Though the device will not be available until next year, it is already capturing the imagination of developers.
Some units have been shipped to developers, and special sessions are being hosted at designated locations where they can try out the headset.
However, there is not much information about how the headset functions because developers are required to sign extensive non-disclosure agreements. At the moment, Apple is offering labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo for developers to attend in person. Developer kits are also being shipped directly, but those come with strict security requirements.
What do developers think of Apple Vision Pro? Apple has published comments from a few early users. David Smith, a prolific developer, podcaster, and self-described planner, began working on a version of his app Widgetsmith for spatial computing immediately after the release of the visionOS SDK. “I’d been staring at this thing in the simulator for weeks and getting a general sense of how it works, but that was in a box,” he said. “The first time you see your own app running for real, that’s when you get the audible gasp.”
For Smith, the lab was special because it offered a full day of hands-on time with Apple Vision Pro before its public release.
Another developer who has tried it out is Michael Simmons, who runs Flexibits, which creates apps such as the calendar tool Fantastical. “It was like seeing Fantastical for the first time,” he said. “It felt like I was part of the app.” Simmons has said the labs are like “a proving ground” for future explorations and a chance to push software beyond its current bounds. “A bordered screen can be limiting. Sure, you can scroll, or have multiple monitors, but generally speaking, you’re limited to the edges,” he said. “Experiencing spatial computing not only validated the designs we’d been thinking about — it helped us start thinking not just about left to right or up and down, but beyond borders at all.”
Chris Delbuck, principal design technologist at Slack, said the lab got him thinking in new ways about how the app might work in three dimensions. “It instantly got me thinking about how 3D offerings and visuals could come forward in our experiences,” he said. “I wouldn’t have been able to do that without having the device in hand.”
Earlier this month, Bloomberg’s Mark Gurman said he had heard the labs were “under-filled with small amounts of developers”, but interest appears to have picked up. There are no labs on the East Coast of the US, and Apple is not covering travel or accommodation expenses for attendees.
Any adult developer can apply to attend the “Vision Pro labs” for free. Developers need to bring either an app they are working on for the Vision Pro, or an iPhone or iPad app that they want to see in augmented reality.
Though the device was unveiled at WWDC this year, work on it may have started many years ago. A patent titled ‘Automatically Adjusting Media Display In A Personal Display System’ dates back to 2007; it describes a personal display system that may include an electronic device capable of displaying media, among other things.
Apple says the Vision Pro — priced at $3,499 — will launch in the US in early 2024, and in more countries later that year. In June, Bloomberg’s Mark Gurman said there could be plans for a more affordable version of its Vision Pro headset by the end of 2025.
Mathures Paul