Apple Vision Pro: What is spatial computing?

Apple is going to great lengths to avoid calling the Apple Vision Pro a virtual reality headset. Or augmented reality, or mixed reality, or any other “reality” thing. We wrote a bit about what all of these terms mean last year and they all apply to Apple Vision Pro, but it’s also different from other headsets.

Instead, the Vision Pro is generally referred to as a spatial computer, and using it as spatial computing. Is this verbiage a cynical marketing ploy to justify the exorbitant price and late entry into a growing technology field? Or is spatial computing genuinely something other than virtual, augmented, or mixed reality?

The answer is: it’s a bit of both. By any normal human measure, Apple Vision Pro is a mixed reality headset, with a dial (the Digital Crown) for moving between VR and AR views. But what it does is spatial computing, an umbrella term that covers most AR and VR experiences, plus some other things. Let’s break it down.

Definition of spatial computing

There is no single, universally accepted definition of spatial computing, but Wikipedia does a great job of breaking it down.

Traditional computing seems to take place inside the computer and to be limited to it. It’s behind that piece of glass on your laptop, smartphone, or tablet. You interact with it within those limits (you tap the glass, or you manipulate a cursor or character within the edges of the screen), and whatever the computer generates only interacts with other things the computer generates.

Spatial computing appears to be happening in the space around the user. It’s in your living room, on your table, or even in a 3D virtual environment that surrounds you. You interact with it in that space: whether you use your hands or controllers, you touch, tap, grab, pinch, or move objects in what appears to be the space around you, not on a flat glass screen. (There may actually be a flat glass screen in front of your eyes, but the interaction zone is “the space around you.”) Computer-generated objects interact with each other and with the space around you: they sit on your table, bounce off your walls, or simply change perspective as you physically move around them.
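That last behavior, objects holding their position as you move, comes down to re-expressing a world-anchored position in the user’s moving head frame on every frame. Here is a minimal 2D sketch in Python; the function name, coordinate convention, and lamp example are illustrative assumptions, not any real headset API:

```python
import math

def world_to_head(obj_xy, head_xy, head_yaw):
    """Express a world-anchored object's position in the user's head frame:
    translate by the head position, then rotate into the head's frame
    (yaw in radians, positive = head turned to the right of the +y axis)."""
    dx = obj_xy[0] - head_xy[0]
    dy = obj_xy[1] - head_xy[1]
    c, s = math.cos(head_yaw), math.sin(head_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

# A virtual lamp anchored 2 m in front of a user standing at the origin:
lamp = (0.0, 2.0)

# Facing it head-on, it appears straight ahead, 2 m away:
print(world_to_head(lamp, (0.0, 0.0), 0.0))   # (0.0, 2.0)

# Step 1 m to the left and it appears ahead and to the right:
print(world_to_head(lamp, (-1.0, 0.0), 0.0))  # (1.0, 2.0)
```

Walking or turning only changes `head_xy` and `head_yaw`; the lamp’s world coordinates never change, which is exactly why it appears fixed in the room.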

Most quality modern virtual reality experiences therefore fall under the umbrella of spatial computing, as do most augmented reality applications. In fact, Magic Leap used the term spatial computing in 2020, three years before Apple Vision Pro was announced.

But spatial computing doesn’t have to be VR or AR. Apple has talked a lot about spatial audio, and it’s a good example: if audio seems to come from the space around the user, and does so interactively (sounds seem to hold their place in the environment as the user moves their head), it can be considered spatial computing whenever it’s part of a computer’s input/output system.
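Head-tracked audio works the same way: each frame, the renderer re-derives its output from the sound source’s bearing relative to the head. Below is a toy stereo-panning sketch in Python; the linear panning law and the names are illustrative assumptions, not how any shipping spatializer works:

```python
import math

def head_relative_pan(src_xy, head_yaw):
    """Map a world-fixed sound source to a stereo pan in [-1, 1]
    (-1 = hard left, 0 = center, +1 = hard right), given the head's
    yaw in radians (positive = turned to the right of the +y axis)."""
    # Bearing of the source, measured clockwise from straight ahead (+y):
    bearing = math.atan2(src_xy[0], src_xy[1]) - head_yaw
    # Wrap to [-pi, pi] so a full turn doesn't break the mapping:
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    # Saturate to hard left/right once the source is 90 degrees off-axis:
    return max(-1.0, min(1.0, bearing / (math.pi / 2)))

chirp = (0.0, 2.0)                                # a bird 2 m straight ahead

print(head_relative_pan(chirp, 0.0))              # 0.0 (centered)
print(head_relative_pan(chirp, 3 * math.pi / 4))  # -1.0 (you turned right; it is now hard left)
```

The source’s world position never changes; only the head yaw does, so the sound seems to stay put in the room as the listener turns.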

Those holographic tables you see in every sci-fi movie, where the protagonists interact with floating graphics to deliver some exposition to the audience? No headsets or earphones are involved, but that would still be considered spatial computing. The same could be said of Star Trek’s famous Holodeck.

Spatial computing doesn’t necessarily mean headsets; that’s simply where the technology is today.

Disney

How does Apple Vision Pro fit in?

Thus, even though spatial computing does not have to involve a VR or AR headset, the current state of the technology means that, for now, it does.

Yes, Apple Vision Pro is a mixed reality headset. It can play virtual reality content or, thanks to a vast fleet of sensors and high-resolution video pass-through, augmented reality content. You interact with elements in either mode in the 3D space around you, using your hands and eye tracking.

It’s accurate to say that this is a more advanced version of things that have been around for a while (notably Meta’s Quest line).

But that doesn’t mean spatial computing is a lie, or merely a marketing ploy. It is good marketing: even though the Meta Quest 3 is definitely a spatial computer, it’s marketed as a mixed reality headset, and Apple can make its 7x higher price seem justified by implying the Vision Pro is not the same sort of thing. But using the term spatial computing is more than a gimmick. It’s also an indication of the direction Apple is heading.

Ultimately, Apple is pointing toward a whole new era of technology. Computing was done on stationary machines. Portable computing let you pick the computer up and easily move it somewhere else. Mobile computing lets you use the computer on the go and makes your location an important part of how you interact with the device. And now spatial computing appears to happen outside the computer, in the space around you.

Mixed reality headsets are just the beginning. There is already a technological path to quality augmented reality glasses in the coming years (Google Glass was a heads-up display, not AR), and the frameworks, technologies, and considerations developers must grapple with for this kind of computing will carry over to those glasses, and even to further-off ideas like holograms. This is a new foundation, a new sea change in computing, and a project with decades of useful growth ahead.

Teknory