Omar H. Fares, Toronto Metropolitan University
The metaverse — a shared online space incorporating 3D graphics where users can interact virtually — has been the subject of increased interest and the ambitious goal of big tech companies for the past few years.
Facebook’s rebranding to Meta is the clearest example of this interest. However, despite the billions of dollars that have been invested in the industry, the metaverse has yet to go mainstream.
After the struggles Meta has faced in driving user engagement, many have written off the metaverse as a viable technology for the near future. But the technological landscape is a rapidly evolving one and new advancements can change perceptions and realities quickly.
Apple’s recent announcement of the Vision Pro mixed-reality headset at its annual Worldwide Developers Conference — the company’s largest launch since the Apple Watch was released in 2015 — could be the lifeline the metaverse needs.
About the Vision Pro headset
The Vision Pro headset is a spatial computing device that allows users to interact with apps and other digital content using their hands, eyes and voice, all while maintaining a sense of physical presence. It supports 3D object viewing and spatial video recording and photography.
The Vision Pro is a mixed-reality headset, meaning it combines elements of augmented reality (AR) and virtual reality (VR). While VR creates a completely immersive environment, AR overlays virtual elements onto the real world. Users are able to control how immersed they are while using the Vision Pro.
From a technological standpoint, the Vision Pro uses two kinds of microchips: the M2 chip, which is currently used in Macs, and the new R1 chip.
The new R1 chip processes input from 12 cameras, five sensors and six microphones, minimizing input delays and thereby reducing the likelihood of motion sickness.
The Vision Pro display system also features a whopping 23 million pixels, meaning it will be able to deliver an almost real-time, lag-free view of the world.
Why do people use new tech?
To gain a better understanding of why Apple’s Vision Pro may throw the metaverse a lifeline, we first need to understand what drives people to accept and use technology. From there, we can make an informed prediction about the future of this new technology.
The first factor that drives the adoption of technology is how easy it is to use, along with its perceived usefulness. Consumers need to believe a technology will add value to their lives in order to find it useful.
The second factor that drives the acceptance and use of technology is social circles. People usually look to their family, friends and peers for cues on what is trendy or useful.
The third factor is the level of expected enjoyment of a piece of technology. This is especially important for immersive technologies. Many factors contribute to enjoyment, such as system quality, immersive experiences and interactive environments.
The last factor that drives mainstream adoption is affordability. More important, however, is the value derived from new technology — the benefits a user expects to gain, minus costs.
Can Apple save the metaverse?
The launch of the Vision Pro seems to indicate Apple has an understanding of the factors that drive the adoption of new technology.
When it comes to ease of use, the Vision Pro offers intuitive hand-tracking that allows users to interact through simple gestures, along with impressive eye-tracking technology. Users will have the ability to select virtual items just by looking at them.
The Vision Pro also addresses another crucial metaverse challenge: the digital persona. One of the most compelling features of the metaverse is the ability for users to connect virtually with one another, but many find it challenging to connect with cartoon-like avatars.
The Vision Pro is attempting to circumvent this issue by allowing users to create hyper-realistic digital personas. Users will be able to scan their faces to create digital versions of themselves for the metaverse.
The seamless integration of the Vision Pro into the rest of the Apple ecosystem is also likely to be a selling point for customers.
Lastly, the power of the so-called “Apple effect” is another key factor that could contribute to the Vision Pro’s success. Apple has built an extremely loyal customer base over the years by establishing trust and credibility. There’s a good chance customers will be open to trying this new technology because of this.
Privacy and pricing
While Apple seems poised to take on the metaverse, there are still some key factors the company needs to consider.
By its very nature, the metaverse requires a wealth of personal data to function effectively. It is designed to offer personalized experiences for users, and those experiences can only be created by collecting data about them.
Users will need assurances from Apple that their personal data and interactions with Vision Pro are secure and protected. Apple’s past record of prioritizing data security may be an advantage, but there needs to be continuous effort in this area to avoid loss of trust and consumer confidence.
Price-wise, the Vision Pro costs a whopping US$3,499. This will undoubtedly pose a barrier for users and may prevent widespread adoption of the technology. Apple needs to consider strategies to increase the accessibility of this technology to a broader audience.
As we look to the future of this industry, it’s clear that competition in the metaverse will be fierce. While Apple brings cutting-edge technology and a loyal customer base, Meta is still one of the original players in this space and its products are significantly more affordable. In other words, the metaverse is very much alive.
Omar H. Fares, Lecturer in the Ted Rogers School of Retail Management, Toronto Metropolitan University
This article is republished from The Conversation under a Creative Commons license. Read the original article.