Metaverse

Understanding the Metaverse

The main technologies of the metaverse – augmented and virtual reality (AR and VR), or extended reality (XR) to cover both – are nothing new. In 1990, the US National Aeronautics and Space Administration (Nasa) introduced the Virtual Interface Environment Workstation (VIEW). The agency created VIEW in partnership with VPL Research, a 1980s VR company founded by Jaron Lanier.

The system comprised a headset very similar to today's VR headsets, along with sensor-equipped clothing and gloves – the DataSuit and DataGlove. For AR, the Fraunhofer Society provided an early example in 2004: its game NetAttack used Wi-Fi networks and semi-transparent personal displays built into headgear to overlay three-dimensional objects on top of real-world settings.

Players had to find virtual target items in a real environment while carrying a backpack full of gear. The game had features reminiscent of Niantic's Pokémon Go, an AR game that became a hit in 2016 and was discussed in detail in the Computer Weekly article Useful (and scary) implications of virtualizing reality.

Thus, the idea of exploiting virtual elements and environments for productive and entertaining applications is not new. What is new is that the necessary technologies have become significantly more powerful, drastically smaller and much cheaper over the past decades. Gone are the days of backpacks full of gear; in their place are lightweight, sensor-equipped smart glasses with advanced display technologies.

The supporting infrastructure and communication technologies – the Internet, wireless connectivity, positioning and navigation systems, virtual payment applications and data centers – have also come a long way. Numerous companies have been working feverishly on metaverse-related applications for years.

But Mark Zuckerberg’s appearance at this year’s South by Southwest (SXSW) festival in March forced them to put their cards on the table and prove to investors that they are on top of AR and VR technology and applications. Zuckerberg, founder and CEO of Meta, participated remotely at SXSW to present his vision in a session titled Into the Metaverse: Creators, Commerce, and Connectivity.

Connected technologies have caught the attention of companies and consumers, and the metaverse has become commercially viable. SXSW highlighted a breadth of use cases, from fashion and music to art and gaming. Business-related considerations included branding and workspaces, but also sustainability and communication. And, of course, the sessions covered a number of technological topics, such as standards, non-fungible tokens (NFTs), artificial intelligence and the concept of Web 3.0.

The panel discussions at the conference gave a broad insight into the technologies and use cases. The session From Buzz to Reality: The Metaverse Now and Tomorrow offered a cross-industry view of the concept, including Finland’s metaverse ecosystem of startups and companies.

Geoff Bund, head of software partnerships at Varjo, described the use of the company’s premium XR headsets. The company also recently launched its Varjo Reality Cloud platform, which allows users to stream photo-realistic images of locations to other users’ headsets.

Vesa Koivumaa, head of growth at Wärtsilä, a manufacturer of industrial equipment for the marine and energy markets, offered insight into ways industrial players can leverage the metaverse for training and maintenance applications. He noted that “the metaverse is a tool for very practical things” and pointed to the benefits that VR applications can have in education and training. For example, marine trainees can easily familiarize themselves with different parts of a ship, and remote operators can virtually check operations or inspect safety components.

Koivumaa also highlighted synergistic technologies. “We cannot view the metaverse as a single entity; we have to see what else is happening in technologies,” he said, giving translation technologies as an example. Natural translation can improve communication and collaboration in a significant way, he added.

Coincidentally, at the annual developer conference Google I/O on May 11, 2022, Alphabet presented a prototype of smart glasses that can display real-time translations of conversations.

Miikka Rosendahl, founder and CEO of ZOAN, provided the creator’s view of the metaverse. His company designed the world of Cornerstone, a photo-realistic metaverse. ZOAN is also building its own platform in the metaverse after years of gathering experience in creating VR applications.

Leslie Shannon, head of ecosystem and trends research at Nokia, contributed the perspective of the mostly business-to-business telecommunications providers that will build the infrastructure of the growing metaverse. Shannon described the metaverse as “the fusion of the digital and the physical.”

She noted that while Facebook’s name change to Meta drew attention to the emerging metaverse, she was frustrated that the focus on Meta’s vision limited the view of VR and related applications for many industrial use cases. Shannon highlighted AR as an enabler of the digital-physical union, saying that AR is “transforming the nature of the relationship between humans and computers.” Although people are currently limited to viewing two-dimensional screens, three-dimensional environments will connect them to the physical world much more effectively, she added.

The difference between AR, VR and XR is another source of confusion, she said. Shannon described AR and VR as two ends of a spectrum of physical-digital fusion, illustrating the point with HTC’s Proton prototype, a headset/glasses device that can be switched from VR to AR and thus physically embodies the spectrum. She also noted that for many AR applications, a smartphone is sufficient to provide digital layers on top of real-world objects and landscapes.

Varjo’s Bund added that while the underlying technologies and required development skills are very similar, AR and VR are trying to solve very different problems and address very different needs. AR and VR are sometimes equated, but it’s very rare that they can be used interchangeably, he said.

ZOAN’s Rosendahl commented that ten years ago there was a debate about which technology – AR or VR – would win in commercial terms. Today, the question revolves around the nature of the use case and the appropriate technology, he said. VR is great for immersing yourself in an entire virtual world; AR is good for adding dedicated layers of information on top of real objects or urban landscapes.

The metaverse, in its most basic form – examined in the session The Evolving Metaverse: Commercial Realities of Augmented Reality – connects users who can interact with virtual assets or avatars in an immersive way. Everything else, in my opinion, is negotiable and depends on the design, purpose and application of such environments.

The metaverse should be treated as a general idea, not pinned down to a specific definition. I believe any attempt to define the metaverse in detail will inevitably limit the types of environments that could emerge and therefore miss the associated business opportunities.

The metaverse can be many things to many people. The potential range of use cases and applications is likely to be more than the sum of its parts – similar to how the Internet not only established a new way of communicating and doing business, but also created opportunities to design more efficient operations and implement new business models.

Again, limiting our current view of what the metaverse is and can be used for will prevent us from imagining what the new environment will ultimately achieve and enable.

Martin Schwirn is the author of Small Data, Big Disruption: How to Spot Signals of Change and Manage Uncertainty (ISBN 9781632651921). He is also a senior advisor, strategic forecasting at Business Finland, helping startups and existing companies to find their position in the market of the future.
