By Liu Jun, Vice President, Inspur Information and General Manager of AI and HPC
Truly understanding the metaverse requires some imagination. The digital world makes so many things possible – from exact digital copies and simulations of real-world environments to digital worlds born of far-ranging, inventive creativity. Tech titans like NVIDIA and Meta are taking the lead in defining these new worlds, whether by creating digital twins with NVIDIA’s OVX platform or by building entirely new worlds within Meta’s Horizon concept. We can only guess what the future holds for these digital worlds and how they will integrate with our own reality.
No doubt the maturing metaverse will require tremendous compute resources. Later in this article we’ll offer our thoughts on the nature of those resources, but first, let’s look at the developmental stages the metaverse is expected to go through. Gartner, the IT research and consultancy group, predicts that the metaverse will not reach full maturity until 2030, and that its development will occur in three stages:
- Emerging – By 2024, the metaverse market will begin to open with experimental applications and use cases built on established technologies such as augmented and virtual reality.
- Advanced – Between 2024 and 2027, content will become king, with data the focus for analyzing the relationships between the physical, the digital, and the two combined. New businesses and applications will develop from understanding these relationships.
- Mature – From 2028 the true metaverse will start to appear. Building upon the new applications of stage 2 and powered by mature hardware, infrastructure will become the primary focus, with vendors working to create the true backbone of a transformational and ubiquitous digital world.

Stage 1 is underway. However, what people may not appreciate in this progression from stage 1 to stage 3 is the enormous computing power required to bring it to reality.
We are talking about computing on a massive scale: the real-time rendering of digital twins and digital worlds, the rapid generation of digital humans, and the running of time-accurate, highly precise simulations that obey the laws of physics. Not only does this take a vast amount of computing power, but that power needs to be focused in specific areas. And when the digital and physical economies fully integrate, the volume of data and algorithms, and the computing power they demand, will grow by another order of magnitude. To succeed, the metaverse needs a completely new IT foundation – an IT bedrock to support its growth and evolution.
This IT infrastructure isn’t just about power. It needs to be adaptive, integrating hardware and software. The hardware must provide computing power suitable for a wide array of compute-intensive scenarios. The software must function as a platform for connecting different tools and algorithms. Combining software tools and algorithms with the right hardware will unlock the infinite possibilities of the metaverse.
The integration of data and reality is the starting point for developing a foundational metaverse-specific server. Measuring and digitally reproducing the real world requires the collaborative creation, real-time rendering and simulation of large-scale, high-complexity, high-fidelity digital scenes. This is achieved via the four pillars of the metaverse:
- Collaborative creation – software and hardware developers working together to create an optimal digital universe
- Real-time rendering – seeing the environment evolve is key to creating an optimal experience
- High-precision simulation – to create an interesting and compelling immersive experience
- Intelligent interaction – to simulate the real world
Each pillar requires different types of computing power, software and collaboration. It is a colossal undertaking. To realistically handle such demands, you will need a computing system powerful enough to support hundreds of digital architects, generate thousands of digital scenarios per second, and enable hundreds of AR/VR users to enjoy digital worlds built on the collaborative creation and real-time rendering of large-scale, highly complex and highly realistic digital simulations.
Modeling giant-scale, extremely complex digital twins, avatars and other simulations requires the collaboration of numerous designers, working through virtualization and cloud synergy. That collaboration is being redefined by AI technology.
Once the modeling of a scene in the digital space is completed, a variety of physical simulations and scientific visualizations are needed to achieve environmental accuracy, such as AI-driven mechanical simulation, thermal simulation and so on. In the same vein, AI-driven digital humans combine AI algorithms – including automatic speech recognition (ASR), natural language processing (NLP) and deep learning recommendation models (DLRM) – for more responsive, lifelike behavior. The training and inference processes of those models must be supported by new levels of AI computing power.
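To make that digital-human pipeline a little more concrete, here is a minimal sketch of a speech-in, text-out interaction loop. It uses Hugging Face transformers pipelines and illustrative model names purely as stand-ins for the ASR and NLP stages described above; the file name and function are hypothetical, and this is not Inspur’s actual stack.

```python
# Minimal sketch of a digital-human interaction loop: speech in, text out.
# Model choices are illustrative; any ASR / language-model pair would do.
from transformers import pipeline

# ASR stage: turn the user's speech into text.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# NLP stage: generate the digital human's reply from the transcript.
chat = pipeline("text-generation", model="distilgpt2")

def interact(audio_path: str) -> str:
    transcript = asr(audio_path)["text"]                    # speech -> text
    prompt = f"User: {transcript}\nAvatar:"
    reply = chat(prompt, max_new_tokens=50)[0]["generated_text"]
    # In a full system the reply would drive text-to-speech, facial animation
    # and real-time rendering; here we simply return the generated text.
    return reply

if __name__ == "__main__":
    print(interact("user_question.wav"))                    # hypothetical audio file
```

Even this toy loop shows why the article stresses aggregated computing power: each stage is a separately trained model whose inference must run fast enough to feel interactive, and a production digital human would chain several more of them.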
AI computing power not only plays an important role in intelligent interaction; it also drives the entire creation and operation process of the metaverse – evolving the methods of content generation, improving the accuracy of simulation, accelerating rendering, and enhancing realism during intelligent interaction.
A metaverse-specific server is the foundational platform of the metaverse ecosystem that will support the various technologies and tools required for its construction and operation. It must aggregate the computing power required for the creation and operation of digital twins and virtual worlds and support the integration of hardware and software for the collaborative creation and real-time rendering of large-scale, highly complex and highly realistic digital simulations.
The creation of digital worlds involves four major tasks: collaborative creation, real-time rendering, high-precision simulation, and intelligent interaction. Each requires different computing power and complex software. What’s needed to create an efficient and collaborative development experience for users is a base for connecting the real and digital worlds. This base includes the real-time rendering of digital twins, the rapid generation of digital humans and the running of time-accurate, high-precision simulations that will empower thousands of industries. With these elements in place, we will have the proper foundation needed to support the metaverse.
Liu Jun is General Manager of AI and HPC at Inspur Information, a global provider of AI servers. Liu has more than 20 years of experience in AI and HPC and spearheads product development, application and services at Inspur’s Artificial Intelligence and High-Performance Computing division.