August 10, 2022
What is the metaverse? According to NVIDIA CEO Jensen Huang, it's “the next evolution of the internet.” In NVIDIA's special address at SIGGRAPH 2022, Huang traced the metamorphosis of the internet from a stack of web pages to cloud services. “Now Web 3.0 is here. The Metaverse is the internet in 3D, a network of connected, persistent virtual worlds,” he said.
In his vision, web pages will become virtual worlds, and hyperlinks will become hyperjumps into 3D worlds. In a sense, the metaverse he envisions is already taking shape, in virtual dressing rooms operated by retailers, digital twins of real cities maintained by telecommunications firms, and digital replicas of factories and warehouses used for logistics analysis.
“The metaverse is a computing platform, requiring new programming models, new computing architecture and standards,” he said. “HTML is the standard language of the 2D web. USD, an open and extensible language of 3D worlds invented by Pixar, is likely the best language of the metaverse.”
This vision will shape NVIDIA's R&D for the coming years. In collaboration with Pixar, as well as Adobe, Autodesk, Siemens, and a host of other leading companies, NVIDIA “will pursue a multi-year roadmap to expand USD’s capabilities beyond visual effects—enabling it to better support industrial metaverse applications in architecture, engineering, manufacturing, scientific computing, robotics and industrial digital twins,” the company announced.
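To make the HTML analogy concrete, the short listing below is a minimal sketch of what describing a 3D scene in USD can look like, using Pixar's open-source Python bindings (the pxr module). The file name and prim paths are arbitrary examples chosen for illustration, not part of NVIDIA's announcement.

# Minimal sketch: authoring a simple USD scene with Pixar's open-source
# Python bindings (the `pxr` module shipped with the USD distribution).
from pxr import Usd, UsdGeom

# Create a new stage backed by a human-readable .usda file (name is arbitrary).
stage = Usd.Stage.CreateNew("hello_world.usda")

# Describe "the shape of things" as a hierarchy of typed prims:
# a transform with a sphere underneath it.
world = UsdGeom.Xform.Define(stage, "/World")
sphere = UsdGeom.Sphere.Define(stage, "/World/Sphere")

# Author an attribute and pick a default prim for consumers of the file.
sphere.GetRadiusAttr().Set(2.0)
stage.SetDefaultPrim(world.GetPrim())

# Save the layer; the resulting text file can be referenced, layered,
# and composed into larger scenes by other USD-aware tools.
stage.GetRootLayer().Save()

The same description can then be opened by any application that speaks USD, which is the sense in which it acts as a common, extensible scene language rather than a single vendor's file format.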
Connected Virtual Worlds
Rev Lebaredian, VP of Omniverse & Simulation at NVIDIA, believes, “Websites will become interconnected 3D spaces akin to the world we live in and experience every day. Many of these virtual worlds will be reflections of the real world linked and synchronized in real time.”
He expects the virtual environments' behaviors to “[match] the real world's laws of physics,” but in social and entertainment applications, they may also “break them to make the experiences more fun.” Furthermore, “XR devices and robots will act as portals between our physical world and virtual worlds. Humans will portal into a virtual world with VR and AR devices, while AIs will portal out to our world via physical robots.”
In Omniverse, NVIDIA's platform for building interactive virtual worlds, USD is the common language for describing the shape of things. The company's vision demands that USD be far more than a static file format for depicting shapes. Lebaredian thinks it needs to offer programmable interfaces “for composing, editing, querying, rendering, collaborating, and simulating virtual worlds.” With these pieces in place, USD can support extremely large, complex digital twins, “from sprawling factories to global-scale climate change,” the company says.
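The querying side of that programmable interface already exists in the open-source USD runtime. As a rough illustration (the file name is a made-up example, and this is the stock pxr API rather than anything Omniverse-specific), a stage can be opened and walked programmatically:

# Rough illustration: querying an existing USD stage with the standard pxr API.
from pxr import Usd, UsdGeom

# Open a stage from disk (the file name is a hypothetical example).
stage = Usd.Stage.Open("factory_digital_twin.usda")

# Traverse every prim in the composed scene graph and report what it is.
for prim in stage.Traverse():
    print(prim.GetPath(), prim.GetTypeName())

    # For transformable prims, list any authored transform operations.
    if prim.IsA(UsdGeom.Xformable):
        ops = UsdGeom.Xformable(prim).GetOrderedXformOps()
        if ops:
            print("  xform ops:", [op.GetOpName() for op in ops])

Composing, editing, and querying of this kind are available today; the richer simulation-oriented interfaces Lebaredian describes are part of what the roadmap above is intended to add.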
At SIGGRAPH, NVIDIA announced the release of a collection of free resources to speed USD adoption, including USD assets purpose-built for constructing virtual worlds, along with hundreds of on-demand tutorials, documentation, and developer tools. It's also updating and bolstering its collection of Omniverse plug-ins for common 3D programs, such as PTC Creo, SideFX Houdini, Autodesk Alias, Autodesk Civil3D, and Siemens Xcelerator.
Speaking to the Avatars
Simon Yuen, Senior Director of Avatar Technology, NVIDIA Omniverse, believes avatars will become so ubiquitous in virtual worlds that talking to them will feel as natural as talking to humans. This points to the need for breakthroughs in natural language processing, computer vision, and facial animation, among other areas. “Everything must dynamically update and react to us in milliseconds, just like human conversations,” he said.
NVIDIA has developed a technology called Audio2Face, part of Omniverse. “It has an AI model that can create facial animation directly from voices,” explained Yuen. “We're going to expand our multi-language support. We're looking at improving different people's voice adaptability. So no matter what type of voice input goes in, the network will create the predicted facial animation even more accurately. We're also going to provide a training SDK.”
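The article does not detail Audio2Face's internals, but the data flow Yuen describes, speech audio in and per-frame facial animation out, can be sketched roughly as below. Every name here (the placeholder model, the 52-blendshape rig, the sample and frame rates) is a hypothetical assumption for illustration only and does not reflect the actual Audio2Face API.

# Hypothetical sketch of an audio-driven facial animation pipeline.
# None of these names correspond to the real Audio2Face interface.
import numpy as np

SAMPLE_RATE = 16_000   # audio samples per second (assumption)
FPS = 30               # animation frames per second (assumption)
NUM_BLENDSHAPES = 52   # e.g., an ARKit-style blendshape rig (assumption)

def predict_blendshapes(audio_window: np.ndarray) -> np.ndarray:
    """Placeholder for a trained audio-to-face model.

    A real model would map an audio window to blendshape weights;
    here we return zeros just to keep the sketch runnable.
    """
    return np.zeros(NUM_BLENDSHAPES)

def animate_from_audio(audio: np.ndarray) -> np.ndarray:
    """Slice audio into frame-aligned windows and predict weights per frame."""
    samples_per_frame = SAMPLE_RATE // FPS
    num_frames = len(audio) // samples_per_frame
    frames = []
    for i in range(num_frames):
        window = audio[i * samples_per_frame:(i + 1) * samples_per_frame]
        frames.append(predict_blendshapes(window))
    return np.stack(frames)  # shape: (num_frames, NUM_BLENDSHAPES)

if __name__ == "__main__":
    one_second_of_silence = np.zeros(SAMPLE_RATE)
    weights = animate_from_audio(one_second_of_silence)
    print(weights.shape)  # (30, 52)

The per-frame weights would then drive a rigged face in real time, which is why Yuen stresses millisecond-level responsiveness.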
At SIGGRAPH, NVIDIA announced the release of Omniverse Avatar Cloud Engine (ACE), a suite of cloud-native AI models for building and deploying interactive avatars. “Demand for digital humans and virtual assistants continues to grow exponentially across industries, but creating and scaling them is getting increasingly complex,” said Kevin Krewell, principal analyst at TIRIAS Research. “NVIDIA’s Omniverse Avatar Cloud Engine brings together all of the AI cloud-based microservices needed to more easily create and deliver lifelike, interactive avatars at scale.”
About the Author
Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.