Lauren Goode, Senior Writer at Wired, interviewed NVIDIA CEO Jensen Huang at SIGGRAPH. Image courtesy of NVIDIA.

NVIDIA Releases OpenUSD and Generative AI Microservices to Attract Omniverse Developers

NVIDIA Promotes Omniverse APIs for Generative AI and OpenUSD at SIGGRAPH

At SIGGRAPH 2024, NVIDIA CEO Jensen Huang was first interviewed by Lauren Goode, Senior Writer at Wired, and then in turn interviewed Mark Zuckerberg, CEO of Meta. During his talk with Goode, Huang showed how users can generate 3D characters, objects, and scenes automatically from text or spoken prompts, part of a collaborative project with stock-content provider Shutterstock and marketing firm WPP.

“We taught AI how to speak OpenUSD [3D file format for NVIDIA Omniverse]. So the girl [the user] is speaking to Omniverse, Omniverse can generate USD, then Omniverse uses the USD prompt to find the object from its catalog, and then Generative AI uses these conditions to generate the scene. So the work that you do will be much, much better controlled,” said Huang.
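To make the pipeline Huang describes more concrete, the sketch below shows the kind of OpenUSD output a text-to-scene step could produce. It is a minimal illustration using the open-source usd-core Python package, not NVIDIA's actual Omniverse pipeline; the prim names and the catalog asset path are hypothetical.

```python
# Minimal sketch: building a USD stage the way a prompt-to-scene step might.
# Uses the open-source OpenUSD Python API (pip install usd-core).
# The catalog asset path and prim names are hypothetical.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("generated_scene.usda")  # the USD "document" Omniverse would load
UsdGeom.Xform.Define(stage, "/World")                # root transform for the scene

# Prompt: "a crate in the scene" -> reference an asset from a catalog by path.
crate = stage.DefinePrim("/World/Crate", "Xform")
crate.GetReferences().AddReference("catalog/props/crate.usd")  # hypothetical catalog asset

stage.GetRootLayer().Save()
print(stage.GetRootLayer().ExportToString())         # inspect the generated USD text
```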

Such on-demand content generation is already possible with 2D AI programs such as Midjourney, DALL-E, and Microsoft Image Creator. Replicating the same workflow in 3D is expected to radically change the way people design and model 3D objects in manufacturing, engineering, architecture, film, and games.

During his talk with Huang, Zuckerberg said, “Just like every business now has an email, in the future, every business will probably have an AI agent.” Echoing this, Huang said, “Everybody can have an AI. In our company, I want every engineer, every software developer to have an AI.” 

In Meta's WhatsApp chat program, the [/imagine] prompt allows you to generate images based on text using Meta AI. 

For more NVIDIA news from SIGGRAPH, read “NVIDIA Advances Humanoid Robotics Development.”

More NIMs to Drive Omniverse-Based Development

During SIGGRAPH, NVIDIA announced the release of more NIMs (NVIDIA Inference Microservices), APIs that allow developers to create and offer applications built on NVIDIA technology components such as Generative AI and OpenUSD. NVIDIA writes, “The world’s first generative AI models for OpenUSD development, developed by NVIDIA, will be available as NVIDIA NIM microservices. The models enable developers to incorporate generative AI copilots and agents into USD workflows, broadening the possibilities in 3D worlds and helping speed the adoption of USD across a new range of industrial sectors, like manufacturing, automotive and robotics.”
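As an illustration of what consuming a NIM looks like, here is a hedged sketch of calling one over HTTP. Many NIM containers expose an OpenAI-compatible chat-completions endpoint; the URL, model name, and prompt below are placeholders rather than the documented interface of the OpenUSD microservices announced at SIGGRAPH.

```python
# Hedged sketch: calling a NIM microservice over HTTP.
# Assumes an OpenAI-compatible /v1/chat/completions endpoint, which many
# NIM containers expose; the URL and model name below are placeholders.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # hypothetical local deployment

payload = {
    "model": "usd-code",  # placeholder model identifier
    "messages": [
        {"role": "user",
         "content": "Write USD that places three pallet racks along the north wall."},
    ],
}

response = requests.post(NIM_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])  # the generated USD or code
```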

The new Generative AI NIM uses USD as its 3D language to create objects and scenes based on user input. Though it originated in the film industry, USD is seeing growing adoption in manufacturing and engineering. Siemens, for example, has a partnership with NVIDIA to integrate OpenUSD workflows into its Simcenter portfolio. NVIDIA cofounded the Alliance for OpenUSD (AOUSD) along with Pixar, Adobe, Apple, and Autodesk.

Previously, NVIDIA demonstrated the ability to generate standard warehouse items from natural language prompts with its AI Room Generator Extension. The code is available on GitHub and is powered by NVIDIA DeepSearch and GPT-4. When creating environments such as warehouses and reception areas, instead of dragging and dropping items from a library, users can type prompts such as “add common items found in a warehouse/reception area.”
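In the same spirit, the sketch below shows roughly how a prompt-to-items step might work: a language model expands a room description into concrete asset queries that an asset search such as DeepSearch could then resolve. The function name, model call, and JSON format are assumptions for illustration, not the extension's actual code.

```python
# Illustrative only: expanding a room prompt into asset search queries with an LLM.
# Roughly in the spirit of the AI Room Generator Extension (LLM + NVIDIA DeepSearch);
# the function name, model, and expected JSON output are assumptions.
import json
from openai import OpenAI

client = OpenAI()  # assumes an OpenAI-compatible endpoint and API key are configured

def items_for_prompt(prompt: str) -> list[str]:
    """Ask the model to turn a room description into short 3D-asset search queries."""
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Reply with only a JSON array of short search queries for 3D props."},
            {"role": "user", "content": prompt},
        ],
    )
    return json.loads(completion.choices[0].message.content)

# Each query would then go to an asset search (e.g., DeepSearch) and the hits
# would be referenced into the USD stage.
print(items_for_prompt("add common items found in a warehouse"))
```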

NVIDIA releases APIs for Generative AI based on USD. Image courtesy of NVIDIA.

About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
