Generative AI Enters Next Phase with Natural Language

It’s not about typing or shouting executable commands.

NVIDIA Edify, a generative AI program, will likely become part of 3D content generation based on natural language prompts. Image courtesy of NVIDIA.


Generative design (GD) made finite element analysis (FEA) and computational fluid dynamics (CFD) much simpler by automating and hiding many aspects of the simulation workflow. Instead of manually setting up boundary conditions one parameter at a time and constructing scenarios one by one, users can specify the desired outcome (a target percentage in weight reduction, for example) and let the software identify the best topology.

Now, the introduction of natural language in design and simulation software is about to transform the process further. Users can expect simpler, easier user interfaces (UIs), driven less by menus and dialog boxes and more by text or voice input. But it’s not about typing or shouting the commands you would otherwise select from a menu.

Does Your FEA Speak English?

In July 2023, Ansys launched AnsysGPT, a limited beta version of its AI-powered chatbot for tech support. “Developed using state-of-the-art ChatGPT technology available via the Microsoft Azure OpenAI Service, AnsysGPT uses well-sourced Ansys public data to answer technical questions concerning Ansys products, relevant physics and engineering topics within one comprehensive tool,” states Ansys in its announcement.

This April, Ansys officially released AnsysGPT. “The updated release follows rigorous testing of response accuracy, performance and data compliance. AnsysGPT captures knowledge from new public sources, including product documentation, product and engineering-related training documentation, FAQs, technical marketing materials and public Ansys Learning Forum discussions. Additionally, the upgraded infrastructure offers enhanced security and scalability to accommodate thousands of users,” Ansys reports.

“AnsysGPT acts as a virtual assistant providing instant responses to customers’ support questions in their preferred language,” says Ilya Tolchinsky, lead product manager, Ansys. “It’s useful, but basically, it’s just fetching information.”

For the software to move beyond answering how-to type questions and begin performing user-requested tasks, Tolchinsky believes it needs to be able to generate executable code based on natural language input.

“Any user action within Ansys products can be translated and represented as executable Python code,” Tolchinsky says.

As it turns out, Ansys has already laid the groundwork for it. In 2022, Ansys developed PyAnsys, a family of Python libraries that lets users interact with Ansys products. “We’ve already built a version that can generate 80 percent of the required code and will be able to make it available soon. But the rest takes time, because we need to make sure all the commands are executed well and reliably. We will get there soon. When we get there, it will be like having an expert copilot that can perform tasks for you,” says Tolchinsky.
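To make the idea concrete, here is a minimal sketch of the kind of script a code-generating copilot might emit through PyAnsys (specifically the PyMAPDL library) for a request such as “fix the base of this block and press down on its top face with 100 N.” The geometry, material values and load are illustrative assumptions for this article, not an Ansys example.

```python
# Illustrative sketch only: the kind of PyAnsys (PyMAPDL) script an AI copilot
# might generate from a natural language request such as "fix the base of this
# block and press down on its top face with 100 N."
# Assumes ansys-mapdl-core is installed and a licensed MAPDL instance is available.
from ansys.mapdl.core import launch_mapdl

mapdl = launch_mapdl()

# Build a simple 1 m steel block (hypothetical geometry and material values)
mapdl.prep7()
mapdl.et(1, "SOLID185")
mapdl.mp("EX", 1, 2e11)        # Young's modulus, Pa
mapdl.mp("PRXY", 1, 0.3)       # Poisson's ratio
mapdl.block(0, 1, 0, 1, 0, 1)
mapdl.esize(0.1)
mapdl.vmesh("ALL")

# "Fix the base": constrain all DOFs of the nodes on the bottom face (z = 0)
mapdl.nsel("S", "LOC", "Z", 0)
mapdl.d("ALL", "ALL")

# "Press down on the top": apply a downward nodal force on the top-face nodes
# (a production script would distribute the 100 N total across these nodes)
mapdl.nsel("S", "LOC", "Z", 1)
mapdl.f("ALL", "FZ", -100)
mapdl.allsel()

# Solve a static analysis
mapdl.run("/SOLU")
mapdl.antype("STATIC")
mapdl.solve()
mapdl.exit()
```

The specific commands matter less than the principle Tolchinsky describes: every mouse click already has a scriptable Python equivalent, which gives a language model something concrete to generate and the software something unambiguous to execute.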

Tech giants working with large language models (LLMs) include NVIDIA, Microsoft and Amazon. As a partner, Ansys also has access to their LLM-driven features.

Tolchinsky says, “We’re developing some of our own technology, but where it makes sense, we’ll also leverage what’s available from our partners.”

An Advisor by Your Side

The standard interface in CAD and FEA programs presents objects in a 3D view. The mouse has proven quite efficient at rotating objects and selecting topology features, such as a hole on a top surface.

“We’re going to augment it partly with natural language input, be it voice or text, but I suspect nobody wants to be talking or texting the software all the time. When you need to select or rotate, you’ll mostly want to do it by pointing and clicking,” says Tolchinsky. “But once you’ve selected an item, you may want to just say or type, ‘double its size,’ for example.”

The typical GD input parameters, such as temperatures and pressures, will still be required, but in some instances, users may be able to give verbal or text commands to move the process along, as Tolchinsky sees it.

At GTC 2024, NVIDIA announced microservices to address a range of functions, including natural language processing. Image courtesy of NVIDIA.

“It would be like having a colleague who is an advanced user sitting next to you. Natural language input will certainly reduce the amount of clicking,” says Tolchinsky. “But what’s more interesting is, when you get back the GD results, you could have a conversation with the chatbot. You might ask, for instance, why is there a temperature spike in a certain spot, and how do you reduce it?”

For new users, Tolchinsky thinks the ability to chat or text will make GD less intimidating. “Let’s say you want to perform a drop test on a design. You can just ask, ‘How do I do this?’ and the chatbot can guide you through it, step by step,” he says.

The integration, Tolchinsky expects, will be welcomed by experienced and new users alike.

“For the expert, we can eliminate a lot of the boring, repetitive tasks; for the new users, it will be like having an expert guide,” he says.

Omniverse APIs

At the GPU Technology Conference (GTC) 2024, NVIDIA CEO Jensen Huang announced cloud application programming interfaces (APIs) for Omniverse, its immersive visualization and simulation environment. Partners such as Siemens, Cadence and Ansys are expected to take advantage of these APIs.

The APIs are part of NVIDIA’s long-term strategy for Omniverse, according to Mike Geyer, head of digital twins at NVIDIA. “Omniverse is finding success in the market through integrations with software from NVIDIA [independent software vendor (ISV)] partners. Natural language processing is among the many microservices we provide at NVIDIA,” he says. With natural language input, Geyer believes user interaction with complex CAD and FEA programs will become much more natural—“like talking to a chatbot.”

At GTC 2024, Siemens announced it will be integrating NVIDIA APIs into two of its products.

“In collaboration with NVIDIA, we will bring accelerated computing, generative AI and Omniverse integration across the Siemens Xcelerator portfolio,” says Roland Busch, CEO of Siemens.

Dassault Systèmes, another ISV partner, also demonstrated how it was incorporating NVIDIA APIs into 3DEXCITE. In a web-accessible interface, the app allows users to generate automotive designs and the desired backdrops using natural language. The app is powered by NVIDIA Edify, trained on Shutterstock’s licensed creative data, according to Dassault.

“We think generative AI will have the biggest impact on the interface between people and machines,” says Tom Acland, CEO of 3DEXCITE, a Dassault Systèmes brand. “It will power more fluid, [and] more natural collaboration with computers.”

Some SIGGRAPH 2023 conference attendees already got a glimpse of what natural language-powered design might look like. At the conference, NVIDIA invited a select group of automotive designers to showcase a new workflow based on Stable Diffusion. The prototype application accepts both text strings and images as input to generate professional-looking 2D automotive sketches, complete with backgrounds.

“The AI-generated design sketch may not be 100 percent of what you want, but it’s going to create new and unexpected directions critical to design exploration,” Peter Pang, senior product manager, virtual and augmented reality at NVIDIA, says.

Thinking Beyond Dictation

Converting voice into text is no longer a technical barrier; it’s a feature available in many office productivity software offerings today. For design and simulation software developers, however, replacing mouse and keyboard input with text or voice commands (for example, saying or typing “extrude” instead of selecting the “extrude” command) is not the best use of natural language. Rather, it shortchanges the true potential of the new paradigm.

The goal should be to allow users to “have a natural conversation with the software,” says Geyer. “It reduces the learning curve for programs like FEA or CFD.” Since NVIDIA microservices are cloud APIs, “for ISVs with cloud architecture, it can be much more approachable to implement these APIs,” Geyer adds. “One issue they need to address involves training data—you need a good collection of data for training purposes.”

At the moment, if you want to run a CFD study, you would likely be required to specify the surfaces and regions where the loads apply, along with specific numerical values. This approach demands expert knowledge. But imagine being able to ask the software, “Based on the geometry I have uploaded, what setup should I use for an external airflow analysis?”
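One way to picture what could sit behind that kind of exchange is a plain LLM API call that turns the question into a structured setup suggestion a solver script could consume. The sketch below is a speculative illustration, not a feature of any vendor’s product; the model name, prompt wording and JSON fields are all assumptions.

```python
# Speculative sketch: asking a general-purpose LLM to propose an external
# airflow CFD setup as structured JSON, in the spirit of the conversation
# described above. Model name, prompt and fields are illustrative assumptions.
import json
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM = (
    "You are a CFD setup assistant. Given a description of a geometry, "
    "reply with JSON only, containing: solver_type, inlet_velocity_m_s, "
    "outlet_condition, turbulence_model and wall_treatment."
)

user_request = (
    "Based on the geometry I have uploaded (a passenger-car body, about "
    "4.5 m long), what setup should I use for an external airflow analysis "
    "at highway speed?"
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": SYSTEM},
        {"role": "user", "content": user_request},
    ],
)

# Parse the suggested setup; a real tool would validate the model's reply
# before passing the values to the solver's scripting API.
setup = json.loads(response.choices[0].message.content)
print(setup)
```

In a shipping tool, the returned values would still have to be checked against the solver’s own setup rules and the user’s actual geometry before anything runs.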

About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
