Collaborative Simulation
Interoperability in simulation improves accuracy, reduces development times and fosters innovation.
May 30, 2024
Engineering design interoperability is crucial for the seamless integration of different simulation systems and software tools, enabling more effective and efficient development processes. By ensuring compatibility across various platforms and technologies, interoperability facilitates the sharing and exchange of data and functions.
This collaborative environment enhances the accuracy of simulations, reduces development times and fosters innovation, which makes it indispensable in tackling complex engineering challenges across diverse disciplines.
Chris Harrold is the program director, developer tools, for Ansys. He says there are two schools of thought about interoperability in software generally, and simulation is no different: either everything is open and works together, or everything is closed and protects its integrity.
“Even with significant advances in interoperability via the developer tools we are building (common language across all simulation tools and physics), the fact remains that data from one type of simulation does not, nor can it, move seamlessly from one type of solver to another,” says Harrold. “A lot of this is by design and some of it is just the nature of the data involved and the complexity of simulation as a discipline.”
Harrold continues, “Is it viable? Of course, and arguably, would be something users would want, provided it was effectively invisible to them (if it requires significant effort, it is no better than the current situation). Is it critical? In true developer fashion, the answer is, ‘it depends.’ I can name a half dozen places where all the big simulation tools exist side by side doing different things and customers do not seem to have an issue so great it prevents their success. So, until that pain catches up, there’s little incentive to be that open from any software provider.”
Interoperability in simulation is a means to help reduce design risk and overall cost in product and system design. Dan Papert is a project engineering advisor for the American Society of Mechanical Engineers (ASME) and product manager for its verification, validation and uncertainty quantification (VVUQ) portfolio.
He says that simulation interoperability is seeing widespread adoption in the engineering world for various reasons.
“From a business perspective, computational models significantly help reduce risk and cost associated with prototype construction and testing, accelerating product development,” Papert says.
He notes that improvements in computational efficiency and fidelity enable better-quality visualization, especially in complex use cases, and that the tools used to develop these models open up collaboration on common platforms.
“Interoperability and credibility in modeling and simulation is crucial for product development, particularly for high-risk applications,” Papert explains.
He shares an example of how nuclear power plant computational models are validated through experimental data gathered from scaled facilities where conditions are similar but scaled down. “This validation may include operating the experiment at lower powers and pressures, at a reduced size, or using other fluids. Best practices for scaling analysis during model validation are critical for determining applicability of these computational models to real-world conditions,” he says.
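To make the idea concrete, here is a toy sketch of the simplest kind of similarity scaling: choosing a reduced-scale test condition that preserves a dimensionless group such as the Reynolds number. The numbers are invented for illustration and are not drawn from any actual plant or ASME VVUQ procedure.

```python
# Toy illustration of similarity scaling: pick the test-loop flow
# velocity so a reduced-scale facility matches the full-scale plant's
# Reynolds number (Re = rho * v * L / mu). All values are made up.

def reynolds(rho, v, L, mu):
    """Reynolds number for density rho (kg/m^3), velocity v (m/s),
    characteristic length L (m) and dynamic viscosity mu (Pa*s)."""
    return rho * v * L / mu

# Hypothetical full-scale conditions (assumed hot-water properties).
rho, mu = 740.0, 9.0e-5
v_full, L_full = 4.0, 0.8   # full-scale velocity and pipe diameter

# Quarter-scale test section using the same working fluid.
L_test = L_full / 4

# Matching Re requires v_test * L_test == v_full * L_full,
# so the scaled loop must run four times faster.
v_test = v_full * L_full / L_test

print(f"Full-scale Re: {reynolds(rho, v_full, L_full, mu):.3e}")
print(f"Test-loop  Re: {reynolds(rho, v_test, L_test, mu):.3e}")
```

Real scaling analyses juggle several dimensionless groups at once, which is why the best practices Papert mentions matter: matching one group often means distorting another.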
Challenges of Interoperability
Interoperability offers advantages in digital engineering, but is not without its challenges. An obvious challenge is the compatibility of one engineering design solution or tool with another.
Ed Fontes is the chief technology officer for COMSOL. He says the foremost challenge is simply working across different formats.
“Generally, a challenge with interoperability is compatibility of different formats,” according to Fontes, who explains that CAD involves various industry standards that most software platforms adhere to.
“It is relatively straightforward to read and write in standard formats between different platforms, even though you may lose some information about the sequence of operations and the parameterization of the geometry design,” Fontes says. Other types of interoperability are not as straightforward, he adds. “There are simply no widespread standards that work for modeling and simulation files in general. Ideally, the data or underlying data would be compatible, but in order for interoperability to manifest, there must be compatible standard formats covering the different steps in an engineering workflow, for different branches of engineering.”
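As a minimal sketch of the exchange Fontes describes, the snippet below uses the open-source CadQuery package to round-trip a part through the neutral STEP format. The geometry survives the exchange; the feature history and parameters do not. The part and file names are invented for illustration.

```python
# Minimal sketch of standards-based CAD exchange using the open-source
# CadQuery package (pip install cadquery). STEP carries the resulting
# geometry between tools, but, as Fontes notes, the feature history and
# parameterization of the original design are lost in translation.
import cadquery as cq

# Build a simple parametric part: a plate with a hole.
plate = (
    cq.Workplane("XY")
    .box(80, 40, 10)   # parametric box feature
    .faces(">Z")
    .workplane()
    .hole(12)          # parametric hole feature
)

# Export to the neutral STEP format for another platform to consume.
cq.exporters.export(plate, "plate.step")

# Re-import: the shape survives, but it arrives as "dumb" geometry;
# the box/hole feature tree and its parameters are gone.
imported = cq.importers.importStep("plate.step")
print(imported.val().isValid())
```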
A big advantage of interoperability is the additive effect of combining the strengths of different software, but the data exchange must be well managed. Alex Graham is a senior marketing manager with Siemens Digital Industries Software’s Simcenter. He says that interoperability means connecting simulation models from different tools and engineering domains, which raises a range of data management challenges.
“These could be managing header alignment between export and import of data originating from multiple sources, coordinating data exchange during live co-simulation between two or more models, or in tracking, documenting and communicating the human process followed and parameters used in a complex study involving multiple models that are continuously evolving,” says Graham.
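The first of those cases, header alignment, can be as mundane as the hypothetical sketch below, which renames and unit-converts columns from one solver’s CSV export into the schema a second tool expects. Every column name, unit and file name here is invented for illustration.

```python
# Hypothetical sketch: align column headers and units between a CSV
# exported by one simulation tool and the schema another tool imports.
import pandas as pd

# Mapping from the exporting tool's headers to the importing tool's schema.
HEADER_MAP = {
    "Time [ms]": "time_s",
    "Temp_K": "temperature_K",
    "Press(bar)": "pressure_Pa",
}

df = pd.read_csv("solver_a_results.csv")
df = df.rename(columns=HEADER_MAP)

# Reconcile units as well as names: ms -> s, bar -> Pa.
df["time_s"] = df["time_s"] / 1000.0
df["pressure_Pa"] = df["pressure_Pa"] * 1.0e5

df.to_csv("solver_b_input.csv", index=False)
```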
Siemens Digital Industries helps engineers tackle such challenges arising from model interoperability in two ways. “The first, and an underlying principle of Siemens software, is to provide an open ecosystem which does not tie users into a particular software suite or vendor offering,” he says.
“This is achieved with a highly customizable framework and by supporting standard model and data exchange formats such as the functional mock-up interface (FMI) and the Open Neural Network Exchange (ONNX) format,” Graham says. “The second is to equip engineers with highly capable tools that can manage simulation processes and data and orchestrate data exchange connections between different CAE software packages.”
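FMI, in particular, packages a model as a functional mock-up unit (FMU) that any compliant tool can load and run. The sketch below uses the open-source FMPy library to inspect and simulate an FMU; the file name is a placeholder, and the FMU could have been exported from any FMI-compliant package.

```python
# Minimal sketch of consuming an FMI model with the open-source FMPy
# library (pip install fmpy). "vehicle_model.fmu" is a placeholder for
# an FMU exported from any FMI-compliant tool.
from fmpy import read_model_description, simulate_fmu

fmu = "vehicle_model.fmu"

# The model description is defined by the FMI standard, so any
# compliant importer can discover the model's variables the same way.
desc = read_model_description(fmu)
print(desc.modelName, desc.fmiVersion)
for var in desc.modelVariables[:5]:
    print(var.name, var.causality)

# Run the FMU with this tool's solver; another FMI-compliant tool could
# run the same file, which is the point of the standard.
result = simulate_fmu(fmu, start_time=0.0, stop_time=10.0)
print(result.dtype.names)  # structured array of time plus outputs
```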
Interoperability With a Digital Twins Strategy
A notable benefit of interoperability in simulation is that it offers a pathway to more effective digital twins. Digital twins have come of age and are used across many different industries, and many firms are building and relying on a digital twin strategy. This is one area where interoperability in simulation may play a prominent role.
Wesley Hand is a systems design solutions segment manager with Keysight Technologies. He points out that simulation interoperability is not only viable, but crucial to the modern digital twin strategy needed for a shift-left design approach.
“Designers need to use the best-in-class simulation tools to drive complex designs,” he says. “Simulation tools often have a focus where they excel, and by combining the various simulators, the designers can achieve maximum fidelity and accuracy in simulation results, which reduces the overall need to produce multiple prototype iterations [and dramatically reduces] the costs of development and accelerates time to market.”
He adds that, as it applies to interoperability and digital twin development, product development involves both design (e.g., simulation) and verification (e.g., testing).
“A critical concept of digital twins is the continuous feedback loop between these two components,” says Hand. “The better aligned the tools are between design and verification, the more accurate the simulations become. We use the same IP for driving simulations and verifications, from algorithms that predict the designs to the measurement science used to validate the results, to the analysis tools to visualize the results from both. This gives the designer the ability to do apples-to-apples comparisons. The interoperability among simulators also extends into the field of verification using hardware-in-the-loop techniques to more accurately take measurements using best-in-class measurement tools and simulation data.”
In building digital twins and bringing a virtual model forward, interoperability depends on the confluence of many factors in the design phase. To efficiently engineer innovative products with confidence, original equipment manufacturers (OEMs) require a modern, virtual approach to industry processes, one that facilitates a holistic, interdisciplinary way of working during the design phase, with concurrent engineering across the electronics, electrical and mechanical domains, and collaborative teamwork that includes extensive virtual testing and validation, says Emmanuel Leroy, executive vice president and chief product and technology officer at ESI Group (part of Keysight Technologies) in France.
“We are entering a golden age of product development,” says Leroy. “The latest generation of solutions combined with a properly designed digital thread is enabling practically sci-fi-level capabilities. We’ve come a long way from hand-coded [finite element analysis (FEA)] meshes to now working with virtual reality and the metaverse, and ESI Group has participated in every step along the way. Manufacturers and vendors commonly focus on building a digital thread that unites the different product realization disciplines, bringing simulation onto the critical path and ensuring that the right trade-offs between accuracy and time to market are made.”
Dean Palfreyman is the senior director for SIMULIA strategy at Dassault Systèmes. He says that data interoperability is extremely important for performing coupled-multiphysics simulation, model-based systems engineering (MBSE) as well as integrating the simulation applications with 3D design and data management applications.
“There is a critical need within the manufacturing industry to improve efficiency and simulation reliability by bringing design and simulation technologies closer together. At Dassault Systèmes, we are meeting this demand through unified modeling and simulation, which we refer to as MODSIM.”
Palfreyman says that product development and manufacturing companies face an ongoing challenge in seamlessly unifying product designs with multiphysics simulation to evaluate complex product and system scenarios and gain deeper insight into product and system behavior.
“Interchanging data seamlessly between multiple teams (such as designers interacting with simulation analysts) throughout the product development process is key to expanding the use of simulation. The success of ‘simulation democratization’ hinges on transparent, seamless and trustworthy data interoperability.”
Interoperability, Simulation, and Confidence in Design
Simulation, analysis and interoperability lead to better designs, better products, and better systems. It’s all about using solutions and tools at hand via interoperability, and then building confidence in a design via testing and validation.
“Simulation and analysis must reflect this reality, allowing designers to clearly visualize the consequences of their design decisions across multiple systems,” says ESI Group’s Leroy. “Otherwise, the verification and validation results are not aligned, [and] engineers will need to continue relying on physical testing, thus undermining the entire commitment to the safety certification process and sustainability goals.”
He says the key is virtual prototyping and concurrent collaborative engineering: “validating performance at a system level as manufactured, and even being able to include, as early as possible, the constraints from the final assembly line and from the serviceability of the product.”
About the Author
Jim Romeo is a freelance writer based in Chesapeake, VA. Send e-mail about this article to [email protected].