May 1, 2013
If you ask Frank Popielas, Dana Holding’s senior manager for global CAE, to list the types of simulations he performs, you’d better grab a cup of coffee and sit down. It’ll take him a while just to get through the primary ones: “structural simulation to look for stress spots, injection molding simulation for manufacturing process and part geometry optimization, gas flow and cooling flow analysis, steady state and transient conditions, thermal expansion and thermal distribution perspective…” and he goes on. This list covers just a fraction of the spectrum the Dana CAE team uses routinely to virtually develop and validate the functions and performance levels of the drivetrain, sealing and thermal management equipment designed and developed by the company.
An expanded view of the results from a crash test, part of MSC Software’s SimManager software.
Marc Hertlein, BMW’s project manager for simulation data and processes, has a similar list: “crash simulation, noise, vibration and harshness (NVH), driveline strength, pedestrian protection, head impact, exterior component, forming process, production process…” he begins.
Venkateswara Rao Pechetti works for Mando Softtech India, a Korean automotive component supplier. As an executive engineer II on the company’s CAE team, he, too, has his own list: “CAE FE-modeling for strength, NVH, durability, fatigue…” he starts.
Those who’ve witnessed the evolution of product lifecycle management (PLM) might see history repeating in the burgeoning simulation lifecycle management (SLM) market. Like PLM, SLM began as an attempt to clean up the data warehouses. In SLM’s case, it’s to sort and archive the mounds of data generated from repeated simulation so engineers can, when necessary, retrieve, refer to and consult past exercises for guidance.
Like PLM, SLM quickly ballooned into process management. The multidisciplinary approach—involving multiple experts investigating the design’s fitness using multiple software packages from competing vendors—is adding complexity to the process. The repetitive nature of simulation—subjecting the same design to slightly different load variables to find the best option—increases the volume of data to sort afterward.
If simulation’s big-data problem is exacerbated by insufficient process control, the solution is to turn the problem on its head. The experiences of Dana’s Popielas, BMW’s Hertlein and Mando’s Pechetti show that, with proper simulation protocols in place, the data either shrinks to a manageable size or becomes intelligible for subsequent reuse.
What Exactly Are You Managing?
Keith Meintjes, Ph.D., practice manager for simulation and analysis at the analyst firm CIMdata, isn’t wild about the term SLM because it sounds like a close relative to PLM. The two terms, in his view, are distinctly different.
“The problems for managing CAE data are very different than for managing CAD data,” he points out. “SLM usually requires different business processes than the organization has already established in PLM.”
For Meintjes, SLM is about doing the right simulation at the right time. “Process automation is important here in terms of simplifying the application of simulation, capturing IP and enabling uniform best practices (standard work),” he notes. “It’s everything from using the right mesh and turbulence model to gaining understanding and support at the highest levels of engineering management that simulation is a strategic capability.”
Trimming the Data
According to the estimate of BMW’s Hertlein, the automaker produces roughly six petabytes (that’s 6 million gigabytes) of simulation results yearly. His challenge, he says, is “to reduce it to one petabyte of relevant data.”
It’s also a race against time: Hertlein reveals that BMW is planning to double the number of car projects for his group. So while he’s looking for ways to trim the data to a manageable size, the volume of data generated is expected to grow.
Dana’s Popielas points out that without proper simulation management, “you don’t know what you have anymore. You won’t know where the data came from. You’re getting lost.” Data output between 20GB and 25GB is normal for a single simulation job, he estimates.
“Do you really want to save everything? Most of the time, you don’t,” Popielas continues. “So you save the input stats—what you put into the simulation, how you went about it—and the end report.”
Mando’s Pechetti agrees. “It’s really hard to differentiate/manage/retrieve the data, especially when there are small changes,” he says.
The minor differences in the CAD models or the incremental changes to the loads and pressures in simulation runs are not easily detectable by the naked eye. But in Mando’s iterative workflow, multiple simulation runs with such minute changes are essential to find the best product configuration.
As each design is likely to change five to six times over the course of the project, Mando’s data archival issues have become more complex. This can become a headache when the virtual simulation results have to be compared with physical test results. “To find the data, we’d go deep into the project folders,” Pechetti recalls. “I knew the data was there, but it took too much time, too many tries to find it.”
Many managers overseeing simulation projects quickly come to realize it’s not practical to maintain an ever-expanding database of simulation result files, let alone sort them in a manner that’s comprehensible for subsequent use. A more sensible approach is to create a virtual environment, essentially a browser-like interface, where users can browse and compare analysis jobs, along with their histories, input parameters and end results.
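What might such an archive record look like? Here is a minimal sketch in Python, assuming a hypothetical setup in which each job stores its input parameters and a pointer to the end report rather than the raw result files. The class and field names are illustrative; they are not taken from any vendor’s SLM product.

```python
# Illustrative only: a minimal record of a simulation job that keeps the
# inputs and the end report rather than every result file.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SimulationRun:
    project: str
    run_id: str
    solver: str                      # e.g. "Abaqus", "RADIOSS"
    geometry_version: str            # which CAD revision was meshed
    inputs: dict = field(default_factory=dict)   # loads, materials, boundary conditions
    report_path: str = ""            # pointer to the summary report, not the raw output
    run_date: date = field(default_factory=date.today)

def changed_inputs(a: SimulationRun, b: SimulationRun) -> dict:
    """Show only the input parameters that differ between two runs."""
    keys = set(a.inputs) | set(b.inputs)
    return {k: (a.inputs.get(k), b.inputs.get(k))
            for k in keys if a.inputs.get(k) != b.inputs.get(k)}

# Hypothetical example: two sealing analyses that differ only in bolt preload.
base = SimulationRun("cylinder_head", "run_041", "Abaqus", "rev_C",
                     inputs={"bolt_preload_N": 25000, "temp_C": 120})
variant = SimulationRun("cylinder_head", "run_042", "Abaqus", "rev_C",
                        inputs={"bolt_preload_N": 27500, "temp_C": 120})
print(changed_inputs(base, variant))   # {'bolt_preload_N': (25000, 27500)}
```

Comparing records this way makes the small variations Pechetti describes, such as the slightly different bolt preload above, visible without digging through project folders.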
Standardizing the Process
Setting up a simulation scenario means translating a real-world physical event—say, a car crash—into electromechanical forces, thermal loads, pressures and material attributes that can be computed mathematically. It’s a specialty that takes time to develop, and most firms only have a handful of dedicated experts.
BMW uses MSC Software’s SimManager software to manage simulation. Shown here is a series of simulation jobs managed in SimManager, along with thumbnail views of results.
“Simulation, never mind SLM, is a problem for smaller enterprises,” says CIMdata’s Meintjes. “They do not have IT departments to integrate all this stuff, and they do not have methods groups to figure out the ‘how’ of simulation.”
BMW’s Hertlein proposes a hypothetical scenario: “Just imagine that every engineer starts up his preprocessor and builds a simulation model in his own way. Is he using the most current geometry? Current materials? Does he follow the regulatory requirements? You won’t know.”
The repetitive nature of simulation points to the possibility of a template-driven approach. By reducing a complex simulation job to a predefined workflow with a series of variables that can be altered on demand, you can make it more accessible to the workforce. With the process standardized in this way, the output and the history of simulation jobs become easier to manage.
If an automaker wants to market its vehicles in the U.S., for example, it would need to submit proof that the car meets the requirements of the Insurance Institute for Highway Safety. These requirements, Hertlein points out, should be part of the simulation template or workflow managed in an SLM tool. BMW uses MSC Software’s SimManager.
For consistency in simulation, BMW uses what it describes as a “lead car”—a primary vehicle model for each of its brands. In individual simulation runs, the lead car is reconfigured with different components so it can stand in for a station wagon, a convertible or another model. But the primary model remains the same; therefore, so do many of the attributes inherited from it. This approach, made possible by SimManager, allows BMW to cut down on the size and complexity of its simulation jobs, Hertlein says.
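The lead-car idea can be pictured as a base configuration that derivatives override selectively. The short Python sketch below is only an analogy; the parameter names and values are hypothetical and do not describe how SimManager represents BMW’s models.

```python
# Illustrative only: a template-driven setup in the spirit of the "lead car"
# approach. The base configuration is fixed; each derivative overrides only
# the parameters that differ. All names and values here are hypothetical.
LEAD_CAR = {
    "body_style": "sedan",
    "mesh_size_mm": 5.0,
    "material_library": "steel_v12",
    "regulatory_checks": ["frontal_impact", "pedestrian_protection"],
}

def configure_variant(overrides: dict) -> dict:
    """Return a full simulation setup: lead-car defaults plus the overrides."""
    config = dict(LEAD_CAR)      # copy, so the lead car itself is never modified
    config.update(overrides)
    return config

wagon = configure_variant({"body_style": "station_wagon"})
convertible = configure_variant({"body_style": "convertible",
                                 "regulatory_checks": ["frontal_impact", "rollover"]})
print(wagon["material_library"])   # inherited unchanged from the lead car
```

Because every variant starts from the same defaults, any attribute that is never overridden is guaranteed to match the lead car, which is the consistency Hertlein describes.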
Time Regained
Simulation is both compute- and time-intensive. Complex simulation jobs can only run on high-performance computing (HPC) servers. Even when HPC equipment is involved, jobs still take hours, days and sometimes weeks to complete. Choreographing the job queue’s ebbs and flows to match the drumbeat of milestones and deadlines is an art in itself.
Mando’s Pechetti says he can now monitor his simulation jobs in Altair Engineering’s HyperWorks Collaboration Tools. “I can assign jobs. I can monitor progress. I can see who’s started and completed the job without approaching the candidate or his or her workstation,” he says.
The variations within each project’s simulation jobs are now managed with HyperWorks Collaboration Tools’ version-control safeguards. All projects reside on Mando’s shared server, accessible to engineers and project managers in different time zones around the world.
This keeps the time spent hunting for reports, inputs and histories to a minimum, Pechetti says. For each simulation run, he used to spend three to four hours just importing data and verifying that it was the right data. Now, he puts this time to better use by doing quality assurance on completed projects.
“Before we go into full-blown expansion of simulation into all engineering disciplines, we need to have tools to manage the data, the process, and the decision-making,” notes Dana’s Popielas. “We chose SLM from SIMULIA for that.” SIMULIA, Dassault Systemes’ application for realistic simulation, offers a software product for managing the simulation lifecycle. The product itself is dubbed SLM.
“Looking for information is where you can spend a significant amount of your time in simulation,” Popielas adds. “We’re easily saving 20% of that time by deploying SLM.”
According to Popielas, Dana chose SIMULIA SLM because “the software is visually intuitive. It’s in Dassault Systemes’ 3DLive environment. You don’t have to spend a lot of time learning it. This helps a lot in getting adoption among users across all engineering disciplines.”
PLM vs. SLM
Current SLM solutions are supplied by two primary groups: PLM vendors (for example, Dassault Systemes) and simulation software vendors (for example, Altair Engineering and MSC Software). PLM vendors learned long ago that forcing customers to migrate to their own design software or that of their partners wouldn’t work. Most manufacturers juggle multiple CAD systems, and prefer a PLM system that can accommodate a wide variety, including software titles from the PLM vendor’s rivals. SLM suppliers are now at a similar crossroads.
“The SLM and simulation software user base is very diversified,” Popielas says. “That’s why you need to have an open SLM environment to be able to communicate and link up with all the different applications seamlessly. This is also the reason why the end-user base is looking into initiatives to define exchange standards between simulation packages. Vendors are welcome to participate in those initiatives actively, and they do so.”
The simulation data management environment in Dassault Systemes’ SIMULIA, showing results, actions performed and remarks. The screenshot shows a project involving cylinder head preprocessing jobs at Dana Holding. Image courtesy of Dana Holding.
BMW uses MSC Software’s SimManager to manage simulation data, but the company also employs Dassault Systemes’ Abaqus Unified FEA software for vehicle safety simulations. Mando Softtech India uses Altair’s HyperWorks Collaboration Tools to manage its simulation jobs, but it relies on both Altair’s RADIOSS and Dassault Systemes’ Abaqus to perform simulation. SLM products, therefore, serve the user better if they’re developed to accommodate simulation software from a variety of suppliers, including SLM software makers’ rivals.
Some companies treat SLM as a big-data problem. Others contend that the bigger priority is to reduce complexity in the simulation process, to make it more accessible.
Meintjes cites two sets of companies as examples: “Tecplot (Chorus), VCollab and Altair Engineering (HiQube) for the big-data problem. Autodesk, ESI, ANSYS and MSC Software for democratization.” He’s not endorsing the companies named, he clarifies, but merely presenting examples of the different innovative approaches they offer to tackle the problem of simulation usability.
Meintjes says he believes the answer is not to automate and cloak the experts’ process for others to use: “It is to understand how others, like designers and field engineers, can be helped to get their work done.”
Another conundrum for SLM vendors is satisfying both extended-enterprise and workgroup requirements, as the need tends to grow from workgroup to enterprise. BMW, for example, began using SimManager at the department level, first for NVH and crash simulation only, before expanding it to other types of simulation.
“PLM vendors understand the enterprise; CAE vendors understand the workgroup. Neither, in my opinion, has yet bridged the chasm,” Meintjes says.
Staying Small to Be Competitive
Many might consider BMW one of the automotive manufacturing titans, but the company doesn’t think of itself as such. “BMW is a small company compared to its competitors, but we’re the most successful,” Hertlein says.
At conferences, Hertlein has met his counterparts from rival carmakers, who seemed puzzled by BMW’s approach. “Last year in Detroit, someone told me, ‘If we have to increase our development process, we hire more people.’ I told them, ‘That’s not what BMW does,’” he relates. “We’re trying to streamline and effectively use our resources. SLM is a big part of the reason we’re able to stay small and be as successful as we are.”
Kenneth Wong is Desktop Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at deskeng.com/facebook.