September 1, 2019
Boosted by new, easier-to-use tools and more readily available compute resources, simulation technology is being used more often, earlier in the design process and by a wider array of professionals.
However, the path to truly democratized simulation requires both technological and cultural changes. Digital Engineering spoke with some leading industry organizations about progress toward this democratization, and some of the remaining challenges and opportunities.
We spoke to Matthew Ladzinski, vice president, Americas and special projects, NAFEMS; Joe Walsh, CEO of the ASSESS Initiative; Malcolm Panthaki, co-founder and executive committee member, Rev-Sim.org and VP of Analysis Solutions at Aras; and Chris Smith, senior analyst and cloud computing SME at Smart2Market.
Digital Engineering: We talk a lot in the industry about the “democratization of simulation.” How do you see this process advancing in the industry? Is this a widespread trend, or more prevalent in certain markets?
Ladzinski: First off, I think it’s important to understand what is meant by democratization of engineering simulation. This is not simply the deployment of simulation tools to non-experts. To bring simulation up front in the design process and avoid costly, time-consuming bottlenecks, expert knowledge capture and reuse are critical to developing a safe and reliable environment in which users can run a model without a deep understanding of the underlying technology.
In most cases, only a small percentage of models lend themselves well to this concept because of repeated use and business impact. What we’re seeing are numerous examples of companies, both small and large and representing all industries, that are leveraging this concept of “democratization” at a small scale but realizing significant savings in both time and cost.
Walsh: The changing role of engineering simulation is really about business drivers for improved competitiveness: increase innovation, increase performance, improve quality and risk management, reduce time and reduce cost. However, achieving those benefits, and the associated growth of the engineering simulation market, is tempered by the lack of available expertise. A simulation revolution needs to occur, one that will bring a whole new set of opportunities and challenges.
Engineering simulation is a major key to all five business drivers, providing better understanding of product and process behavior, variability and risk. However, engineering simulation software is still typically used only by expert analysts, and we need to expand its usage to a broader audience.
The goal is not to democratize simulation for the sake of broader use of simulation, but to enable more informed design decisions throughout the entire design process by leveraging simulation early and often.
Panthaki: Progress at various companies has been slow. Some pick it up quite easily and run with it, and others are slow to adopt. I’ve even seen cases where companies begin to pick this up, but then the champions in the company who were moving things along internally leave, and the project dies on the vine. It has to do with culture. To do it well, there has to be a certain level of effort to prove it out, and keep things growing internally.
By definition, you must automate your simulations. That’s the only way this can be done. If you expect non-experts are going to use your current set of tools, it simply won’t happen. In today’s environment, even those tools that allow you to create such automation require a lot of programming and scripting. That increases the cost of implementing democratization, and makes it challenging to maintain.
Automation doesn’t mean that someone has to be able to run the simulation. It can be software that runs the automated simulation. There are strategic initiatives like simulating digital twins or additive manufacturing processes, and the hundreds of thousands of simulations required cannot be performed manually. That’s a relatively new development that is pushing simulation automation. So we’re at the point now where automation is essential for all sorts of reasons, not just democratization.
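To make that concrete, here is a minimal sketch of software-driven simulation automation: a parameter sweep that launches an unattended batch solver run for each design point, with no human in the loop. The `solver` executable and its flags are hypothetical stand-ins, not any particular vendor's interface.

```python
import itertools
import subprocess
from pathlib import Path

# Hypothetical parameter sweep: each combination becomes one unattended
# solver run. At digital-twin or additive-manufacturing scale, loops
# like this replace any human clicking through a GUI.
thicknesses_mm = [1.0, 1.5, 2.0]
loads_n = [500, 1000, 2000]

for thickness, load in itertools.product(thicknesses_mm, loads_n):
    case_dir = Path(f"runs/t{thickness}_F{load}")
    case_dir.mkdir(parents=True, exist_ok=True)

    # "solver" stands in for any batch-capable simulation executable;
    # real tools take input decks or scripts rather than these flags.
    result = subprocess.run(
        ["solver", "--thickness", str(thickness),
         "--load", str(load), "--out", str(case_dir / "result.dat")],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(f"case {case_dir.name} failed: {result.stderr.strip()}")
```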
Smith: This is about more than just putting the tools closer to the designer. You have to get the designers to use the tools. It’s a corporate or cultural shift that has to occur to make these departments come together, get the CAD designers trained in using the tools, and then figure out how much analysis they will do before sending their designs up to engineering.
DE: How could this process be accelerated?
Ladzinski: Regardless of the task, most of us hesitate to take the road less traveled, as we want to avoid pitfalls and the general unknown. While NAFEMS has identified democratization as a key theme at events like CAASE20, we realize the importance of working with other organizations in the community, like Revolution in Simulation, a nonprofit initiative that disseminates helpful information by offering numerous freely available reference materials on how other organizations have navigated the democratization waters. Another organization addressing the topic of democratization is ASSESS, which has a working group dedicated to the Democratization of Engineering Simulation (DoES).
DE: What role could cloud-based simulation play in this process?
Smith: There are benefits the cloud can bring, and some limitations. It takes a lot of compute power to run these simulations, and the cloud can bring that compute power and make it more easily and cheaply available.
The problem is that the applications don’t lend themselves to the way most cloud systems are designed. If you look at CFD (computational fluid dynamics), those nodes have to be pretty close together. The vendors don’t have a concept of the location of each system that is being assigned to run a problem.
To get any kind of performance, the machines have to be on the same high-performance network switch. Right now that’s just not built into the way these jobs are scheduled. They’ll run them on 100 random machines around the data center.
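One partial remedy does exist today, sketched below with AWS’s boto3 library: a “cluster” placement group asks the provider to co-locate instances on the same low-latency network segment before the job is scheduled onto them. The AMI ID, instance type and node count here are illustrative only.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A "cluster" placement group asks EC2 to pack the instances into the
# same low-latency network segment -- the tightly coupled MPI traffic
# of a CFD job is the canonical use case.
ec2.create_placement_group(GroupName="cfd-cluster", Strategy="cluster")

# Launch the solver nodes inside that group instead of letting the
# scheduler spread them across the data center.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # illustrative AMI ID
    InstanceType="c5n.18xlarge",       # network-optimized HPC instance
    MinCount=16,
    MaxCount=16,
    Placement={"GroupName": "cfd-cluster"},
)
```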
DE: What do you see as the key challenges in making simulation more accessible in the design process?
Ladzinski: While there are several identifiable challenges, the two that stand out to me the most are:
1. Safe, reliable and robust automation templates based on expert knowledge capture and reuse. These take time to build, which is why it’s important to identify the right models and a well-defined process to democratize in order to realize a return on the investment.
2. Cultural and organizational challenges.
Panthaki: We still don’t have enough success stories. That’s where the Revolution in Simulation initiative comes in. It is essential to bring information and the community together. With a lot of vendors in the fray and information spread out across various sources, it becomes very difficult to find what you need.
We also have to find better techniques for automating simulations. If we continue to rely on large amounts of programming and scripting, this will not be adopted.
Walsh: Those challenges are:
1. Significantly reducing the required level of expertise to do effective and appropriate simulation;
2. Understanding the concept of appropriateness of a simulation rather than numerical accuracy; and
3. Overcoming organizational silos and fiefdoms.
DE: What changes could simulation tool vendors make that would support this concept?
Walsh: Making existing tools easier to use will only make a small dent in this problem. The software vendors need to make the tools “smarter” by enabling knowledge capture and embedding artificial intelligence, along with finding ways to remove complexity and make the simulations invisible.
Panthaki: Companies use a large number of different tools, and each vendor has its own methodologies when it comes to automation. These techniques focus on the particular simulation tools that the vendor offers to the market. There is a missing layer that is vendor- and tool-independent.
Standards have not caught up with the need to classify data in a standardized way, so we’re facing this “Tower of Babel” problem, with every tool using its own language. That missing layer would also have to include tool-independent data management. That is a major hurdle from a vendor perspective.
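As one illustration of what such a tool-independent layer might look like, here is a minimal sketch of a vendor-neutral metadata record that classifies a simulation without depending on any tool’s native format. The schema and field names are hypothetical, not an existing standard.

```python
from dataclasses import dataclass, field, asdict
import json

# A hypothetical tool-neutral record: it classifies a simulation
# (discipline, inputs, provenance) without depending on any one
# vendor's file format. Field names are illustrative, not a standard.
@dataclass
class SimulationRecord:
    discipline: str                  # e.g. "structural", "CFD", "thermal"
    analysis_type: str               # e.g. "static", "modal", "transient"
    tool: str                        # native tool that produced the run
    native_model: str                # path/URI to the tool-specific model
    inputs: dict = field(default_factory=dict)
    results_uri: str = ""

record = SimulationRecord(
    discipline="structural",
    analysis_type="static",
    tool="VendorA-FEA",
    native_model="models/bracket.vnda",
    inputs={"load_N": 1000, "material": "Al-6061"},
    results_uri="results/bracket_run_042.h5",
)

# Serialized neutrally, the same record can index runs from any tool.
print(json.dumps(asdict(record), indent=2))
```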
DE: What about at the educational level—could engineering schools make changes in curriculum that would be helpful for future engineers and analysts?
Ladzinski: We are hearing from industry that they would like to see new hires better prepared upon entering the workforce. However, universities are challenged with keeping their curricula current without turning a four-year program into a five-year program.
Some universities are making efforts here; Ohio State University, for example, recently rolled out a certification program dedicated to finite element principles.
Another great example is Cornell University, which developed a free, web-based, hands-on introduction to engineering simulation. Over the last 10 years, NAFEMS has offered over 15 different FEA (finite element analysis) and CFD web-based courses and trained thousands of engineers from all across the globe. For the foreseeable future, it will be programs like these that will build off the existing curricula and ensure new hires are better prepared for the tasks that lie ahead.
Walsh: Concepts and simulation tools should be introduced at the undergraduate level as part of appropriate engineering curricula; the theory needed to derive and develop simulations should come as graduate work.
Smith: The mindset has to change, so at the university level the engineering students should be exposed to these tools and how to use them. There should be a dialog about how much engineering analysis a CAD designer needs to do and what end goal they are trying to achieve; for every company that will be different.
But how much better would it be to get those tools closer to designers, so that the analyses are not only running virtually in real time, but also suggesting changes based on designer inputs?
Panthaki: That is a very important point. The way simulation is taught drives people to either have nothing to do with it, or to jump in feet first and only receive exposure to expert tools. You need experts, but it shouldn’t be viewed as the only way to run simulation.
Within the system you also need a path where these tools are packaged in a form that others, who don’t need to be experts, are exposed to their power. And those who are being trained as experts should be trained less as button-click tool experts, and more as template builders. As a template builder, you are creating recipes that are reusable, and you end up understanding what you are doing much better. You are writing a recipe for doing simulation, rather than using a simulation tool.
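A minimal sketch of such a recipe, with hypothetical names throughout: the expert bakes mesh and solver choices into the template once and declares a validated input envelope, so a non-expert (or another program) supplies only a couple of numbers and cannot step outside it.

```python
# A minimal "recipe": the expert fixes mesh and solver settings and
# declares the safe envelope; the non-expert supplies only two numbers.
SAFE_RANGES = {"thickness_mm": (0.5, 5.0), "load_n": (0, 10_000)}
EXPERT_DEFAULTS = {"mesh_size_mm": 0.25, "solver": "static-linear"}

def run_bracket_study(thickness_mm: float, load_n: float) -> dict:
    """Run the captured bracket analysis; reject out-of-envelope inputs."""
    for name, value in [("thickness_mm", thickness_mm), ("load_n", load_n)]:
        lo, hi = SAFE_RANGES[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name}={value} outside validated range [{lo}, {hi}]")

    # Every expert decision is baked in here once; a real recipe would
    # hand this job specification to the solver and return its results.
    job = {**EXPERT_DEFAULTS, "thickness_mm": thickness_mm, "load_n": load_n}
    return job

print(run_bracket_study(2.0, 1500))
```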
About the Author
Brian Albright is the editorial director of Digital Engineering. Contact him at [email protected].