September 14, 2018
The increasingly complex engineering required to keep up with the demand for products with Internet of Things (IoT) connectivity, topology-optimized lightweighting, additive manufacturing alternatives, and/or digital twin initiatives is mind-boggling. Even without such cutting-edge projects, the ever-present pressure to design more product iterations and perform more systems engineering in less time means design engineering teams need to take advantage of every efficiency.
Simulation-led design promises to provide that efficiency by allowing engineers to explore concepts before detailed design work, perform more virtual tests in place of time-consuming and expensive physical tests, and even evaluate how a design change will affect processes further down the line, such as in manufacturing and maintenance and repair operations. However, the benefits of upfront simulation are often countered by two significant drawbacks: (1) the need for simulation analysis expertise and (2) the time it takes simulation jobs to run.
Simulation software vendors have made great strides in flattening the learning curve required to make simulation an integral part of the design workflow. More and more CAD experts are able to use simulation tools, relieving the traditional bottleneck in which relatively few simulation experts analyze models from a large pool of CAD experts. This democratization of simulation allows a more streamlined workflow in which analysts spend their time on more complicated issues and/or models that have already been vetted by design engineers.
Still, as more engineers realize the benefits of upfront simulation, they may be left with a bad first impression if their workstations aren’t properly configured. It’s hard to understand how simulation can boost productivity when your computer bogs down overnight with a simulation run. To overcome this obstacle, design engineers need to understand how to configure their workstations for simulation and when even more powerful on-demand computing resources may be needed.
The Shifting Simulation Landscape
Today’s professional workstations, when properly configured, are well equipped to handle many types of simulation jobs. Performing simulation directly on the workstation used for design is more convenient. It also frees up centralized resources, such as a shared cluster or server, for the most computationally demanding simulations. And while on-demand access to incredible computing power is now readily available in the cloud, the cost and procedures involved in accessing cloud computing are not warranted for every situation.
Properly configured workstations hit the simulation sweet spot for many design engineers, who value interactive simulations to quickly see how changes to a CAD model affect thermal, stress or aerodynamic properties, and who occasionally need to solve more computationally intensive simulation problems. More intensive simulations, such as those involving more degrees of freedom, larger and more complex assemblies, and/or multiple types of physics, have traditionally benefited more from additional CPU cores, while interactive simulations benefit from more powerful GPUs.
With NVIDIA’s recently released Quadro GV100 GPU, however, the traditional computing paradigms are shifting. Based on its Volta architecture, known for high-performance computing applications, the GV100 combines compute and graphics capabilities into one GPU designed for workstations. The GV100 was built to address next-generation workflow bottlenecks that can slow real-time ray tracing for photorealistic rendering, deep learning training, immersive virtual reality, and high-fidelity simulation.
Another prime example of how GPU computing is shaking up engineering simulation can be found in ANSYS Discovery Live. The new software uses NVIDIA’s CUDA infrastructure for massively parallel computing to unlock the power of NVIDIA GPUs and conduct simulations in real time. Engineers doing up-front simulation can use it to pose what-if questions, explore design scenarios, and get immediate feedback through instantly updated simulations. ANSYS Discovery Live only requires a dedicated NVIDIA mobile or desktop GPU with at least 4GB of memory and the latest graphics driver, but a Quadro P5000, P6000 or GP100 is recommended for the best visualization performance. More complex models, especially, will benefit from more powerful GPUs because ANSYS Discovery Live scales simulation fidelity based on the available GPU memory.
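To get a feel for why GPU memory drives fidelity, a back-of-envelope estimate helps. The sketch below is purely illustrative: the 64-bytes-per-cell figure is an assumption, not an ANSYS specification, and the code simply computes the largest cubic voxel grid that would fit in a given amount of GPU memory.

```python
# Rough estimate of how GPU memory could bound the resolution of a
# voxel-based real-time solver. The bytes-per-cell constant is an
# assumption for illustration only, not a figure from ANSYS.

def max_grid_resolution(gpu_mem_gb: float, bytes_per_cell: int = 64) -> int:
    """Return the edge length of the largest cubic grid that fits."""
    total_bytes = gpu_mem_gb * 1024**3
    max_cells = total_bytes // bytes_per_cell
    return int(max_cells ** (1 / 3))

for mem_gb in (4, 8, 16, 32):
    n = max_grid_resolution(mem_gb)
    print(f"{mem_gb:>2} GB of GPU memory -> roughly {n}^3 = {n**3:,} cells")
```

Note that doubling GPU memory raises resolution by only about 26% per axis (the cube root of two), which is one reason high-memory cards pay off on complex models.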
Getting Specific with Specs
Despite the ever-advancing state of the art in hardware and software, there are still tried-and-true methods you can use to configure a workstation for simulation.
1. Configure your hardware to match the current and future capabilities of your software.
The most impressive hardware specs don’t mean much if the software you’re using cannot take advantage of them now, or won’t be able to soon.
As hardware capabilities advance, software vendors release updates to capitalize on those advances. For example, vendors are constantly updating their software to more efficiently divide tasks among cores, so if you haven’t upgraded your simulation software in a while, you won’t get all the performance out of newer hardware. In benchmark testing, DE found that moving to the latest generation of hardware and software provided a 4X to 9X increase in speed when running certain simulation jobs in Altair OptiStruct, ANSYS, Autodesk CFD, COMSOL Multiphysics, and Siemens Simcenter simulation software packages.
Scalability is an important factor when determining which CPU and GPU environment is right for your workstation. Consult with hardware and software vendors to see how your software scales across cores and multiple processors. There is typically a performance plateau that will help you determine the processor you need. When looking ahead, ask your simulation software provider for a roadmap of features they plan to release. Configure your workstation to take advantage of the software you will have, not just the version you’re using today.
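If you want a hands-on feel for scaling behavior before committing to a processor, a simple probe can illustrate the plateau. The sketch below is only a stand-in: the busy-loop workload is hypothetical, and your simulation software’s own benchmark mode is the authoritative measure of how it scales.

```python
# Minimal scaling probe: time the same fixed workload at increasing
# worker counts and watch where the speedup flattens out. The dummy
# busy-loop stands in for a real solver job.
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n: int) -> float:
    total = 0.0
    for i in range(1, n):
        total += i ** 0.5
    return total

TASKS = [2_000_000] * 16  # one fixed workload, split into 16 chunks

if __name__ == "__main__":
    baseline = None
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(busy_work, TASKS))
        elapsed = time.perf_counter() - start
        baseline = baseline or elapsed
        print(f"{workers} workers: {elapsed:5.2f}s "
              f"(speedup {baseline / elapsed:.1f}x)")
```

On most machines the speedup curve bends well before the core count runs out, which is exactly the plateau to look for in vendor scaling data.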
Of course, it should go without saying that your hardware should be certified for the simulation software you plan to use. Certification solves many hardware-software incompatibilities before you ever see them.
2. Buy more RAM.
One of the constants in workstation configuration is that more RAM makes everything run more smoothly. Simulation is no exception. High-fidelity simulations need a lot of RAM. Without enough, the solver may start using the hard drive as overflow memory (known as swapping or paging), which drastically reduces the speed of the simulation and can even crash the solver. There is, theoretically, no limit to how much RAM a simulation can consume. It all depends on how many elements are being simulated, and at what fidelity.
Let your typical workload and simulation software recommendations be your guide to RAM, but remember that workstations are multi-tasking marvels. When configuring a workstation for one task, you need to take into account all the other tasks that will be running in the background. That’s why the rule of thumb for memory is to invest in as much RAM as you can afford.
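To frame that sizing conversation, a crude estimate can help. The numbers below are a sketch built on an assumed per-degree-of-freedom memory cost, loosely in line with sparse direct solvers; your vendor’s published sizing guidance should take precedence.

```python
# Crude memory sizing for an implicit FEA job. The kilobytes-per-DOF
# constant is a ballpark assumption, not a vendor figure.

def estimate_ram_gb(num_nodes: int, dof_per_node: int = 3,
                    kb_per_dof: float = 10.0) -> float:
    """Very rough solver-memory estimate in GB."""
    total_dof = num_nodes * dof_per_node
    return total_dof * kb_per_dof / 1024**2  # KB -> GB

for nodes in (250_000, 1_000_000, 5_000_000):
    print(f"{nodes:>9,} nodes -> ~{estimate_ram_gb(nodes):.1f} GB "
          "(before the OS and background apps take their share)")
```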
Memory bottlenecks can occur not just because you don’t have enough RAM, but also because the RAM you have isn’t accessible quickly enough. Modern workstation CPUs read memory over multiple channels, and only populated slots put those channels to work. So choose a workstation with more memory slots and have RAM installed in as many of those slots as possible: eight 8GB DIMMs are better than four 16GB DIMMs, even though both total 64GB.
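The arithmetic behind that advice is simple: peak memory bandwidth scales with the number of channels you populate. The snippet below runs the numbers for an assumed DDR4-2666 system with 64-bit channels and one DIMM per channel; substitute your workstation’s actual memory speed and channel count.

```python
# Peak memory bandwidth scales with populated channels. DDR4-2666
# figures are assumptions for illustration; check your system specs.

MT_PER_S = 2666          # DDR4-2666 transfer rate (megatransfers/s)
BYTES_PER_TRANSFER = 8   # 64-bit channel width

def peak_bandwidth_gbs(channels: int) -> float:
    """Theoretical peak bandwidth in GB/s, one DIMM per channel."""
    return MT_PER_S * 1e6 * BYTES_PER_TRANSFER * channels / 1e9

for channels in (2, 4, 8):
    print(f"{channels} channels populated: "
          f"~{peak_bandwidth_gbs(channels):.0f} GB/s peak")
```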
Lastly, investigate how RAM and processors are related. Some types of RAM, such as error-correcting code (ECC) RAM, and some RAM caching technologies work only with specific CPUs. Don’t invest in more expensive RAM that your processor won’t support.
3. Storage Goes Beyond SSDs.
Solid-state drives (SSDs) have become the standard drives in workstations for good reason. They’re faster than traditional spinning hard drives, and their costs have fallen enough to bring them in line with traditional drives. Capacities have increased as well.
However, simulation file sizes grow with model complexity and fidelity. When working with simulations, you often save multiple simulation outputs to study and compare them. As such, if you’re working with large simulation files, your storage needs will be different from those of occasional simulation users. Your IT environment will also affect workstation storage choices, given policies for network storage and the frequency with which you check files in and out.
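A quick estimate of how those outputs accumulate can guide the local storage spec. The figures below are placeholders, not recommendations; plug in your own average result-file size and retention habits.

```python
# Back-of-envelope storage estimate for retained simulation results.
# All three inputs are hypothetical placeholders.

runs_per_study = 25      # design iterations saved per study
gb_per_result = 8.0      # assumed average result-file size in GB
studies_retained = 6     # studies kept on local disk at once

local_need_gb = runs_per_study * gb_per_result * studies_retained
print(f"Estimated local result storage: {local_need_gb:,.0f} GB")
```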
Thankfully, workstation manufacturers understand flexible storage is key. For example, the Dell Precision 7920 workstation has a FlexBay design that accommodates up to 10 2.5- or 3.5-inch SATA/SAS drives, or as many as four M.2 or U.2 PCIe NVMe SSDs. With the hot-swap feature on the M.2 and U.2 PCIe NVMe SSDs, you can remove drives without shutting down the workstation.
Ignore Feature Fatigue
With all of the simulation software and workstation options available, it’s easy to slip into an “if it ain’t broke, don’t fix it” mindset. That path leads to a dead end: you’re being lapped by other engineers who are taking advantage of the latest improvements while you ignore the latest software update for fear that your old hardware will balk at it.
The latest hardware and software enable greater engineering productivity by letting valuable employees get more done in less time. That extra time can pay dividends when used to explore new ideas or simulate that extra variable that yields surprising results. Simulation is a key technology in today’s fast-paced product design and development environment; make sure your workstation is up to the task.
For in-depth recommendations, download the e-guide: “Selecting the Right Workstation for Simulation.”