Toward Ubiquitous CAE
January 22, 2016
Commentary by Wolfgang Gentzsch, The UberCloud
Countless case studies demonstrate the importance of computer-aided engineering (CAE) for engineering insight, product innovation and market competitiveness. So far, however, CAE has remained in the hands of a relatively small, elite crowd, not easily accessible to the large majority of engineers. Despite the ever-increasing complexity of CAE tools, hardware and system components, engineers have never been closer to ubiquitous CAE as a common tool for every engineer. The main reason for this advance is the continuous progress of CAE software tools, which assist enormously in the design, development and optimization of manufactured products. The next chasm on the path toward ubiquitous CAE will soon be crossed by new software container technology that dramatically improves software packageability and portability, eases access and use, and simplifies software maintenance and support. It will finally put CAE into the hands of every engineer.
Container History
“In April 1956, a refitted oil tanker carried fifty-eight shipping containers from Newark to Houston. From that modest beginning, container shipping developed into a huge industry that made the boom in global trade possible,” according to “The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger,” a book by economist Marc Levinson. It tells the dramatic story of the container’s creation, the decade of struggle before it was widely adopted, and the sweeping economic consequences of the sharp fall in transportation costs that containerization brought about. Levinson shows how the container transformed economic geography. “By making shipping so cheap that industry could locate factories far from its customers, the container paved the way for Asia to become the world’s workshop and brought consumers a previously unimaginable variety of low-cost products from around the globe,” he writes.
Whenever I read this story from Levinson's book, I get goose bumps, because of its analogy to today's rapidly emerging software containers and their growing importance for CAE and for the whole software lifecycle: every phase, from design, coding and testing to release, distribution, access and use, through to support and maintenance, and especially for engineers and their applications.
My Own 40 Years in CAE
The last 40 years have seen a continuous struggle of our community with CAE. Let me tell you how I started. In 1976 I took my first job as a computer scientist at the Max Planck Institute for Plasma Physics in Munich, developing my first FORTRAN program for magnetohydrodynamic plasma simulations on a 3-MFLOPS IBM 360/91. Three years later, at the German Aerospace Center (DLR) in Göttingen, I was involved in the benchmarking and acquisition of DLR's first Cray-1S supercomputer, which marked my entry into vector computing. In 1980, my team broke the 50-MFLOPS barrier, a speedup of 20 over DLR's IBM 3081 mainframe, with fluid dynamics simulations of a nonlinear convective flow and a direct Monte Carlo simulation of the von Kármán vortex street. To get to that level of performance, we had to hand-optimize the programs, which took us several troublesome months. That was CAE then.
Ubiquitous Computing and Xerox PARC’s Mark Weiser
When we use the word "ubiquitous," we mean everywhere, omnipresent, pervasive, universal and all-over. Here I'd like to quote Mark Weiser of Xerox PARC, who wrote in 1988:
“Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by lots of people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives.”
Weiser looks at ubiquitous computing through the eyes of engineers and scientists. According to Weiser, these users shouldn't have to care about the "engine" under the hood. All they care about is "driving" safely, reliably and easily: getting in the car, starting the engine, pulling out into traffic, and reaching point B.
Toward Ubiquitous CAE
Now let's translate the driving analogy into ubiquitous CAE. Very much simplified, CAE technology is split into two parts: software and hardware. Both are immensely complex today, and their mutual interaction is highly sophisticated. For CAE to be ubiquitous, Weiser would suggest making it disappear into the background of our (business) lives, from the end user's point of view. Indeed, in the last decade we took a big step toward this goal: We made access to and use of CAE codes relatively easy by developing user-friendly interfaces, with a trend toward what some people call "appification," and we abstracted the application layer from the physical architecture underneath through server virtualization with virtual machines (VMs). This achievement came with great benefits, especially for the IT folks, but for end users too. We can now provision servers faster, enhance security, reduce hardware vendor lock-in, increase uptime, improve disaster recovery, isolate applications, extend the life of older applications and move workloads to the cloud more easily. So, with server virtualization we came quite close to ubiquitous computing.
The Next Step: CAE Software Containers
But server virtualization never really gained a foothold in CAE, especially for highly parallel CAE applications that require low-latency, high-bandwidth inter-process communication. And on multi-tenant servers, VMs competing with each other for hardware resources such as input/output, memory and network often slow down CAE application performance.
Because VMs failed to gain traction in CAE, the challenges of software distribution, administration and maintenance kept CAE systems locked up in closets, available to only a select few. In fact, the U.S. Council on Competitiveness estimates that only about 5% of all engineers use high-performance servers for their CAE simulations; the other 95% just use their workstations.

In 2013, Docker Linux containers saw the light of day. The key practical difference between Docker and VMs is that Docker is a Linux-based system that makes use of a userspace interface to the Linux kernel's containment features. Another difference is that rather than being a self-contained system in its own right, a Docker container shares the Linux kernel with the operating system running on the host machine, and with the other containers running on that machine. These features make Docker containers extremely lightweight and, in principle, well suited for CAE.
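To make this concrete, here is a minimal sketch of how a containerized CAE solver might be launched on a Linux host using the standard docker CLI. The image name, case directory and solver invocation are hypothetical placeholders, not a description of UberCloud's actual packaging; only standard docker options are used.

```python
import subprocess

# Hypothetical image and case directory; substitute your own.
IMAGE = "ubercloud/openfoam:latest"      # placeholder container image
CASE_DIR = "/scratch/cases/motorbike"    # placeholder CAE case on the host

# Standard docker CLI options:
#   --rm         remove the container when the solver exits
#   --net=host   share the host network stack (helps low-latency MPI traffic)
#   -v           bind-mount the case directory into the container
cmd = [
    "docker", "run", "--rm", "--net=host",
    "-v", CASE_DIR + ":/case",
    IMAGE,
    "simpleFoam", "-case", "/case",      # placeholder solver invocation
]

# The container shares the host's Linux kernel, so it starts in seconds,
# with no guest operating system or hypervisor in between.
subprocess.run(cmd, check=True)
```

The same image can run unchanged on a workstation, an on-premise cluster node or a cloud instance, which is exactly the portability argument made below.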
Still, it took us at UberCloud about a year to develop, based on micro-service Docker container technology, its macro-service, production-ready counterpart for CAE, and to enhance and test it with a dozen CAE applications and engineering workflows on about a dozen different single- and multi-node cloud resources. These high-performance, interactive software containers, whether they run on premises or on public or private clouds, bring a number of core benefits to otherwise traditional HPC (high-performance computing) environments, with the goal of making HPC ubiquitous:
1. Packageability: Bundle applications together with libraries and configuration files.
A container image bundles the needed libraries and tools as well as the application code and the configuration required for these components to work together seamlessly. There is no need to install software or tools on the host compute environment, since the ready-to-run container image has all the required components. Problems with library dependencies, version conflicts and configuration disappear, as do the huge replication and duplication efforts in our community when it comes to deploying CAE software.
2. Portability: Build container images once, deploy them rapidly in various infrastructures.
Having a single container image makes it easy for the workload to be rapidly deployed and moved from host to host, between development and production environments, and to other computing facilities. The container allows the end user to select the appropriate environment, such as a public cloud, a private cloud or an on-premise server. There is no need to install new components or perform setup steps when using another host.
3. Accessibility: Bundle tools such as SSH (Secure Shell) into the container for easy access.
The container is set up to provide easy access via tools such as SSH and VNC (virtual network computing) for remote desktop sharing. In addition, containers running on the compute nodes give both end users and administrators a consistent implementation regardless of the underlying compute environment.
4. Usability: Provide familiar user interfaces and user tools with the application.
The container has only the components required to run the application. By eliminating other tools and middleware, the work environment is simplified and usability is improved. The ability to provide a full-featured desktop increases usability further (especially for pre- and post-processing steps) and reduces training needs. In addition, the CAE containers can be used together with a resource manager such as Slurm or Grid Engine, which eliminates many administration tasks; a minimal sketch of such an integration follows this list.
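To illustrate the resource-manager point above, the following sketch shows one plausible way to wrap a containerized solver in a Slurm batch job. The image name, job parameters and solver command are hypothetical, it assumes the compute node can run Docker, and it is not UberCloud's actual integration; only standard sbatch directives and docker options appear.

```python
import subprocess
import textwrap

# Hypothetical names; adjust to your site and container image.
IMAGE = "ubercloud/ansys-fluent"     # placeholder container image
JOB_SCRIPT = "run_cae_container.sh"

# A plain Slurm batch script that starts the containerized solver on the
# allocated node; $SLURM_NTASKS is set by Slurm at run time.
script = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=cae-container
    #SBATCH --nodes=1
    #SBATCH --ntasks=16
    #SBATCH --time=04:00:00

    docker run --rm --net=host \\
        -v "$PWD":/case \\
        {image} \\
        mpirun -np "$SLURM_NTASKS" solver -case /case  # placeholder solver
    """).format(image=IMAGE)

with open(JOB_SCRIPT, "w") as f:
    f.write(script)

# Submit the job; Slurm then schedules the container like any batch workload.
subprocess.run(["sbatch", JOB_SCRIPT], check=True)
```

A Grid Engine site could follow the same pattern, submitting with qsub and that scheduler's own job directives.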
In addition, the lightweight nature of the CAE container suggests a low performance overhead. Our own performance tests with real applications on several multi-host, multi-container systems show no significant overhead for running high-performance workloads in a CAE container.
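For readers who want to check such claims on their own systems, here is a generic measurement sketch; the solver command, case path and image name are placeholders, and the comparison simply times the same run natively and inside a container.

```python
import subprocess
import time

# Placeholder commands: the same benchmark case run natively and in a container.
NATIVE_CMD = ["mpirun", "-np", "16", "solver", "-case", "/scratch/case"]
CONTAINER_CMD = [
    "docker", "run", "--rm", "--net=host",
    "-v", "/scratch/case:/case",
    "ubercloud/solver",                  # placeholder container image
    "mpirun", "-np", "16", "solver", "-case", "/case",
]

def timed(cmd):
    """Run a command and return its wall-clock time in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

t_native = timed(NATIVE_CMD)
t_container = timed(CONTAINER_CMD)
print("native:    {:.1f} s".format(t_native))
print("container: {:.1f} s ({:+.1%} relative)".format(
    t_container, (t_container - t_native) / t_native))
```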
Current State and Conclusions
During the past two years, UberCloud has successfully built CAE containers for software from ANSYS (Fluent, CFX, Icepak, Electromagnetics, Mechanical, LS-DYNA, DesignModeler and Workbench), CD-adapco STAR-CCM+, COMSOL Multiphysics, NICE DCV, Numeca FINE/Marine and FINE/Turbo, OpenFOAM, PSPP, Red Cedar's HEEDS, Scilab, Gromacs and more. These application containers are now running on cloud resources from Advania, Amazon AWS, CPU 24/7, Microsoft Azure, Nephoscale, OzenCloud and others.
Together with recent advances in application software and high-performance hardware technologies, the advent of lightweight, pervasive, packageable, portable, scalable, interactive, easy-to-access CAE containers running seamlessly on workstations, servers and any cloud brings us ever closer to what Intel calls the democratization of high-performance computing, and to the age where CAE "technology recedes into the background of our lives."
This commentary is the opinion of Wolfgang Gentzsch, president and co-founder of The UberCloud. For more information visit theubercloud.com.
More Info:
- UberCloud Application Software Containers
- Download the 2015 UberCloud Compendium of case studies, sponsored by Intel and Desktop Engineering.
- Software Providers – Building Your Own Software-as-a-Service Business in the Cloud