The Human Side of Simulation

Incorporating human body models as part of simulation-driven design improves design outcomes, resulting in highly personalized, safer products.

The Human Brain Project employs simulation to replicate the brain and its workings in a virtual world. Simulation takes place at several levels, ranging from the molecular through the subcellular to cellular and up to the whole organ. Image courtesy of The Human Brain Project.


More robust and accessible computing power, coupled with advanced simulation and artificial intelligence (AI), is handing engineers another advantage for digital design practices: The ability to tap into realistic human body models in early design exploration, resulting in better optimized, highly personalized and safer products.

While inserting the human body into simulation isn’t necessarily a new practice, the sophistication and realism of the human body models that help inform design exploration have grown significantly.

Traditionally, engineers could insert a 3D representation of a human body shape—think mannequins in CAD software—into product designs such as airplane cabins or cars to gauge human interaction with the design. This helps determine the impact of crashes or calibrate spacing considerations to bolster the utility and comfort of a design.

That dynamic has rapidly evolved to include full system models and highly personalized models of a particular individual’s human form to take design exploration to the next level, ushering in a new era of human-centered design.

As human models and modeling capabilities become more robust, as well as readily accessible to non-expert engineers, their utility is growing. Human models are being used to ensure comfort and safety of products, whether an interior car cabin or the expanse of an airport terminal. 

The combination of human modeling and simulation is getting a significant amount of traction in various medical applications as a way to create bespoke designs for medical devices like replacement hips or arterial stents. The technology is also being leveraged to create more efficient and safer work environments, particularly mapping out the placement of people and equipment on the plant floor as part of manufacturing process optimization.

“We need to consider the human throughout the whole engineering process—not just the object we’re creating and how people interact with it, but also understanding how to assemble and service that product,” says Ulrich Raschke, director of human simulation products for Siemens Digital Industries Software. “Having human simulation technology in the mix aids in human-centered design, which considers the human from the start so the design inherently accommodates the user.”

Accurate hand models help designers consider user interaction and product safety throughout the design process. Image courtesy of Siemens.

Personalized Medical Applications

For some time, engineers have inserted human shapes into the design process to gauge interactions—for example, experimenting with the position of an athlete on a road bike in simulations to shave time off a downhill race and gain a competitive advantage.

As the models and simulation capabilities evolved, engineers began tapping the technology to gauge the comfort or usability of a design—for instance, to explore the pressure points of a car or airplane seat on a passenger at different points of their body. This delivers a deeper set of useful information to the engineer as they iterate designs. 

Today that process is evolving further, incorporating scanned images of real individuals as the base foundation for a human model that delivers a high degree of accuracy and realism, particularly for determining what happens inside the body as a result of standard interactions. 

“Human body modeling is more powerful today because we can model what’s happening in the body and use it to predict outcomes,” notes Thierry Marchal, global director of the healthcare industry at Ansys. “With an external shape that you just place in a car, you can’t calculate what’s happening in the body—what pressure or loads are applied to the neck or spine.” 

More powerful modeling capabilities can predict the outcomes of specific medical interventions for individual patients, allowing for more personalized treatments. Image courtesy of Ansys.

When leveraging digital design processes to architect medical devices and related products, mimicking what happens inside the body in response to product performance is critical. 

Offerings like Synopsys’ Simpleware human body models provide realistic human anatomical representations to be used in simulations to improve the fit of devices, to mimic the mechanical wear of devices once implanted or to understand the electromagnetic effects of a device on the surrounding anatomy. 

Beyond generic human body models, engineering teams are beginning to make use of scanning technologies and software tools that translate those images into a data format that can be leveraged and manipulated through simulation. As a result, designers can predict how a specific implant or medical device design will perform in a specific patient based on their individual physiology and treatment plan.

At medical device maker Kejako, for example, a 3D parametric full-eye model was developed using COMSOL multiphysics simulation software to provide insights into the root cause of the eye’s degeneration over time as the natural aging process causes farsightedness. The full eye model covers a lot of physics ground, including fluidics related to the aqueous humour, the optical behavior of the lens and cornea material, and the refractive index replicated by modeling muscle ligaments. 
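The optical piece of such a model can be illustrated with the classic lensmaker's equation for a thin lens immersed in a medium. The sketch below is a toy calculation, not part of Kejako's actual model: the radii and refractive indices are illustrative assumptions, chosen only to show how a flatter, stiffer aging lens delivers less optical power—the mechanism behind age-related farsightedness.

```python
def lens_power_diopters(r1_mm, r2_mm, n_lens=1.42, n_medium=1.336):
    """Thin-lens (lensmaker) power, in diopters, of a lens immersed in a medium.

    r1_mm, r2_mm: surface radii of curvature in millimeters (positive if the
    center of curvature lies behind the surface). The refractive indices are
    textbook-style approximations for the crystalline lens and the aqueous/
    vitreous humor, used here purely for illustration.
    """
    r1 = r1_mm / 1000.0  # convert to meters
    r2 = r2_mm / 1000.0
    # 1/f = ((n_l - n_m) / n_m) * (1/R1 - 1/R2); power in the medium = n_m / f
    inv_f = ((n_lens - n_medium) / n_medium) * (1.0 / r1 - 1.0 / r2)
    return n_medium * inv_f

# Illustrative (not clinically measured) radii: an older, stiffer lens
# accommodates less, leaving flatter surfaces and lower optical power.
young = lens_power_diopters(r1_mm=10.0, r2_mm=-6.0)
aged = lens_power_diopters(r1_mm=12.0, r2_mm=-7.0)
print(f"young lens: {young:.1f} D, aged lens: {aged:.1f} D")
```

Even this single-physics slice shows why the full model matters: the real simulation couples these optics to fluid pressure and the mechanics of the lens capsule and ligaments.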

The solution is more than a generic eye model—it also has the potential to provide a personalized treatment plan for individual patients, which is important because everyone has different physiology in addition to experiencing various levels of presbyopia—the condition that causes the deterioration.

“Creating a model of the human eye turned out to be a complicated simulation combining structural mechanics and fluid flow,” explains Bjorn Sjodin, vice president of product management at COMSOL. “It helped the team understand farsightedness more and helps determine what surgery is optimal—LASIK or a corrective lens.”

Medical device makers, researchers and medical professionals are jumping on the use of simulation, and human body models in particular, as the pendulum swings toward more personalized treatments and individualized medicine. In the traditional design process, designers create mockups of products and go through a lengthy trial-and-error design process using benchmark physical tests on prototypes built in silicone rubber or other materials.

Today, much of that same exercise can be performed computationally, minimizing the number of physical tests, reducing reliance on animal testing and speeding up the process of clinical trials. In fact, the FDA has already recognized the public health benefits of modeling and simulation software for enabling in silico clinical trials and, in select cases, supports their use to replace or greatly reduce reliance on human and animal clinical trials.

“Instead of putting a device or stent into 200 or 300 people, you can take a scan of a person or animal, put the implant in a virtual cohort of different sizes and shapes of anatomy and cover a large segment of the population upfront,” says Kristian Debus, vice president and leader of the Life Sciences team at Thornton Tomasetti, an engineering consulting firm. “At the end of the day, we’re trying to reduce the number of patients we use in clinical trial while reducing the time and cost of those trials.”
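The virtual-cohort idea Debus describes can be sketched in a few lines: sample an anatomical parameter across a simulated population and measure how much of that population a given device size covers. Everything below—the normal distribution, its parameters and the sizing range—is an illustrative assumption standing in for statistics that would come from real scan data.

```python
import random

def virtual_cohort_coverage(stent_min_mm, stent_max_mm,
                            mean_mm=4.0, sd_mm=0.5,
                            n_patients=10_000, seed=42):
    """Fraction of a sampled virtual cohort whose vessel diameter falls
    within a stent's labeled sizing range.

    The normal distribution and its parameters are made-up stand-ins for
    anatomy statistics that would be derived from patient imaging.
    """
    rng = random.Random(seed)  # fixed seed so the sketch is repeatable
    fits = sum(
        1 for _ in range(n_patients)
        if stent_min_mm <= rng.gauss(mean_mm, sd_mm) <= stent_max_mm
    )
    return fits / n_patients

coverage = virtual_cohort_coverage(stent_min_mm=3.0, stent_max_mm=5.0)
print(f"virtual cohort coverage: {coverage:.1%}")
```

In a real in silico trial, each sampled anatomy would drive a full device simulation rather than a simple range check, but the population-coverage logic is the same.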

Human organ models like these computational fluid dynamics simulations can be used in the field of respiratory diagnostics or to study delivery of inhalable drugs to the lung. Image courtesy of Thornton Tomasetti Life Sciences.

Dassault Systèmes is so committed to the idea of human modeling, it announced a strategy last February to create a virtual twin experience of the human body to transform medical and wellness treatments. Much like engineers have been able to build virtual twins of giant aircraft or carrier ships using the 3DEXPERIENCE platform, the company is now turning its attention to modeling the entire human body, from DNA to organs, using a combination of modeling, simulation, information intelligence and collaboration.

“We want to understand how the whole human system operates,” says Ales Alajbegovic, vice president, SIMULIA industry process success and services at Dassault Systèmes. “We’ve been modeling very complex products, and we’ve reached the point where we want to go one step further and simulate the most complex machine on the planet, which is us.”

The virtual twin of the human body builds off of Dassault’s Living Heart initiative, a research effort that develops and validates highly accurate and personalized digital human heart models. The heart models are designed to serve as a foundation for cardiovascular in silico medicine, including evaluating and testing pacemaker leads and other cardiovascular devices, and eventually using virtual patients constructed via computational models and simulation to improve the efficiency of clinical trials of new devices.

The Living Heart model, and eventual full human body model, can also be used in product development to provide better insight as to what might happen to a human during a car crash far beyond the insights provided by physical crash dummies or even simulated versions. 

“Crash test dummies look like us on the outside, but they don’t represent us very well on the inside,” notes Steve Levine, senior director of virtual human modeling at Dassault. As vehicle innovation advances in areas like autonomous vehicles, crash scenarios become more complex, and the physical loads on the body and what happens internally are different than in the past, when a uniform seating position was assumed, Levine adds.

Dassault Systèmes is also a partner in the Human Brain Project, which includes the Brain Simulation Platform, a similar digital twin effort that is attempting to model the human brain for research, medical and product development applications.

Using simulation technology and AI, the researchers aim to build virtual models of a brain and brain activity, including patient-specific representations, allowing doctors to mimic and test intervention procedures prior to surgeries to ensure the best outcomes. 

Viktor Jirsa, director of the Institute of Neurosciences and Systems at Aix-Marseille University, likens the concept to a systems modeling exercise with outcomes similar to testing products in a flight simulator.

“When you simulate the flying of an Airbus, you are not just testing the functioning of the engine; you put the Airbus in a flight simulator where there is rain, wind and aerodynamics,” he explains. “That’s what we are doing with the virtual brain. We are taking all the brain signals and physical aspects into account.”

Injecting a Human Element into Manufacturing

Beyond medical use cases, human body models and simulations are also playing a prominent role in process design and manufacturing. Siemens’ Process Simulate Human technology (known as Jack) is incorporated in various platforms, including NX, Teamcenter and Tecnomatix, to provide engineers with insights to improve the ergonomics of product designs or to refine industrial tasks.

The software enables engineers to inject human body models that match their worker population into a digital twin representation of a factory floor or a product design to explore comfort, injury risk, line of sight and fatigue limits early in the lifecycle, when it is less expensive and less time-consuming to make changes, according to Raschke.
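The core of such an ergonomic check can be sketched simply: represent workers at different anthropometric percentiles and test whether a target point falls inside each worker's reach envelope. The numbers below are illustrative placeholders, not figures from Jack or any anthropometric database, and a real tool would model full posture, joint limits and strength rather than a single reach sphere.

```python
import math

# Illustrative anthropometry (meters) for three percentile workers; tools
# like Siemens' Jack draw on published anthropometric survey data instead.
WORKERS = {
    "5th percentile":  {"shoulder_height": 1.25, "arm_reach": 0.64},
    "50th percentile": {"shoulder_height": 1.37, "arm_reach": 0.71},
    "95th percentile": {"shoulder_height": 1.50, "arm_reach": 0.79},
}

def can_reach(worker, target_x, target_z):
    """Crude reach check: is the target inside the sphere the arm sweeps
    from the shoulder? target_x is horizontal distance from the worker,
    target_z is height above the floor (both in meters)."""
    dist = math.hypot(target_x, target_z - worker["shoulder_height"])
    return dist <= worker["arm_reach"]

# A control button 0.65 m in front of the worker at 1.4 m height:
for name, worker in WORKERS.items():
    print(name, "reaches button:", can_reach(worker, 0.65, 1.4))
```

Even this crude screen surfaces the kind of finding human simulation is for: a layout that is comfortable for a 95th-percentile worker can be out of reach for a 5th-percentile one.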

While the technology has been around for a while, improved predictive models and streamlined interfaces make the capabilities more accessible to a wider population of engineers. 

“There’s been an evolution of predictive capabilities and infrastructure to allow users to more easily do analysis than what was possible before,” he explains. “As opposed to having a few experts familiar with the technology, now a larger community of engineers can take advantage.”

Given the realities of the workplace due to the COVID-19 pandemic, human modeling and simulation also have relevance for designing workspaces and production lines with an eye toward employee safety and stopping the virus’ spread. Siemens recently announced a new workplace distancing solution based on its SIMATIC Real Time Locating Systems (RTLS) technology and its Xcelerator portfolio of engineering, operational, industrial Internet of Things and cloud solutions. 

The platform lets companies model employee interactions, including where they reside on a production line, and provide real-time visual feedback if people get too close. There is also an ability to identify “hot spots” so manufacturers can reconfigure layouts to mitigate potential risk scenarios.
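The proximity logic behind such a distancing tool reduces to a pairwise distance check over tracked positions. The sketch below assumes nothing about Siemens' RTLS APIs; the position snapshot and worker IDs are invented for illustration.

```python
import math
from itertools import combinations

def too_close(positions, min_distance_m=2.0):
    """Return pairs of tracked workers closer than a distancing threshold.

    positions: dict mapping worker id -> (x, y) in meters, as might arrive
    from a real-time locating system feed (the data here is made up).
    """
    return [
        (a, b) for (a, pa), (b, pb) in combinations(positions.items(), 2)
        if math.dist(pa, pb) < min_distance_m
    ]

snapshot = {"w1": (0.0, 0.0), "w2": (1.5, 0.0), "w3": (6.0, 4.0)}
print(too_close(snapshot))  # w1 and w2 are only 1.5 m apart
```

Logging these violations over time and binning them by location is one straightforward way to produce the “hot spot” maps the article describes.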

Because of this and other scenarios, interest in human modeling and simulation is likely to remain on the rise. 

“There is a need to understand how contamination is propagated in environments from homes, but especially hospitals, vehicle cabins and airplane cabins,” says Dassault’s Alajbegovic. “The number of requirements we now see in those areas is skyrocketing.”

About the Author

Beth Stackpole

Beth Stackpole is a contributing editor to Digital Engineering. Send e-mail about this article to [email protected].