
Moving into "Intelligent Engineering"

The Center for Advanced Engineering Environments assists researchers, industry and government partners with product creation and testing.

By Debbie Sniderman

 
Virtual reality tools and new methods of interfacing enable groups of people to interact with a model simultaneously in an immersive classroom setting.

The Center for Advanced Engineering Environments (CAEE) at Old Dominion University in Hampton, VA, serves as a pathfinder and focal point for research on advanced learning environments and on collaborative, distributed knowledge discovery and exploitation. The center identifies the direction of aeronautical and space research, and demonstrates and transfers results to engineers and researchers.

CAEE’s advanced visualization equipment ranges from autostereoscopic displays that let users view 3D images without headgear or glasses, to an EON TouchLight display that responds to gestures and multimodal interactions. It also includes an EON IPresence tele-immersion facility, an EON Icatcher 3D stereo projector, and a multi-user, touch- and gesture-activated tabletop display that is debris-tolerant.

Technologies that link the virtual and physical worlds are used to create immersive, interactive 3D virtual worlds with augmented reality to get the most out of visual simulations. When used effectively, says Ahmed Noor, Ph.D., the center’s director and professor of Modeling, Simulation and Visualization Engineering at Old Dominion University, they can automate several activities and significantly enhance an engineer’s or user’s productivity, creativity and innovation. They also enable collaboration among a wide range of people.

“As the trends of distributed collaboration, large-scale integration of computing resources, enterprise tools, facilities and processes continue, a fundamental paradigm shift will occur in virtual product creation,” he adds. “Future high-tech systems will be complex systems-of-systems, developed through just-in-time collaborations of globally distributed teams, linked seamlessly by an infrastructure of networked devices, tools, facilities and processes.”

With this as motivation, Noor says researchers at the center are trying to build an intelligent, adaptive, cyber-physical ecosystem that has intelligent knowledge discovery so product planners and developers can visualize, assemble, test and optimize products and production processes very quickly.

Assisting NASA
Virtual structural test facilities and wind tunnels are just two examples of the virtual facilities the center uses for elaborate structural, material and aerodynamic testing. Simulations of NASA Langley’s Subsonic and Full-Scale Wind Tunnels were built to help NASA understand how simulations could help quickly converge on optimal component and system designs. NASA also wanted to validate the simulations and understand what physical tests could provide that could not easily be obtained from the simulations.

 
Avatars are available to answer a wide variety of questions about the model, the simulation, or procedures, to help users complete virtual testing.

By using simulations of these wind tunnels and major structural test facilities in the virtual world, users can obtain procedural and operational training and complete intelligent design of experiments to run in the facilities. The simulations also help them understand the response quantities that can be measured during testing, the types of materials and sensors required, and their placement on test articles before arriving at the facility. After completing this pre-test work, virtual tests can be run as often as required to understand the response before going to the physical test facility, where testing time is expensive.

CAEE also completed a simulation of a large structural test facility for NASA: the Combined Loads Test System (COLTS). The COLTS simulation allowed a user to run a virtual test, then adjust the experiment, the boundary conditions, or the test article material itself, and re-run the test. Virtual testing with facility simulations is less expensive than performing multiple iterations of physical experiments, according to Noor. Once the simulation system is built, virtual tests are available at the relatively low cost of labor.
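
To make the adjust-and-rerun workflow concrete, here is a minimal sketch of such a loop. It is only an illustration under assumed names and values; the class, function and numbers below are hypothetical and are not CAEE’s or NASA’s actual software.

```python
# A minimal sketch of the adjust-and-rerun loop described above.
# Every name and value here is hypothetical, not CAEE's or NASA's software.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class VirtualTest:
    material: str           # test-article material
    axial_load_kN: float    # boundary condition: applied axial load
    pressure_kPa: float     # boundary condition: internal pressure


def run_virtual_test(test: VirtualTest) -> float:
    """Stand-in for the facility simulation: returns a single made-up
    response quantity (a notional peak strain) for the configuration."""
    stiffness = {"aluminum": 70.0, "composite": 120.0}[test.material]
    return (test.axial_load_kN + 0.1 * test.pressure_kPa) / stiffness


# Run a baseline virtual test, then change the material and a boundary
# condition and re-run, without occupying the physical facility.
baseline = VirtualTest(material="aluminum", axial_load_kN=500.0, pressure_kPa=80.0)
print("baseline response:", run_virtual_test(baseline))

variant = replace(baseline, material="composite", axial_load_kN=650.0)
print("variant response:", run_virtual_test(variant))
```

The point of the sketch is the cost structure Noor describes: once the simulation exists, each additional iteration is just another function call rather than another occupation of the physical facility.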

The Role of Immersive Displays
NASA found that traditional training methods, using courses, lectures and slides, were not exciting for younger audiences, and some older audiences felt intimidated when new technologies were introduced. To address this, CAEE proposed using the most advanced technologies available to engage younger audiences, along with very simple interfaces for older audiences, and demonstrated the concept using a number of simulations and design systems.

The interface of the center’s TouchLight display was designed to be very simple to use, relying on hand gestures, voice commands or a smart mobile device; there is no user manual to read. When a user stands in front of the display containing simulation models, the system lists the possible interactions. Users can ask “what are the models you have now in the system” to display a list of models. Commands such as “show me this model” or “what commands can I use with this model” are understood, and users can change the command words to ones they are more comfortable with, including words in other languages.
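
A rough sketch of how such a user-customizable command vocabulary could be structured appears below. The class, phrases and handlers are illustrative assumptions, not the TouchLight system’s actual interface.

```python
# A minimal sketch of a user-customizable voice-command vocabulary.
# The class, phrases and handlers are illustrative assumptions,
# not the TouchLight display's actual API.
from typing import Callable, Dict


class CommandInterface:
    def __init__(self) -> None:
        self._models = ["wind tunnel", "COLTS test article"]
        # Default phrases mapped to actions; users may rebind any of them.
        self._commands: Dict[str, Callable[[], str]] = {
            "list models": self.list_models,
            "show model": self.show_model,
        }

    def list_models(self) -> str:
        return "Available models: " + ", ".join(self._models)

    def show_model(self) -> str:
        return "Displaying: " + self._models[0]

    def rebind(self, old_phrase: str, new_phrase: str) -> None:
        """Replace a command phrase with one the user prefers,
        including a phrase in another language."""
        self._commands[new_phrase] = self._commands.pop(old_phrase)

    def handle(self, phrase: str) -> str:
        action = self._commands.get(phrase.lower())
        return action() if action else "Try: " + ", ".join(self._commands)


ui = CommandInterface()
print(ui.handle("list models"))          # -> Available models: ...
ui.rebind("show model", "zeige modell")  # rebind to a German phrase
print(ui.handle("zeige modell"))         # -> Displaying: wind tunnel
```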

The center has also developed technology that adds intelligence to simulation and design systems, providing intelligent virtual assistants as advisers. Not only can these intelligent agents perform mundane tasks such as automating model generation, they can also lead a user through solving a problem, or help a technician running the test facility during a physical test.

Eventually, facility technicians or users will be able to use handheld devices to ask a virtual agent for assistance at any point during testing, such as retrieving a specification or querying a manual for operating instructions. The center’s prototypes were developed more than five years ago, well before Siri was introduced with the iPhone 4S.

 
The center has developed tools such as this TouchLight Display for interacting with models and simulations, using brain and gestural commands.

The CAEE has developed intelligent avatars: “cognitive” agents that use novel interaction technologies. Brain-based interfaces that register thoughts from brain signals, multimodal interfaces that include gestures, and facial expression recognition all help users interact naturally with the simulation system. One prototype notices when users look puzzled and offers assistance; Noor says the avatar “feels” when the user needs help. He believes that more simulation systems should include these types of avatars: “They can significantly enhance productivity and engage the user.”

Digital Simulation Complements Physical Testing
Digital and visual simulation tools are being used to accelerate the development of powerful systems, far beyond automatic model generation and self-designing components.

“We are moving from what we call the Information Age to another era, the Intelligence Era, which is essentially the result of advances in artificial intelligence going way beyond expert systems of the past, to artificial general intelligence trying to mimic cognitive characteristics of humans in our systems,” Noor says. “In the Intelligence Era, the convergence of facilities, technologies and devices will have an impact on every engineering field and activity.”

He points out that it is important to have a virtual representation of the entire value chain—the ability to go through the entire process, from concept and product development through operation during the expected lifetime of a product.

“The simulation should also include upgrade or disposal in a way that will not adversely affect the environment,” Noor continues. “We should capture and reuse the multi-disciplinary knowledge of the organization that is developing the system, and consider how to effectively apply this knowledge to other products.”

Digital factories of the future will be able to create much bigger components than the 3D printers of today, he adds: “We will add intelligence to those systems, so they can automatically select the production process and test it in a virtual world, like Second Life.”

But Noor stresses that this doesn’t replace physical testing.
“Physical testing will be reduced, and it will be done in much more intelligent ways than it is today,” he concludes. “Simulations will be used to intelligently design the experiments, and we should think about simulation and physical testing as two components of an integrated system that looks for ‘predictive engineering’ to obtain more knowledge than we do today, by thinking more intelligently about the role that each could play.”

Debbie Sniderman is an engineer, writer and consultant in manufacturing and R&D. Contact her at VIVLLC.com.

 

MORE INFO
aee.odu
aee.odu.edu/facilities_visualization
SpringerLink
EONreality
