Collaboration Revolution in AR/VR
Immersive technologies challenge established PLM practices.
November 8, 2019
For the conceptual design phase of Collaborative Product Development (CPD), the digital model of the product remains a key facilitator. The process revolves around the ability to conceive ideas in 3D, refine them easily in CAD modeling programs, and share them with remote suppliers and partners to identify and fix manufacturing issues beforehand.
Early on, engineers found that passing large CAD files back and forth via email wasn’t an efficient way to collaborate. As the number of collaborators and iterations increased, so did the number of scattered digital versions of the same product, all uncontrolled, each accumulating changes and revisions that couldn’t be tracked.
The challenge gave rise to lightweight 3D file sharing and viewing technologies, such as Tech Soft 3D’s HOOPS, Actify SpinFire (which uses HOOPS components), Siemens’ JT and others. It also prompted many CAD developers to add lightweight viewing, annotation and collaboration tools to their product offerings. While most remain the ancillary add-ons they were intended to be, some, such as eDrawings from SolidWorks, stand apart as robust products in their own right thanks to ease of use and richness of features. eDrawings was also among the earliest such programs to incorporate augmented reality (AR) features into its mobile version.
Recently, with emerging augmented reality and virtual reality (AR/VR) technologies entering the collaboration field, model visualization and model-based collaboration have entered a new phase.
From Browser to AR/VR
Collaboration’s migration from browsers and 2D monitors to AR/VR goggles and head-mounted displays (HMDs) is ongoing, with lots of room for improvement. Some AR/VR app developers simply transfer the collaboration tools from the browser-centric paradigm into the new medium, resulting in floating menus and palettes that don’t take advantage of the more natural gestures and interactions possible in AR/VR.
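To see how a browser-based viewer crosses over into an immersive session, consider a minimal sketch built on the WebXR Device API, the usual route for handing a web app’s rendering off to a headset. The WebXR calls below follow the public spec; the renderer object is a hypothetical placeholder for the app’s own drawing code.

```typescript
// Minimal sketch: moving a browser-based model viewer into an immersive
// session via the WebXR Device API. The `renderer` object is a hypothetical
// stand-in for the application's own rendering code.
async function enterImmersiveReview(renderer: { render: (viewerPose: unknown) => void }) {
  const xr = (navigator as any).xr; // WebXR entry point (not yet in all TS lib typings)
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.warn("Immersive VR not available; staying in the 2D browser view.");
    return;
  }

  // Request an immersive session instead of rendering to the flat page canvas.
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor", "hand-tracking"],
  });
  const refSpace = await session.requestReferenceSpace("local-floor");

  // Per-frame loop: obtain the viewer pose and let the app draw the digital
  // model at real-world scale around it.
  const onFrame = (_time: number, frame: any) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) renderer.render(pose); // hand off to the app's own renderer
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```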
Nevertheless, the ability for collaborators to virtually stand next to the digital model at real-world scale yields much better insights. In automotive or aerospace design, for example, being able to visualize a new vehicle concept at true scale allows collaborators to conduct aesthetic and ergonomic studies in ways that weren’t possible with 2D monitors. The freedom of movement possible in AR/VR gives engineers a much better understanding of the constraints and appeal of new vehicle concepts.
NVIDIA’s Holodeck, announced at the NVIDIA GPU Technology Conference (GTC) in 2017, is one of the early VR-based multi-user collaboration systems to put this idea to use. With Holodeck, multiple users appear as avatars inside VR, with the freedom of movement to walk around and inspect the digital model as though it were physically present.
The GPU maker describes the Holodeck as “a virtual reality (VR) innovation platform that brings designers, peers and stakeholders together from anywhere in the world to build and explore creations in a highly realistic, collaborative and physically simulated VR environment.” Early adopters include Toyota and NASA, among others.
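NVIDIA has not published the internals of Holodeck’s networking, but the general pattern behind multi-user presence is straightforward: each participant streams head (and hand) poses to the others, and each client positions the corresponding avatars in its own scene. The sketch below is purely illustrative, with an assumed message shape, endpoint URL and updateAvatar() helper.

```typescript
// Illustrative sketch only (not NVIDIA's implementation): keeping remote
// collaborators' avatars in sync by broadcasting head poses over a WebSocket.
// The message shape, server URL and updateAvatar() helper are assumptions.
interface AvatarPose {
  userId: string;
  head: { position: [number, number, number]; rotation: [number, number, number, number] };
  timestamp: number;
}

const socket = new WebSocket("wss://example.com/review-session/42"); // hypothetical endpoint

// Send the local user's pose a few times per second.
function publishPose(pose: AvatarPose): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(pose));
  }
}

// Apply incoming poses to the matching remote avatar in the scene.
socket.onmessage = (event: MessageEvent<string>) => {
  const remote: AvatarPose = JSON.parse(event.data);
  updateAvatar(remote.userId, remote.head); // placeholder for the app's scene update
};

// Hypothetical scene-graph hook provided elsewhere by the application.
declare function updateAvatar(userId: string, head: AvatarPose["head"]): void;
```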
Look Here to Right-Click
Earlier HMDs relied on game controllers, limiting users to interacting with the digital model through pointing and clicking. More recent devices, such as the Microsoft HoloLens 2, include hand tracking, allowing users to employ natural movements (such as gripping objects or using fingers to push virtual buttons) to rotate, handle or manipulate digital objects in AR/VR.
With Varjo’s VR-2, which debuted at the xRS Conference (San Francisco, CA, Oct. 16-18), an event dedicated to extended reality technologies, users can employ their gaze itself as the pointer to activate context-sensitive options and menu items. This frees up their hands for other operations, such as using replica equipment to practice complex repair or maintenance procedures.
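In principle, gaze-driven selection reduces to a ray cast: take the tracked eye position and direction, and pick the nearest part the ray hits. The sketch below is a generic illustration, not Varjo’s SDK; the Component type and the bounding-sphere test are assumptions made for brevity.

```typescript
// Generic gaze-based picking sketch (not Varjo's SDK): cast a ray along the
// tracked gaze direction and return the closest component whose bounding
// sphere it intersects, so both hands stay free for other tasks.
type Vec3 = [number, number, number];

interface Component { id: string; center: Vec3; radius: number; } // assumed shape

function gazePick(origin: Vec3, direction: Vec3, parts: Component[]): Component | null {
  // `direction` is assumed to be a unit vector from the eye tracker.
  let best: Component | null = null;
  let bestT = Infinity;
  for (const p of parts) {
    // Project the vector from the eye to the part center onto the gaze ray.
    const toCenter: Vec3 = [
      p.center[0] - origin[0],
      p.center[1] - origin[1],
      p.center[2] - origin[2],
    ];
    const t = toCenter[0] * direction[0] + toCenter[1] * direction[1] + toCenter[2] * direction[2];
    if (t < 0) continue; // part is behind the viewer

    // Distance from the part center to the closest point on the ray.
    const closest: Vec3 = [
      origin[0] + direction[0] * t,
      origin[1] + direction[1] * t,
      origin[2] + direction[2] * t,
    ];
    const dx = closest[0] - p.center[0];
    const dy = closest[1] - p.center[1];
    const dz = closest[2] - p.center[2];
    if (dx * dx + dy * dy + dz * dz <= p.radius * p.radius && t < bestT) {
      bestT = t;
      best = p; // candidate target for the context-sensitive menu
    }
  }
  return best;
}
```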
All of these developments add to the potential of CPD, where collaborators can observe the natural posture, reach and comfort of test subjects in simulated usage scenarios.
Many VR HMDs rely on a powerful workstation for computing; cord-free HMDs with built-in processors are still the exception rather than the norm. The HoloLens 2 is cord-free, but it can connect to a more powerful host system over Wi-Fi. Because of the need to produce interactive, high-resolution visuals, both the Microsoft HoloLens 2 and Varjo’s VR-2 call for systems with professional VR-ready GPUs.
PLM via VR
Leading PLM vendors such as Siemens, PTC and Dassault Systèmes have begun exploring ways to deliver classic collaboration solutions in AR/VR, with digital twins standing in for real-world products.
Siemens states that its PLM software “Teamcenter is compatible with the latest VR devices, including the HTC Vive, Oculus Rift and the zSpace device, so you can perform form, fit and function studies, and conduct design reviews, by immersing yourself in the virtual world of the product’s digital twin.”
With the AR-focused Vuforia brand in its portfolio, PTC began looking for ways to bring its Windchill PLM tools into AR. In a paper titled “Powering Collaboration with Augmented Reality,” PTC writes, “With a few clicks you can superimpose a 3D representation of your product against a real-world backdrop, such as a factory floor, any time you wish. Changed your design? Just click, republish, and explore it again in context and at scale.”
While PLM tasks such as file management, version control, access control and change order submission will likely remain in desktop applications, AR/VR now gives engineers the ability to immediately visualize or experience the proposed changes in a lifelike environment to make more informed decisions.
Open Questions
Many collaboration tools, such as redlining tools and text-entry interfaces for annotations, are optimized for lightweight model viewers in desktop and web-enabled environments. Duplicating them in AR/VR, as some app developers have done, carries a familiar form factor into the new technology, but it also shortchanges the medium’s rich potential.
What, then, is the best way to indicate a desired change in a digital twin in AR? Or the most efficient way for a group of engineers to divide and conquer a large assembly in VR without overriding one another’s changes? Perhaps more important, what are the functions that were left out of previous collaboration tools due to the limitations of the 2D monitors and browser-based viewers? Is there an opportunity to tackle them in AR/VR? These questions remain unsettled, as app developers have only begun to reinvent the whole collaborative workflow for the new immersive technologies.