Are You Ready for Extended Reality (XR)?

Identifying the right use case is key to getting the most out of enterprise augmented reality.


Image courtesy of Varjo and Volvo.


In Volvo’s research facilities in Sweden, test drivers are becoming accustomed to operating future car models that exist only as pixels. The car on the road is a physical vehicle, but what the drivers see is something else. Outfitted with Varjo’s XR-1 augmented reality (AR) device, they see the interior of a car not yet manufactured.

“With this approach, we can, for the purpose of evaluation, use different virtual display options of the dashboard to see how drivers perceive them while driving the car. So, wearing the XR-1 headset, the driver 'sees' the virtual dashboard in the car, which in reality does not yet exist,” Volvo’s press office explained in an email to Digital Engineering.

It’s just one example of how AR fuses physical and virtual products, and enables design review and testing that is otherwise impossible. Ray-traced, rendered videos give you impressive visuals. Software-based simulation helps you figure out how a product might fail. But AR lets you experience a virtual product as though it were physically present. From automotive and aerospace to consumer goods, many manufacturers are looking at extended reality as the new frontier in product development.

But enterprise AR adoption is not plug-and-play. Without adequate preparation, projects can easily go awry. For this article, we spoke to people who have been through the journey to understand what it takes.

Driving Pixels

Headquartered in Helsinki, Varjo launched its first virtual reality headset called VR-1 in January this year, calling it “the first human-eye resolution VR.” In its announcement, the company wrote that VR-1 has “a resolution of more than 60 pixels per degree, which is 20X+ higher than any other VR headset currently on the market. VR-1 also comes with the world’s most advanced integrated eye tracking, enabling high-precision analytics and interaction with human-eye resolution VR content.” In May, the company launched its first AR headset, called XR-1 Developer Edition, using the term XR for extended reality.

The eye-tracking technology in the VR-1 and XR-1 lets users point with their gaze, selecting whatever they choose to focus on. Because no controllers are needed to select and execute commands, the user’s hands stay free for other tasks. In driving simulation, this is particularly important: the test driver needs both hands to control the steering wheel.

“The highly accurate eye-tracking technology embedded inside the XR-1 makes it easy to assess how drivers use a new functionality and whether they are distracted in any way while driving,” said Volvo’s press office. “This technology-based approach to measuring distraction levels ensures that Volvo Cars can develop new features without causing additional distraction. Therefore, wearing the XR-1 headset while actually driving gives us real insights, which we can take into the development of our cars.”

A test driver for Volvo wears a Varjo XR-1 AR headset to see the virtual dashboard of a new model under development. Image courtesy of Varjo and Volvo.

Volvo is not just a customer of Varjo, but also an investor. The Volvo Cars Tech Fund is supporting the startup headset maker’s ongoing developments. “For using mixed reality (MR) in product development most optimally, it should be integrated into existing workflows to enhance and improve existing systems rather than creating completely new ones from scratch,” advised Volvo.

Props Make a Difference

While Elizabeth Baron was working as the immersive reality technical specialist for Ford, she oversaw the creation of a VR setup that let engineers perform surface highlight reviews on vehicles that had not yet been built. To replicate the way automotive engineers would shine a light on a car to observe the reflections, she assembled suitable VR headsets with position tracking.

To make the VR session much more realistic, Baron made one significant tweak. “I went to the dollar store nearby, bought a bunch of 12-in. flashlights, and tracked them,” she recalls. “In most cases, it’s advantageous to have a physical object that represents the real device the user would naturally be using for the task.”

In shape, proportion and function, the cheap flashlights were much closer to the real equipment the engineers would use for their routine surface highlight tests. The low-cost modification made the VR setup so convincing that, when a session ended, users often tried to switch off the dummy flashlights, which had never been on in the first place. “We always got a kick out of watching them do that,” Baron says with a chuckle.

Baron left Ford at the beginning of 2019 to start her own firm, Immersionary Enterprises, and recently began a collaboration with Silverdraft Supercomputing. “I want to work with enterprise clients on their XR journey, to work on both their culture and technology to enable better collaboration,” she says.

The New Medium for Collaboration

If you peel away the glossy visuals in many multi-user MR applications, you’ll often find the all-too-familiar features of WebEx, Skype and FaceTime underneath. At its core, the NVIDIA Holodeck is nothing but a massive group-chat environment with ray-traced visuals. Certainly, inside the Holodeck, participants get a much deeper understanding of the design, engineering and manufacturing issues they face, because they can inspect a life-size digital replica of the product as though they were standing in front of it. But the tools for voice communication, text messaging and screenshot snapping are nearly identical to those found in Skype or Facebook Messenger. This is both good and bad.

It’s good because the similarities allow users to ease into the new medium without significant culture shock. It’s bad because those same similarities may lead users to overlook the fundamentally different ways an XR workflow needs to be supported.

It’s fairly straightforward to set up and support a conventional multi-user collaboration system with a back-end mechanism that automatically captures and archives sessions on a public or private cloud. Setting up and supporting a similar workflow in AR, VR or MR, however, has different storage requirements, due to the large amounts of 3D data and photorealistic video streams involved.
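To see why the storage footprints diverge, a rough back-of-the-envelope estimate helps. The short Python sketch below compares a conventional screen-share recording with a single-user XR session capture; all of the bitrates and asset sizes in it are illustrative assumptions, not figures from Varjo, NVIDIA or any other vendor mentioned in this article.

# Back-of-the-envelope estimate of per-hour archive size for a conventional
# screen-share recording vs. an immersive XR session capture.
# All bitrates and asset sizes are illustrative assumptions, not measured figures.

def gigabytes_per_hour(bitrate_mbps: float) -> float:
    """Convert a stream bitrate in Mbit/s to gigabytes per hour."""
    return bitrate_mbps / 8 * 3600 / 1000  # Mbit/s -> MB/s -> MB/hour -> GB/hour

# Assumed 1080p screen-share recording, typical of a web-conference archive.
conventional_gb = gigabytes_per_hour(bitrate_mbps=2.5)

# Assumed per-eye photorealistic video stream for one XR participant,
# plus a snapshot of the 3D scene data (CAD-derived meshes, textures).
xr_video_gb = 2 * gigabytes_per_hour(bitrate_mbps=50)  # two eyes
xr_scene_gb = 5.0                                      # assumed 3D asset snapshot, in GB

print(f"Conventional call archive: ~{conventional_gb:.1f} GB per hour")
print(f"Single-user XR session:    ~{xr_video_gb + xr_scene_gb:.1f} GB per hour")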

Baron designed a global immersive collaboration paradigm in 2012, the same year she conducted the first immersive review between Michigan and Australia. “I saved a lot of XR discussions to record what happened in the meeting, thinking people would go back to review them, but nobody did,” reveals Baron. “So we learned that saving a summary of what was learned during the session is a better strategy.”

Identifying the Right Use Case

Realtors like to quip, “There are three important aspects to buying and selling properties: Location, location and location.” Ask David Nedohin, president and cofounder of Scope AR, about AR adoption and you’ll get a similar response.

“The three most important aspects to successful enterprise AR engagement are the right use case, use case and use case,” he says. “Companies have to figure out the use cases that make the most sense. After that, then they can align the necessary workflow with the current technology available and figure out how to support that.”

Epson is among the device makers betting on remote expert assistance as an area for AR application. It recently launched the Epson Moverio Assist on-demand service. Image courtesy of Epson.

One general use case Scope AR is betting on is remote assistance. To enable it, the company offers an integrated AR content-authoring platform called WorkLink. The platform allows organizations to create and publish AR-powered work instructions.

“The software facilitates real-time remote assistance video calls between a technician and a remote expert. While on a live video call, the expert can see the real-world view of a colleague on the shop floor, for example, and walk him or her through a repair or maintenance procedure by annotating the view with animated, 3D digital content or by dropping in a set of pre-built AR instructions for the technician to follow,” explains Nedohin.
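Under the hood, this kind of workflow amounts to streaming the technician’s camera view in one direction and sending spatially anchored annotation events in the other. The sketch below is not Scope AR’s WorkLink implementation; it is a hypothetical, minimal data model in Python showing what an annotation event sent from the expert to the technician’s headset might carry.

# Hypothetical data model for a remote-assistance annotation event.
# This is an illustrative sketch, not Scope AR's WorkLink protocol;
# every field name here is an assumption.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Annotation:
    session_id: str
    author: str        # "expert" or "technician"
    kind: str          # e.g., "arrow", "highlight", "3d_model"
    anchor: tuple      # (x, y, z) position in the technician's spatial map, in meters
    payload: str       # e.g., a model ID or a short text label
    timestamp: float

def to_wire(a: Annotation) -> bytes:
    """Serialize an annotation for whatever transport carries the live call."""
    return json.dumps(asdict(a)).encode("utf-8")

def from_wire(data: bytes) -> Annotation:
    """Rebuild the annotation on the headset side so it can be rendered in place."""
    return Annotation(**json.loads(data.decode("utf-8")))

# Example: the expert drops an arrow next to a fitting in the technician's view.
note = Annotation("call-001", "expert", "arrow", (0.42, 1.10, 0.75),
                  "check the torque on this fitting", time.time())
assert from_wire(to_wire(note)).payload == note.payload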

Although Scope AR’s software remains hardware agnostic, Nedohin believes certain AR gear offers clear advantages over others. “To me, without a doubt, the Microsoft HoloLens 2 with its computer vision is one of the best devices for our applications,” he notes. “Aside from camera and processor power, it largely comes down to the use of top-of-the-line computer vision technology to map out the natural environment for virtual object placement.”

An Easy Entry to AR

Hardware makers such as Lenovo and Epson are also gunning for remote expert assistance as a low-barrier entry point to AR. At the recent Augmented World Expo (AWE, Santa Clara, CA, in late May), Lenovo launched its first enterprise-targeted AR glasses, called ThinkReality A6. To attract application developers, the company also released the ThinkReality software platform, which includes sample AR/VR applications; one of them is a remote expert communication app.

Around the same time, Epson launched Moverio Assist, a System-as-a-Service product to set up and deliver remote expert assistance via Moverio AR glasses (specifically for Moverio BT-300, Moverio BT-350 and Moverio BT-350 A). Users buy the supported AR glasses and supply their own expertise, but Epson provides the cloud-hosted communication pipeline to let the field technician and the expert connect, troubleshoot and share files.

“For remote assistance, you just need a set of basic features: front- and back-facing cameras, two-way audio and file sharing. We see this as an easy onramp to get companies up and running in AR. It’s self-service, no onboarding process,” explains Leon Laroue, technical product manager for Epson Moverio.

With Epson’s Moverio AR glasses, the field technician may occasionally use a portable, smartphone-size pointer to select and open files, but for the most part can work hands free. Compared with clumsily using a smartphone camera to transmit video while working on machinery at the same time, the AR-powered approach is a better alternative.

“We built the Moverio Assist with scale in mind, so it doesn’t matter if you’re a company with five or 5,000 users. It’s ideal for the industries where, if your machines are down, appliances need to be repaired or equipment needs to be installed, downtime is measured in hundreds of thousands of dollars,” says Laroue.

With an AR-based remote expert program, an expert sitting behind a computer can guide a field technician through complex tasks that require a deeper level of knowledge. The approach cuts down on onsite visits and allows the expert to serve more sites and handle more cases.

Hands Free, Gesture- and Geometry-Aware

Over the years, AR and VR gear has improved in form factor, resolution and function. The latest generation is much lighter, making it easier to wear and work in for extended periods. Many devices now include, or are striving to include, hand tracking, gesture recognition and environment awareness.

At February’s Mobile World Congress in Barcelona, where the Microsoft HoloLens 2 debuted, perhaps one of the most groundbreaking moments came when Julia Schwarz, Microsoft’s principal software engineer for HoloLens 2, played a virtual piano by tickling the invisible ivories with her real fingers.

With the HoloLens 2, Microsoft incorporated gaze and air-tap functions. The user’s head gaze serves as the targeting mechanism (the role a mouse pointer plays on a flat screen), and a tap gesture in the air serves as the trigger (the equivalent of a mouse click).
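To make that division of labor concrete, here is a minimal sketch of a gaze-and-commit loop: a head-gaze ray does the targeting on every frame, and an air tap commits the action. It is written in plain Python rather than any HoloLens SDK, and every type and function name in it is hypothetical.

# Minimal gaze-and-commit sketch: the head-gaze ray targets, the air tap triggers.
# Illustrative Python only; not the HoloLens API.

from dataclasses import dataclass

@dataclass
class Target:
    name: str
    center: tuple   # (x, y, z) position in meters
    radius: float   # simple spherical hit volume

def gaze_hit(origin, direction, targets):
    """Return the nearest target whose bounding sphere the gaze ray intersects."""
    best, best_t = None, float("inf")
    for t in targets:
        # Project the vector from the ray origin to the target onto the gaze direction.
        to_center = [c - o for c, o in zip(t.center, origin)]
        along = sum(a * b for a, b in zip(to_center, direction))
        if along < 0:
            continue  # target is behind the user
        closest = [o + along * d for o, d in zip(origin, direction)]
        dist_sq = sum((c - p) ** 2 for c, p in zip(t.center, closest))
        if dist_sq <= t.radius ** 2 and along < best_t:
            best, best_t = t, along
    return best

def on_frame(head_position, head_forward, targets, air_tap_detected):
    """Targeting happens every frame; the action fires only when an air tap arrives."""
    focused = gaze_hit(head_position, head_forward, targets)
    if focused and air_tap_detected:
        print(f"Activated: {focused.name}")
    return focused

# Example: the user looks straight ahead at a virtual control and air-taps.
controls = [Target("dashboard_widget", (0.0, 0.0, 2.0), 0.15)]
on_frame((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), controls, air_tap_detected=True)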

“Tracking technology has blossomed, and today’s headsets give you better pixel density. They’re a lot lighter. They know where you are. They’re much better at anchoring virtual things on real surfaces. When I was at Ford, around 2010 or 2011, I used only mocap,” recalls Baron. In addition to mocap, or motion capture, Baron later integrated more tracking technologies such as SteamVR.

Motion capture lets you record the physical actions of actors and map them onto digital avatars. Although it produces highly realistic movements, it is also costly due to its complex setup and space requirements. Later, Leap Motion’s small motion detector (priced from around $90) became an easy and affordable way to implement hand-gesture recognition. In May, UK-based Ultrahaptics snatched up Leap Motion for $30 million.

Today’s AR and VR gear, with built-in depth cameras, motion sensors and location awareness, makes mocap unnecessary in many cases. The headset’s own awareness of where it is, along with its ability to recognize and track finger joints, fills in the previously missing pieces. This makes it much easier to translate the wearer’s gestures and movements into the virtual world, allowing software developers to add new physical-digital interactions for amusement as well as practical purposes.

“Hand gesture recognition is extremely important for AR-based maintenance in automotive and aerospace engineering. You want the user to be able to get inside an assembly and find wiring harnesses, for example,” says Baron. The headset’s ability to map its physical surroundings is also critical because, “in automotive, sometimes you want to look at a whole different virtual front on an existing car.”

Those who want to develop AR-based design reviews may consider building physical rigs with easily recognizable surfaces onto which virtual objects can be mapped. The physical setup provides the tangible sensation (weight, mass or texture) of the imaginary product, while the view in AR or VR delivers the visual layer.
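One low-cost way to give such a rig an easily recognizable surface is to print fiducial markers on it and track them with a camera, something commercial headsets and engines otherwise handle internally. The sketch below illustrates the idea with OpenCV’s ArUco module; it assumes opencv-contrib-python 4.6.x (the legacy cv2.aruco function API), a webcam pointed at the rig, and placeholder camera intrinsics standing in for a real calibration.

# Track a printed ArUco marker on a physical rig so virtual content can be
# anchored to it. Assumes opencv-contrib-python 4.6.x; the intrinsics and
# marker size below are placeholder values, not from a real calibration.

import cv2
import numpy as np

MARKER_SIZE = 0.10  # printed marker edge length in meters (assumed)

# Placeholder intrinsics; replace with values from your own camera calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# 3D coordinates of the marker corners in the marker's own frame.
half = MARKER_SIZE / 2
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]])

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_pose(frame):
    """Return (rotation vector, translation vector) of the first detected marker, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None
    # The pose of the marker relative to the camera is the transform a rendering
    # engine would use to place the virtual model on the physical rig.
    ok, rvec, tvec = cv2.solvePnP(object_points,
                                  corners[0].reshape(4, 2).astype(np.float64),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None

cap = cv2.VideoCapture(0)  # any webcam pointed at the rig
ret, frame = cap.read()
if ret:
    print("Rig pose:", marker_pose(frame) or "marker not found")
cap.release()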

Avoid Unnatural Interfaces

AR, as the acronym suggests, allows you to augment reality with digital objects. As Baron has learned from her time at Ford, unnatural user interfaces prove to be detrimental to such use cases. “Don’t give someone a game controller, and tell them, hit that button to do X, swipe left to do Y,” she advises. “Avoid user interfaces that don’t work the way people would naturally work in the real world.”

In maintenance and repair exercises, users need to not only learn the correct placements but also build muscle memory—something software application developers often forget. In the virtual world, you may be able to punch through a cluster of pixels to reach for a wire harness in a tight spot. A technician trained to install or repair something in this unrealistic setup is liable to fail when confronted with the laws of physics in reality.

“Often, what you need to do to prepare for AR is not just technical; it’s also cultural,” says Baron. XR can let a mechanical engineer show a designer why certain pillars and wiring harnesses need to be repositioned to avoid collision, but if the company doesn’t have a collaborative culture that encourages mechanical engineers and designers to work together, outfitting them each with a pair of $3,500 HoloLens 2 smart glasses won’t help.



About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
