QE in the Metaverse
New Deloitte report points to increased simulation and virtualization of product testing
August 24, 2023
Earlier this year, Deloitte released its 2023 Quality Engineering Trends report, covering key factors the organization says are necessary to “adapt to familiar QE challenges in unfamiliar times like spend, innovation, operating in a post-pandemic society, and leveraging new growth opportunities.”
Digital Engineering spoke to Rohit Pereira, principal and quality engineering practice leader, Deloitte Consulting LLP, about some of the findings and trends in QE.
DE: How has the role of QE in the product development/design/manufacture process changed, particularly over the years of the pandemic and moving forward?
Rohit Pereira: While quality continues to play a pivotal role in product delivery, the pandemic accelerated the growth of digital, cloud, and modernization-related transformations. This required a shift in operating models, as well as the need for faster deployment to market. For instance, our QE team helped clients rapidly move to a virtual engagement model while maintaining high delivery quality by leveraging:
- AI-related test automation at scale throughout the testing delivery life cycle from test management to test design and test execution.
- Shift Left in delivery by exercising code to identify design and performance bottlenecks earlier in the delivery lifecycle.
- Shift Right leveraging Chaos Engineering to proactively identify production issues before they occurred.
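The "shift left" idea above can be sketched in a few lines. This is a hypothetical illustration, not Deloitte's tooling: a CI-stage guard that exercises code against a latency budget so performance bottlenecks surface early in the delivery life cycle. The function name, payload, and the 50 ms budget are all illustrative assumptions.

```python
import time

def process_order(order):
    # Stand-in for real business logic under test.
    return sorted(order["items"])

def check_latency_budget(fn, payload, budget_seconds=0.05, runs=100):
    """Shift-left guard: fail the build if fn exceeds its latency
    budget, long before the code ever reaches production."""
    start = time.perf_counter()
    for _ in range(runs):
        fn(payload)
    elapsed = (time.perf_counter() - start) / runs  # average per call
    return elapsed <= budget_seconds

order = {"items": [3, 1, 2]}
assert check_latency_budget(process_order, order)
```

A check like this runs on every commit, so a design or performance regression is caught at build time rather than during a late performance-test phase.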
Quality engineering has become a value driver enabling organizations to reduce cost of delivery and to identify new innovations that can be applied to technology delivery, as well as business operations.
DE: How do you explain the drop in hardware sales for QE? How is the need for hardware in these scenarios being reduced – what is replacing the need to use those tools?
Rohit Pereira: Transitioning to cloud-native/cloud-ready solutions is eliminating much of the need to set up dedicated test environments, thus contributing to the reduction in hardware spend. Hardware and tools are no longer potential bottlenecks for testing because, with cloud adoption, test environments can be created on demand to support automation within the testing life cycle and simulation of production-like scenarios.
DE: How extensively are companies actually using metaverse/virtual platforms for quality engineering?
Rohit Pereira: With the shift in the digital landscape, immersive customer experiences are becoming increasingly important across a variety of devices and platforms to drive product personalization. Though automation tools and technology for testing metaverse/virtual platforms are currently primitive and expensive, we see increased adoption of AI and Generative AI-based test solutions in the industry. These AI-based predictive modeling solutions are not only leveraged to interact with digital avatars on metaverse/virtual platforms, but also to capture the user experience at each step of the customer journey to identify areas of improvement and build products and service offerings focused on customer needs. While several elements of the metaverse are all around us, there is still work needed to reach its full potential.
DE: What will need to happen within industries to make metaverse/virtual testing as acceptable as physical testing (or at least an acceptable replacement for some phases of testing)?
Rohit Pereira: The metaverse can engage a variety of users across the global ecosystem and drive crowd testing at scale. To truly achieve this, though, more users will need to join the metaverse and move it into the mainstream. While broad adoption of the metaverse is a work in progress, the core challenge for testing still hinges on the ability to harness the right tools and skills both for testing the metaverse and for leveraging the metaverse for testing.
We at Deloitte believe that, as the metaverse framework matures and its use cases evolve, there will be widespread user adoption across industries, considering the consumer engagement opportunities it can bring to businesses. This can improve the overall reach of, and early feedback for, new products and technologies introduced at a global scale, and make virtual testing more acceptable.
DE: How is simulation being leveraged to improve quality engineering, and how is the role of simulation technology changing?
Rohit Pereira: With self-evolving testing solutions using AI, organizations can now simulate end-to-end customer journeys and use predictive test modeling to gain valuable insights into customer behavior, then leverage those insights to customize products and services based on customer needs. AI-based predictive test models are also being used to simulate production-like scenarios, generate synthetic test data at scale that mimics production data, and identify areas of impact in order to employ risk-based testing models.
DE: Can you explain chaos engineering, and how that is useful from a quality perspective? Does it have a role in QE for physical products?
Rohit Pereira: Simply stated, chaos engineering aims to proactively identify the pressure points of applications under test so that they can be addressed ahead of time and do not break in production when it matters. For physical products, this becomes especially critical, since they are subjected to various forms of stress through touch, environmental conditions, and other external factors. Chaos engineering adopts a scientific approach to identifying, analyzing, and addressing potential breaking points by deliberately injecting faults and failures. The goal is to "break the system" proactively so it doesn't break in production.
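A minimal chaos experiment can be sketched in a few lines. This is an illustrative toy, not a description of any particular chaos tool: a wrapper deliberately injects timeouts into a dependency, and the experiment passes only if the system under test degrades gracefully instead of crashing. All names (`FlakyDependency`, `get_stock`) and the 50% failure rate are assumptions made for the example.

```python
import random

class FlakyDependency:
    """Wraps a callable and injects a fault on a fraction of calls."""
    def __init__(self, fn, failure_rate, rng=None):
        self.fn = fn
        self.failure_rate = failure_rate
        self.rng = rng or random.Random(42)  # seeded for reproducibility

    def __call__(self, *args):
        if self.rng.random() < self.failure_rate:
            raise TimeoutError("injected fault")
        return self.fn(*args)

def get_stock(service, sku):
    # System under test: must fall back, not crash, when the dependency fails.
    try:
        return service(sku)
    except TimeoutError:
        return None  # graceful degradation: caller can show "unavailable"

# Run the experiment: 100 calls against a dependency that fails half the time.
inventory = FlakyDependency(lambda sku: 7, failure_rate=0.5)
results = [get_stock(inventory, "SKU-1") for _ in range(100)]

# The experiment passes if every call returned real data or the fallback.
assert all(r in (7, None) for r in results)
```

The same pattern scales up in real platforms, which inject latency, dropped connections, or instance failures into production-like environments rather than into a single function.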
DE: The paper also discussed synthetic data and test data management challenges – do users (or potential users) trust this type of data for testing/training?
Rohit Pereira: Synthetic data is leveraged at scale by many organizations to reduce the risk of PHI/PII exposure in lower environments and to work with datasets that more holistically mimic their production data. This enables more efficient and effective test execution at scale, with reduced data contention across teams operating in the same non-production environments. Additionally, it removes the need for test data de-identification while generating higher volumes of test data for transaction validation at scale.
At Deloitte, we have developed mechanisms through AI and ML-based algorithms that can analyze production data and use that information to generate meaningful synthetic test data that mimics production outcomes without impacting PII/PHI, increasing user trust and confidence.
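The basic idea can be illustrated with a toy sketch (this is not Deloitte's algorithm): fit aggregate statistics from production-like records, then sample synthetic rows that mimic those statistics while never copying an identity. The field names and distribution choice (a simple Gaussian fit) are assumptions made for the example.

```python
import random
import statistics

# Production-like records; the names are the PII we must never carry over.
production = [
    {"name": "Alice", "age": 34, "balance": 1200.0},
    {"name": "Bob",   "age": 41, "balance": 860.0},
    {"name": "Carol", "age": 29, "balance": 1510.0},
]

def fit_numeric_profile(rows, field):
    """Capture only aggregate statistics, never individual values."""
    values = [r[field] for r in rows]
    return statistics.mean(values), statistics.stdev(values)

def synthesize(rows, n, rng=None):
    rng = rng or random.Random(0)  # seeded for repeatable test data
    age_mu, age_sigma = fit_numeric_profile(rows, "age")
    bal_mu, bal_sigma = fit_numeric_profile(rows, "balance")
    return [
        {
            "name": f"synthetic-user-{i}",  # never a real identity
            "age": max(18, round(rng.gauss(age_mu, age_sigma))),
            "balance": round(rng.gauss(bal_mu, bal_sigma), 2),
        }
        for i in range(n)
    ]

synthetic = synthesize(production, 5)
assert all(r["name"].startswith("synthetic-") for r in synthetic)
```

Production-grade approaches replace the Gaussian fit with learned models that preserve correlations between fields, but the contract is the same: statistical fidelity without identifiable records.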
DE: How might the evolution in quality engineering affect other phases of product design – in other words, will initial product development/design see changes in order to better take advantage of these new capabilities on the quality side?
Rohit Pereira: Quality engineering is increasingly seen as an integrator (environments, releases, DevOps/CI-CD, etc.), as opposed to a distinct phase that follows build/development. This is especially true in large transformation programs, where quality engineering is integrated throughout, enables innovation in business processes, and improves confidence in delivery outcomes.
About the Author
Brian Albright is the editorial director of Digital Engineering. Contact him at [email protected].