Hyperconvergence Promises HPC Democratization
IT component centralization brings promise for AI and deep learning applications.
November 1, 2018
In the race to Industry 4.0, more compute power is never enough. Improvements to engineering IT pay off in more simulations, smoother design workflows, better visualizations and faster time to market. New productivity gains are coming from a melding of resources that, until now, operated separately. IT vendors call this movement hyperconverged infrastructure (HCI). The goal is to consolidate multiple IT components into one system that runs on standard, off-the-shelf servers, centralizing resources and democratizing access to them.
These high-density systems are the next step in high-performance computing (HPC), and they come with a bonus: a standardized platform for new artificial intelligence (AI) and deep learning (DL) applications. Hyperconvergence extends virtualization to increase resource utilization. When essential services move from custom hardware to commodity servers, application latency drops and the capacity to run large data sets grows.
“While emerging technologies, such as artificial intelligence, IoT and software-defined storage (SDS) and networking offer competitive benefits, their workloads can be difficult to predict and pose new challenges for IT departments,” says Ashley Gorakhpurwalla, president and general manager, Dell EMC Server and Infrastructure Systems. The modular approach of hyperconverged systems offers a path to “flexibly build and combine compute, storage and networking, so organizations can transform their IT in a way that optimizes resources and offers investment protection for future generations of technological advances,” he continues.
Many data centers and HPC systems still use disk-based storage, but the need for speed is making flash more popular. All-flash storage is now considered enterprise ready and can be specified for new hyperconverged HPC systems. Its low latency removes storage bottlenecks from workflows, with corresponding gains in efficiency.
Another advantage of HCI is the ability to use in-memory computing as a standard IT resource. In-memory computing keeps data in RAM rather than in disk-based databases; search algorithms then analyze and present the data as needed. SAP developed one of the first appliances for in-memory computing, the high-performance analytic appliance (HANA), and claims it delivers performance up to 10,000 times faster than standard disk-based databases. Vendors working on HCI solutions are adding in-memory computing to the mix. One expert, George Anadiotis of business consulting firm Linked Data Orchestration, says in-memory computing will do for spare compute power and memory what Hadoop did for commodity hardware.
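To make the trade-off concrete, here is a minimal Python sketch (illustrative only, not SAP's implementation): once the working data set is loaded into RAM, each lookup becomes a dictionary hit instead of a round trip through a disk-backed database.

```python
import sqlite3
import time

# Disk-backed baseline: every lookup pays query-engine and storage overhead.
db = sqlite3.connect("readings.db")
db.execute("DROP TABLE IF EXISTS readings")
db.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
db.executemany("INSERT INTO readings (value) VALUES (?)",
               [(float(i),) for i in range(100_000)])
db.commit()

# In-memory approach: load the working set into RAM once, then serve from it.
cache = {row[0]: row[1] for row in db.execute("SELECT id, value FROM readings")}

start = time.perf_counter()
for key in range(1, 10_001):
    db.execute("SELECT value FROM readings WHERE id = ?", (key,)).fetchone()
print(f"disk-backed lookups: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
for key in range(1, 10_001):
    cache[key]  # plain dictionary hit: no I/O, no query planning
print(f"in-memory lookups:   {time.perf_counter() - start:.3f}s")
```

Production platforms such as HANA layer columnar storage, compression and parallel scans on top of this basic idea; the sketch only shows why removing the disk from the read path pays off.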
Software-Defined Storage
A key technology of hyperconvergence is SDS. Storage resources and functionality are separated from dependencies on specific physical storage devices. The Storage Networking Industry Association defines SDS as a system that offers automation, standard interfaces, virtualized data paths, scalability and transparency in storage utilization. This provides more flexibility for the IT ecosystem to use resources efficiently.
SDS differs from earlier technologies such as network attached storage (NAS) and storage area networks (SAN) in its use of storage virtualization, which pools all available resources and automates storage management. NAS and SAN solutions often revolve around proprietary hardware; SDS installations run on standard x86 storage hardware. Storage deployment becomes a software application, providing unified availability, provisioning and scaling as required.
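The pooling concept can be sketched in a few lines of Python. The class and method names below are invented for illustration (they are not any vendor's API); the point is that placement policy lives in software, so consumers request capacity from one pool rather than addressing individual devices.

```python
from dataclasses import dataclass

@dataclass
class Device:
    """A commodity storage device contributed to the pool."""
    name: str
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb

class StoragePool:
    """Hypothetical SDS pool: policy in software, hardware interchangeable."""

    def __init__(self) -> None:
        self.devices: list[Device] = []
        self.volumes: dict[str, int] = {}

    def add_device(self, device: Device) -> None:
        # Scaling out means contributing another standard x86 server's disks.
        self.devices.append(device)

    def provision(self, volume: str, size_gb: int) -> None:
        # Naive placement: pick the device with the most free space.
        # Real SDS layers stripe, replicate and rebalance automatically.
        target = max(self.devices, key=lambda d: d.free_gb, default=None)
        if target is None or target.free_gb < size_gb:
            raise RuntimeError(f"pool cannot satisfy {size_gb} GB for {volume!r}")
        target.used_gb += size_gb
        self.volumes[volume] = size_gb

pool = StoragePool()
pool.add_device(Device("node1-ssd", 1024))
pool.add_device(Device("node2-ssd", 2048))
pool.provision("simulation-scratch", 512)  # caller never names a device
```

The consumer-facing contract is the same one real SDS stacks offer: request a volume of a given size and let software decide where it lives.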
SDS deployments are the bridge to the next generation of product development platforms, which will require real-time scalability and storage provisioning on demand. As development systems and intelligent products mature, edge and fog computing resources will become as important to the HPC environment as cloud and data center servers.
Research Institutions and Hyperconvergence
Academic and scientific researchers are taking a close look at IT hyperconvergence. “Conventional approaches to IT cannot accommodate the business agility” required today, noted S.A. Azeem and S.K. Sharma in the International Journal of Advanced Research in Computer Science. “Converged systems are rapidly gaining acceptance as a way to improve overall business agility and the productivity of IT staff.”
Azeem and Sharma note six benefits of moving to an HCI:
1. common data center products managed through a single interface;
2. easy integration of software-defined storage;
3. standardized hardware;
4. higher resource utilization;
5. broader use of virtual desktops; and
6. fewer user-caused problems such as boot storms, with headroom for services such as virus scanning to run in the background.
First-ever Magic Quadrant
Gaining the attention of research firm Gartner Inc. is a rite of passage for new technologies. HCI got its own Magic Quadrant for the first time this year. Gartner predicts that by 2020, 20% of business-critical applications currently deployed on traditional three-tier platforms will transition to hyperconvergence.
Vendors that made the initial Magic Quadrant had to show a minimum of 50 production hyperconvergence customers and at least $10 million in bookings. Twelve vendors made the inaugural list:
Leaders: Nutanix, Dell EMC, VMware, Hewlett Packard Enterprise
Challengers: Cisco, Huawei, Pivot3
Visionaries: Stratoscale, Microsoft
Niche Players: Scale Computing, DataCore, HTBase
As this is the research firm's first hyperconvergence list, we can expect positions to shift as the technology and the market find their footing. However, the benefits of hyperconvergence in today's IT environment mean the concept is poised to take on increasing importance.
About the Author
Randall S. Newton is principal analyst at Consilia Vektor, covering engineering technology. He has been part of the computer graphics industry in a variety of roles since 1985.