NVIDIA GTC 2019: Data Science Workstation, Acquisition of an Interconnect Vendor, Safety Force Field in Autonomous Driving, and More
There were plenty of demos and talks on ray tracing, but the biggest announcements have more to do with autonomous cars, machine learning and data centers.
March 25, 2019
Around noon on Monday, March 18, NVIDIA GPU Technology Conference (GTC) attendees began boarding the long black buses that would shuttle them to the site of the keynote. The 15-minute ride from the convention center took them to San Jose State University's sunny campus. It was still more than an hour before the scheduled keynote, but the queue was already snaking across the grassy fields, all the way to the student union building that houses the canteen.
“Hey, thanks for coming out this way! We got so packed in the old place we had to get you out here. I appreciate you making the trip,” said NVIDIA cofounder and CEO Jensen Huang as he kicked off the event, which drew 9,000 registered attendees.
In the last decade, NVIDIA has outgrown its roots in the graphics accelerator business. This year, there were plenty of demos and talks on ray tracing (in the press briefing, Huang cheekily repeated the phrase “ray tracing” dozens of times, lest anyone accuse him of neglecting it), but the biggest announcements have more to do with autonomous cars, machine learning and data centers.
Part of the keynote was devoted to BMW's use of the Unity game engine and NVIDIA RTX for real-time ray-traced automotive interior simulation, and to the eye-popping graphics in the Dragon Hound game preview. These satisfied graphics fans' visual hunger, but NVIDIA is now playing a much larger game. The new tagline of GTC, “The premier AI conference,” clearly spells out what the GPU dragon is chasing.
The $6.9 Billion Bet
About two hours into the keynote, Huang began discussing the impact of distributed computing. “In the future, the way you design the network is going to change. Instead of a whole bunch of compute nodes connected by networking, the networking and computing will become one continuous fabric,” he said.
What followed was the announcement that NVIDIA was buying Mellanox, an interconnect solution provider, for $6.9 billion. Mellanox has been a long-time partner of NVIDIA. With this acquisition, NVIDIA gains the ability to engineer advantageous interconnects directly into its GPU-accelerated supercomputers and high-performance computing (HPC) clusters.
“Datacenters in the future will be architected as giant compute engines with tens of thousands of compute nodes, designed holistically with their interconnects for optimal performance,” the company explains in its press release. “With Mellanox, NVIDIA will optimize datacenter-scale workloads across the entire computing, networking and storage stack to achieve higher performance, greater utilization and lower operating cost for customers.”
A Workstation for the Data Scientist
Huang believes data scientists are about to become an integral part of engineering, product development and product lifecycle management. He also believes they need a tool that lets them work independently, without relying on the HPC queue for important workloads.
“The pipeline starts with data science. There's the stage of data ingestion, called data analytics ... The gigantic table of data could be anywhere from gigabytes to terabytes. As a spreadsheet, it wouldn't even load on a normal computer,” noted Huang.
A typical workstation may not even have sufficient memory footprint to accommodate a data scientist's data set, be it weather data or temperature data from edge devices. But perhaps a specially designed workstation might.
At the one-hour mark in the keynote, Huang unveiled the NVIDIA Data Science Workstation, powered by Quadro RTX GPUs and CUDA-X AI, a collection of software acceleration libraries.
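CUDA-X AI spans GPU-accelerated data science libraries, among them the RAPIDS suite and its pandas-like cuDF dataframe library. As a rough sketch of the ingestion-and-analytics step Huang described, not a reproduction of NVIDIA's demo, the snippet below reads a large table straight into GPU memory and aggregates it there; the file and column names are hypothetical.

    # A minimal sketch of GPU-accelerated data ingestion with RAPIDS cuDF, one of
    # the libraries under the CUDA-X AI umbrella. The file and column names are
    # hypothetical placeholders, not taken from NVIDIA's demo.
    import cudf

    # Read a large CSV straight into GPU memory instead of a CPU spreadsheet.
    readings = cudf.read_csv("edge_temperature_readings.csv")

    # Aggregate on the GPU; the cuDF API intentionally mirrors pandas.
    per_device_mean = (
        readings.groupby("device_id")["temperature_c"]
        .mean()
        .sort_values(ascending=False)
    )

    print(per_device_mean.head())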
Different versions of the Data Science Workstation are expected to come from NVIDIA's partners, such as Dell, HP and Lenovo. Within the bounds of what NVIDIA prescribes as a Data Science Workstation, partners have room to configure their own versions with different memory capacities, CPU choices and throughput.
Microway, one of the exhibitors at the show, demonstrated its version of the workstation, called the Data Science WhisperStation.
“Our spin on this workstation is that we do this very quietly,” explained Brett Newman, Microway's VP of marketing. “We can fit in as many as four RTX GPUs, with each GPU pair connected by NVLink, and up to a terabyte of memory.”
With multiple processors, workstation noise and power consumption could become a serious issue. Microway's WhisperStations are known for being able to run with minimal noise.
Omniverse for Collaborative Content Creation
Forty-four minutes into the keynote, Huang discussed a common problem among filmmakers and 3D content creators. The modelers, the digital painters and the animators all work on the same sequence, yet there's no digital environment that allows them to collaborate on the content at the same time. To fill the gap, NVIDIA wants to offer Omniverse.
“[Omniverse] is an open collaboration tool. It works with all major 3D tools,” said Huang.
For the purpose of the demo, Huang showed three individuals remotely collaborating in Omniverse: an Autodesk Maya user, a Substance user and an Unreal game engine user, all working on the same scene.
“Notice that the Maya user is working on the plane's design, the Substance user is painting the design and the Unreal user is composing,” Huang pointed out. Without Omniverse, the workflow would be sequential, with one person completing his or her portion and sending it off to the next. With Omniverse, the workflow can be simultaneous, with everyone's work in sync.
“With Omniverse, artists can see live updates made by other artists working in different applications. They can also see changes reflected in multiple tools at the same time. As a result, artists now have the flexibility to use the best tool for the task at hand,” writes NVIDIA in its blog post.
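Omniverse is built around a shared scene description; NVIDIA has described the platform as based on Pixar's open-source Universal Scene Description (USD) format. The sketch below, which uses the standard USD Python bindings rather than any Omniverse API, only hints at the idea of separate tools writing into one scene file; the scene and prim names are hypothetical.

    # A loose illustration of a shared scene description using Pixar's USD Python
    # bindings (pxr). This is not the Omniverse API; it only shows separate tools
    # contributing to one scene file. Scene and prim names are hypothetical.
    from pxr import Usd, UsdGeom

    # A "modeler" creates the shared scene and blocks out the aircraft.
    stage = Usd.Stage.CreateNew("shared_scene.usda")
    UsdGeom.Xform.Define(stage, "/Aircraft")
    UsdGeom.Cube.Define(stage, "/Aircraft/Fuselage")
    stage.GetRootLayer().Save()

    # A second tool (or user) opens the same file and adds its own contribution.
    stage2 = Usd.Stage.Open("shared_scene.usda")
    UsdGeom.Sphere.Define(stage2, "/Aircraft/NoseCone")
    stage2.GetRootLayer().Save()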
Safety Force Field in NVIDIA Drive; JETSON Nano for $99
As the keynote drew to a close, Huang announced that NVIDIA Drive, the AI-powered platform for autonomous vehicle development, is now available. The latest version comes with Safety Force Field, “a computationally verified simulated algorithm” that maintains a zone around the vehicle to prevent collisions with nearby cars, Huang explained.
The algorithm predicts the trajectories and relative speeds of nearby cars to keep the AI-driven vehicle inside its Safety Force Field.
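NVIDIA did not detail the policy's math in the keynote, so the toy check below only illustrates the general idea of a braking-based safety zone, under assumed deceleration and reaction-time values; it is not NVIDIA's Safety Force Field algorithm.

    # A toy braking-distance check, illustrating the general idea of keeping a
    # collision-free zone around the vehicle. The deceleration and reaction-time
    # values are assumptions; this is not NVIDIA's Safety Force Field algorithm.

    def stopping_distance(speed_mps: float, decel_mps2: float) -> float:
        """Distance needed to brake to a stop at constant deceleration."""
        return speed_mps ** 2 / (2.0 * decel_mps2)

    def gap_is_safe(ego_speed: float, lead_speed: float, gap_m: float,
                    max_decel: float = 6.0, reaction_s: float = 0.5) -> bool:
        """True if the ego car could stop without closing the gap to the car ahead."""
        ego_stop = ego_speed * reaction_s + stopping_distance(ego_speed, max_decel)
        lead_stop = stopping_distance(lead_speed, max_decel)
        return gap_m + lead_stop > ego_stop

    # Example: ego car at 25 m/s, lead car at 20 m/s, 30 m gap.
    print(gap_is_safe(25.0, 20.0, 30.0))  # False: the gap is too small at these speeds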
For the tinkerers and the DIY crowd, Huang announced the JETSON Nano, a $99 embedded computer for developing small, low-powered devices. It's a pocket member of the NVIDIA JETSON product family, embedded computers for powering autonomous machines.
“We're so proud of [the JETSON Nano]. It's the smallest computer our company has ever built,” said Huang. In good humor, Huang pulled one unit out of his pants' back pocket, then struck a pose to mimic the LEGO figure holding the JETSON Nano in the presentation backdrop.
The use of AI has already crept into everyday consumer tech, evident in the iPhone's personal assistant Siri, Microsoft's Cortana and Google's autocompletion of user sentences. NVIDIA is betting it will also become part of enterprise workflows, driven largely by the parallel-processing power of the GPU. The acquisition, the new products and the featured speakers at NVIDIA's GTC reinforce this belief.
About the Author
Kenneth Wong is Digital Engineering's resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.