February 1, 2017
Last week, San Francisco-based Rescale hosted its first-ever Rescale Night, an event organized around the arrival of the Era of Big Compute. The gathering, held at Microsoft Yammer’s office, drew Rescale investors, clients, and partners, including simulation software maker ANSYS, supersonic aircraft developer Boom, and Microsoft.
“Here in the Silicon Valley, we think everything is already in the cloud,” says Joris Poort, Rescale’s founder and CEO. “But actually, in the enterprises, only about 6% of the IT workload is running in the cloud today. So we’re still at the very beginning of cloud computing.”
If Joris’s hunch is right, the Era of Big Compute might accelerate cloud adoption to change those statistics.
As Joris sees it, we’re now in the third phase of cloud computing. In Phase I, or Cloud 1.0, companies like Salesforce, NetSuite, and Workday irrevocably transformed the enterprise software market with their SaaS (software as a service) products, delivered on demand from a browser interface. In Phase II, or Cloud 2.0, companies like Cloudera, Palantir, and Splunk rode the wave of Big Data. In Phase III, or Cloud 3.0, he expects large computing jobs, from complex simulations of weather phenomena and aircraft operations to scientific research, to benefit from private and public computing resources that have become inexpensive commodities.
“You might be thinking, ‘Hey, I can go to Amazon and sign up for a bunch of servers.’ Sure, you can do that. But those are commodity servers that haven’t been optimized and tuned to run specialized software and workloads,” Joris points out.
From Amazon Web Services, you can rent GPU-equipped virtual workstations (8 virtual CPUs, 1 GPU, 15GB RAM, 100GB storage) for $22 a month plus $1.75 per hour. Less powerful virtual machines are available for as little as a flat fee of $25 a month. For large general-purpose computing jobs, you can deploy 64 CPUs with 256GB of memory on Amazon EC2 for around $3.50 per hour.
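To get a rough sense of how these on-demand rates add up, here is a minimal back-of-the-envelope sketch in Python. The price points are the 2017 figures cited above; the job sizes (40 hours of workstation use, a 12-hour solve across four large instances) are hypothetical and chosen purely for illustration.

# Back-of-the-envelope cost estimate using the price points cited above.
# Job sizes below are hypothetical, chosen only to illustrate how
# on-demand pricing scales with compute time.

GPU_WORKSTATION_MONTHLY = 22.00   # flat monthly fee, GPU workstation tier
GPU_WORKSTATION_HOURLY = 1.75     # usage charge per hour
EC2_64VCPU_HOURLY = 3.50          # approx. rate for a 64-vCPU, 256GB instance

def gpu_workstation_cost(hours_used):
    """Monthly cost of the GPU-equipped virtual workstation tier."""
    return GPU_WORKSTATION_MONTHLY + GPU_WORKSTATION_HOURLY * hours_used

def solver_run_cost(wall_clock_hours, instances=1):
    """Cost of a batch run on one or more 64-vCPU instances."""
    return EC2_64VCPU_HOURLY * wall_clock_hours * instances

# Example: 40 hours of interactive pre/post-processing on the workstation,
# plus a 12-hour solve spread across 4 large instances.
print(f"Workstation, 40 hrs: ${gpu_workstation_cost(40):.2f}")    # $92.00
print(f"Solver, 12 hrs x 4 instances: ${solver_run_cost(12, 4):.2f}")  # $168.00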
On the other hand, vendors like Rescale offer remote hardware tuned and optimized to run specialized software, such as simulation programs from ANSYS or Siemens PLM Software. The critical component Rescale brings to the table, Joris says, “is the specialization of architecture.”
Small and midsize firms that do not want the burden of acquiring and managing their own computing clusters, but still need to run compute-intensive specialized software (for engineering simulation, gene sequencing, or AI development, for example), can secure on-demand HPC from a wide variety of vendors. Vendors that provide tailor-made HPC, however, remove a layer of implementation and maintenance complexity in the long run.
“If I were a startup today, I’m not going to build an IT center,” says Ray Milhem, ANSYS’s VP of enterprise solutions and cloud. “I’ll just build the business to run in the cloud.”
Most specialized software vendors aren’t equipped to offer on-demand HPC. Therefore, in the coming years, you can expect to see more partnerships between vendors who develop specialized software that can benefit from HPC and vendors who provide on-demand HPC augmented with tailor-made middleware and management tools.
On the same night, Rescale announced the launch of Bigcompute.org, described as a community of Big Compute, Big Data, and HPC experts. Founding partners include ANSYS, AWS, Siemens, Microsoft, and IBM.
“This is an open community anybody can join,” explains Joris. “Customers, software vendors, IT vendors, or those who are just interested in this space—everyone is welcome. We’re all there together. We’ll make the revolution happen in big compute.”
About the Author
Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.