Latest News
September 23, 2010
I used to think of a rendering session as a break. With my CPU tied up calculating ray bounces, my machine slowed to a turtle's pace. That's when I stepped away to get coffee at Starbucks, grab a sandwich, or read a book. Some rendering jobs took so long I could literally finish a chapter of Tolstoy's War and Peace. But what I witnessed earlier this week at the NVIDIA GPU Technology Conference suggested my involuntary breaks (and Tolstoy) would have to wait, because I might no longer have to wait long to render my scenes.
Nearly real-time scene rendering in 3ds Max is not a fantasy. It's about to hit us like a piece of pie.
On Tuesday (September 21, 2010), at the NVIDIA GPU Technology Conference, Ken Pimentel, director of visual communication solutions, media and entertainment, Autodesk; and Michael Kaplan, VP of strategic development, mental images, joined NVIDIA CEO Jen-Hsun Huang to demonstrate a new feature in 3ds Max, a leading content creation package from Autodesk.
Beginning this fall, 3ds Max users on subscription will get an update that allows them to use mental images’ GPU-accelerated iray rendering engine to visualize their scenes. (mental images is a wholly owned subsidiary of NVIDIA.) In the announcement jointly issued by Autodesk and NVIDIA, the companies said, “While iray produces identical images on either CPUs or GPUs, 3ds Max users will enjoy up to 6X faster results over dual quad-core CPUs when using a GPU such as the new NVIDIA Quadro 5000 or Tesla C2050.”
“Because rendering systems use interpolation to accelerate things, they have this command panel that looks like you’re flying a 747. There are just so many dials and settings,” jested Pimentel. “What’s great about iray is that it introduces another level of usability—it’s point and shoot.”
iray, according to mental images, is "the world's first interactive and physically correct, photorealistic rendering solution." It takes advantage of the GPU to accelerate computation, thus often producing results faster than a CPU alone could. Since iray calculates light bounces according to physical laws, it requires no interpolation (or "fake reality," as some might call it). Thus, the input interface for rendering with iray is also comparatively simpler, Pimentel observed.
Pimentel, Kaplan, and Huang also demonstrated a browser-based solution that lets presenters use a standard laptop to access a web-hosted 3ds Max scene (hosted by PEER1) from a web browser and render it at nearly real-time speed as they walk around the scene. (The only data the user sent was the viewer's virtual position, as represented by the mouse pointer. All rendering took place in the cloud, on 32 Fermi-class GPUs hosted remotely, explained Kaplan.)
“[Cloud-hosted GPUs] are all running exactly the same iray software that comes with 3ds Max,” said Kaplan. “We can guarantee that the image that you get from [cloud-hosted iray renderer] is exactly the same, pixel for pixel, as what you would get from 3ds Max.”
Previously, rendering and animation were not intended to imitate reality. For the most part, they reflected an artist's or an animator's interpretation of reality. Certain details in the result, be it a still image or an animation sequence, were often approximated or simplified because the available computing resources weren't sufficient to reproduce every photon's path. The rise of multi-core CPUs brought some relief, but NVIDIA hopes more users will turn to GPU cores to accomplish the same task much faster.
To watch Jen-Hsun Huang’s opening keynote, visit this link.
For more, read a report from GTC on the emergence of GPU-based high-performance computing.
About the Author
Kenneth Wong is Digital Engineering's resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.