What a Difference a Year Makes

By Steve Robbins

 



On my recent return from the SC08 show in Austin, I couldn’t help but think back to the early days. For me, they started in the early 1980s, when I was working for one of the computer-publishing giants. Those were exciting times. The IBM PC was just launching, displacing mainframes, minicomputers, and the first business personal computers like the TRS-80, Commodore PET, and Apple II, all of which debuted in 1977. These were also volatile times, but the rewards far outweighed the risks as users began to take control of their own data and move from centralized, shared computing to “personal computing.”

It was the software that made the revolution possible. Killer apps like WordStar and Lotus 1-2-3 moved control from a few empowered individuals to virtually anyone. When the Mac hit the scene, it came bundled with MacWrite and MacPaint. During this time, engineers were offered add-ons like I/O cards, A-to-D converters, data acquisition boards, and software. Soon networking became ubiquitous, and then the World Wide Web changed forever the way engineers collaborated, communicated, designed, and manufactured products.

I’m amazed at how the personal computer became a commodity, and then the excitement faded. The very first technology trade show I attended was the 1983 COMDEX. The buzz was so high-pitched you could almost hear it from 30,000 feet on the flight into Las Vegas. The competition between IBM and Apple was fierce. Back then, the IBM PC ran at a blazing 4.77 MHz and had 64KB of RAM. Today you can store 4, 8, or 16GB of data on your key chain.

By 2003, COMDEX was over, and the only buzz was among gamers excited by their new purchases. Last year I attended Supercomputing in Reno and the buzz was back, but it surrounded data centers, financial applications, and molecular biology. Exciting stuff, but not in the DE wheelhouse. I spent three days talking up design engineering and analysis and, while people were listening, they just weren’t ready yet. The fastest clusters on the floor were around 40 teraFLOPS, and a handful of vendors were partnering with analysis companies.

This year, SC08 was astounding: the word petaFLOP was being bandied about, as was a 20G coast-to-coast network. The average cluster was running about 37 teraFLOPS. And this time, I was doing most of the listening as vendors talked about how simulation was the next big thing for HPC. Visualization is in. It is the year of the GPU. We were floored by the acceptance of engineering apps. What a difference a year makes.

The excitement was contagious. I’ll bet the average cluster at this event had more than 1,024 cores.

What does this mean for you as you’re asked to be innovative and solve problems in record time? The tools are here with the convergence of HPC and simulation software, and we’re near a tipping point. Fasten your seat belts; things are changing rapidly. Forget running a complex simulation in four hours, let alone four days. Real-time analysis and simulation are on the horizon, and soon you will be running multiple scenarios and comparing the data in minutes. And if you’re a small business, all you’ll need is a workstation and a server with a GPU. You’ll lease parallel-processing time from a service.

DE started 15 years ago when workstations became powerful enough to run 3D modeling software. Now, the software is taking advantage of advanced technology. What will happen in the next 15 years?


Steve Robbins is the CEO of Level 5 Communications and executive editor of DE. Send comments about this subject to [email protected].
