Latest News
March 1, 2006
By Kristi Hobbs
An evolution is underway in the instrumentation engineers use to test their inventions. It has reduced the need for specialized instrument gurus or test experts, and has given the design engineer, scientist, or technician the ability to quickly develop and run automated tests. Plus, the increasing simplicity of developing automated measurements has extended those benefits to even beginning users, who can now gather more reliable data from their research and develop higher quality products.
Instrumentation Beginnings
Eons ago—just prior to the 1970s—engineers typically relied on traditional “box” instruments to take measurements for testing prototypes or production hardware designs. Each box had its own display, function (digital multimeter, waveform generator, logic analyzer, etc.), and a knob-and-button interface. Most of these traditional instruments had no mechanism for storing or recording the data they measured, so the test engineer had to record data manually. While this interactive method of testing worked well for simple, nonrepetitive tests with small data sets, it became more cumbersome with a complex test plan that included many steps, many instruments, or many iterations. Because the instruments could not automatically record or analyze data, manual recording introduced human errors into the data.
The evolution of instrumentation over the course of three decades.
Sameh Kamel, who frequently conducts RF tests for Repeater Technologies of Sunnyvale, CA, on its wireless repeaters, cited human error as a major issue. “RF tests are traditionally performed with instruments such as network analyzers, spectrum analyzers, signal generators, power meters, noise figure meters,” he says, “and results are manually recorded on a data sheet. However, the limitations of manual measurement, such as subjective interpretations by each test technician and lack of access to past test results over time, pose a risk when testing a new product.”
But there was an upside: These traditional instrumentation shortcomings led to the development of virtual instrumentation (VI).
The Birth of Virtual Instrumentation
Virtual instrumentation was introduced in the 1970s, and through the early 1980s it offered a way to connect traditional instruments to PCs, which were becoming increasingly popular. Virtual instrumentation meant that engineers could use software and hardware (typically a plug-in card) for their PCs to connect their instruments on a common bus (defined by the IEEE 488 standard, now known as General Purpose Interface Bus or GPIB). This method benefited traditional instrument users by providing automated data storage and analysis, thereby eliminating human error. It also saved time and effort during repeated tests, since engineers could simply rerun the software program rather than go through the manual steps each time.
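To illustrate the kind of automation described here, the following minimal sketch shows a PC querying a GPIB-connected digital multimeter and logging the readings to a file instead of a paper data sheet. It uses the PyVISA library and generic SCPI commands, neither of which is mentioned in the article; the instrument address and commands are assumptions for illustration only.

```python
# Illustrative sketch only: automating a GPIB instrument from a PC.
# Assumes the PyVISA library and a SCPI-compatible digital multimeter
# at GPIB address 14 -- details not taken from the article.
import csv
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::14::INSTR")   # hypothetical GPIB address

print(dmm.query("*IDN?"))                    # identify the instrument

# Take repeated DC-voltage readings and store them automatically,
# replacing the manual data-sheet recording described above.
with open("readings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["sample", "volts_dc"])
    for i in range(100):
        volts = float(dmm.query("MEAS:VOLT:DC?"))
        writer.writerow([i, volts])

dmm.close()
```

Rerunning a script like this for each test iteration is what eliminates the transcription errors and lost history that manual recording introduced.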
Soliton Automation of Bangalore, India, was able to reduce cycle time on a customer’s power supply tests from one hour with a manual system to 10 minutes by implementing an automated system based on VI.
“The customer now saves valuable time because the system provides completely automated data collection and report generation by replacing the manual processes of the outdated test system,” says Anand Chinnaswamy of Soliton.
Initially, engineers still had to purchase a separate box for each function they wanted to perform, which was costly and took up a large amount of space. Companies like National Instruments soon realized that these boxes had many parts in common with the PCs they were connected to, including the display, memory, and processor. Consequently, they developed general-purpose programmable I/O boards that plugged into a PC (now known as data acquisition or DAQ boards and modular instruments).
›› Features like Express VIs in NI LabVIEW and new software packages like NI SignalExpress have made virtual instrumentation development simpler for beginners and automation experts alike.
The emergence of plug-in data acquisition hardware meant that engineers could now automate their systems in a more cost- and space-effective manner by using this hardware to take measurements and then perform custom analysis on their PCs. Software development tools helped these engineers create “virtual instruments” by offering a method to program their hardware, perform analyses, store the data, and present it on the PC screen. Engineers who previously had numerous instruments and unwieldy test setups could pare down their equipment and simplify complex tests through automation. While this helped greatly with complex tests and analyses, engineers who completed quick, simple tests still hesitated to adopt VI because of the time it took to program hardware.
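A rough sketch of what such a software-defined "virtual instrument" looks like in practice appears below: acquire a waveform from a plug-in DAQ board, analyze it, and store the result on the PC. It uses NI's nidaqmx Python driver and a hypothetical device and channel name ("Dev1/ai0"); the driver, names, and sample rates are assumptions, not details from the article.

```python
# Illustrative sketch only: a "virtual instrument" built from a plug-in
# DAQ board plus software analysis on the PC. Assumes NI's nidaqmx
# Python driver and a hypothetical channel "Dev1/ai0".
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000        # samples per second (assumed)
SAMPLES = 1_000      # samples to acquire (assumed)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=SAMPLES)
    data = np.array(task.read(number_of_samples_per_channel=SAMPLES))

# Custom analysis and storage happen in software rather than in a box instrument.
print(f"mean = {data.mean():.4f} V, peak-to-peak = {np.ptp(data):.4f} V")
np.savetxt("acquisition.csv", data, header="volts", comments="")
```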
VI Today
In the last six years, VI has continued to evolve and build on its foundation of flexibility and cost-effectiveness. Manufacturers have made many improvements to help those performing simple tests build custom instruments quickly. Hardware installation is now as simple as plugging in a USB cable, and prices have dropped: Programmable measurement hardware now starts at less than $100. The affordable price and simple setup are removing the barrier for those wanting to test drive automation before making any significant investment. Users once priced out of the market now consider automation a possibility.
VI software now offers improved features to make common tasks easier for new and advanced users alike. For example, National Instruments LabVIEW software includes Express VIs, which are functions that bring up a configuration window to walk the user through tasks such as taking a measurement, analyzing frequency data using a fast Fourier transform (FFT), or generating a report. NI LabVIEW also includes new structures such as the shared variable that engineers creating complex distributed measurement and control systems can use to more easily share data among the nodes.
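Express VIs are graphical LabVIEW functions, so there is no text code to show, but the frequency analysis an FFT Express VI configures can be sketched in a few lines of NumPy as an analogy. The signal parameters below are invented purely for the example.

```python
# Illustrative sketch only: the kind of single-sided FFT analysis a
# LabVIEW FFT Express VI configures, reproduced with NumPy as an analogy.
import numpy as np

fs = 1_000                                 # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)            # one second of data
signal = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)                       # single-sided FFT
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)     # matching frequency axis
amplitude = 2.0 * np.abs(spectrum) / len(signal)     # scale to peak amplitude

# Report the dominant frequency component, e.g. for a test report.
peak = np.argmax(amplitude)
print(f"Dominant component: {freqs[peak]:.1f} Hz at {amplitude[peak]:.2f} V")
```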
A VI running in SignalExpress.
The software has also evolved to serve non-programmers. Data acquisition hardware kits often include ready-to-run data logger software, so users can start taking basic measurements quickly without programming. In addition, software tools like NI SignalExpress have recently emerged that are geared toward interactive users in test and design.
The Future of Virtual Instrumentation
This instrumentation evolution has resulted in several benefits. Because users spend less time developing their tests, they can now spend that time running additional tests and analyzing data. With an automated documentation process, engineers and scientists can run longer tests, test more iterations, and gather more data with less error, all of which results in higher quality products and research.
After more than 20 years of change, the evolution of virtual instrumentation may appear complete. But, in fact, it continues to keep pace with advances in computing and communication technology and to innovate in the areas of usability and scalability. And virtual instrumentation will continue to evolve as long as engineers and scientists are asked to solve new design, test, and control problems.
Kristi Hobbs is a Data Acquisition product manager at National Instruments, where she has specialized in portable and USB-based measurement hardware for more than four years. She earned a Bachelor of Science degree from the University of Texas at Austin.
Beginner’s Luck
The evolution toward a simpler interface for automating measurements via virtual instrumentation (VI) is greatly benefiting those who are less familiar with the technology. A National Instruments survey revealed that more than a third of design and test engineers classified themselves as novice users (see chart below).
›› More than a third of the respondents to a National Instruments survey classified themselves as novice measurement system developers.
Not surprisingly, each successive generation of VI software and hardware includes features designed to make it easier for this group of users to automate their measurements. Because the development of automated measurements is opening up to new users, the structure of the engineering team is changing. The software guru may spend as much time programming as she did before, but rather than working on routine tasks, she can now focus on more complex areas where her experience adds value. And if a simple program can’t just be reused from another project, an inexperienced engineer or technician can now write one for a task that previously seemed too daunting.—KH
Contact Information
National Instruments
Austin, TX