The new generation of consumer electronics devices converges Internet connectivity, wireless communications, high-fidelity audio and HD video into a single device. To keep pace, test and measurement manufacturers and design houses have adopted different strategies. Take a look...
1. FPGA-enabled instrumentation
With the increase in system-level tools for field-programmable gate arrays (FPGAs) over the last few years, an increasing number of manufacturers are including FPGAs in instrumentation. What’s more, engineers are given the choice to reprogram these FPGAs according to their requirements. So test engineers can embed a custom algorithm into the device to perform in-line processing inside the FPGA, or even emulate part of the system that requires a real-time response.
Satish Thakare, head-R&D, VLSI division, Scientech Technologies, explains the challenges that led to this trend: “Designers and manufacturers have to face a lot of challenges to make the product available in the market in a short time. Using a hardware-based approach does not serve the purpose as the designer has to redesign the hardware for every product. Even conventional methods will not serve the purpose as they work sequentially. So designers need a technology that allows them to change the functionality without changing the hardware, while being able to upgrade the product on the go.”
Thakare goes on to explain the solution: “The obvious choice for the designer is to use reconfigurable hardware, i.e., the FPGA. A benefit of using an FPGA in instruments is that it offers high reliability, low latency, reconfigurability, high performance, an embedded digital signal processor (DSP) core and true parallelism.”
Apart from digital functions, some FPGAs also offer analogue features; mixed-signal FPGAs, for instance, may integrate analogue-to-digital and digital-to-analogue converters.
Mahendra Pratap Singh, business development manager, TTL Technologies, adds, “Logic blocks can be configured to perform complex combinational functions and also include memory elements, which can be simple flip-flops or more complete blocks of memory. The architectural flexibility, customisation flexibility and cost advantage put FPGAs ahead of complementary technologies.”
The most common test instrument in the industry with this capability is the digitiser, which allows faster processing of digitised data.
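To make the idea of in-line processing concrete, here is a minimal Python sketch of the kind of streaming computation (a moving-average filter followed by a threshold trigger) that a test engineer might push into a digitiser’s FPGA. On real hardware this logic would be written in an HDL and run sample by sample in the fabric; the function and parameter names below are purely illustrative.

# Software model of an in-line processing block that could be pushed into a
# digitiser's FPGA fabric: a 4-tap moving average followed by a threshold
# trigger. On real hardware this would be written in an HDL; this Python
# sketch only illustrates the streaming, sample-by-sample data flow.
from collections import deque

def inline_process(samples, window=4, threshold=0.5):
    """Yield (filtered_value, triggered) for each incoming sample."""
    taps = deque([0.0] * window, maxlen=window)
    for s in samples:
        taps.append(s)
        filtered = sum(taps) / window          # moving-average filter
        yield filtered, filtered > threshold   # trigger decision per sample

# Example: process a short burst of "digitised" data as it streams in.
if __name__ == "__main__":
    burst = [0.1, 0.2, 0.9, 1.0, 0.8, 0.1, 0.0]
    for value, hit in inline_process(burst):
        print(f"filtered={value:.2f} trigger={hit}")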
2. Wireless standards outbreak
As new wireless standards such as WLAN 802.11ac, WiMAX, LTE and the high-throughput 802.11ad roll out, testing becomes even more challenging for engineers in India and around the globe. Bharti Airtel has already launched its 4G service in Kolkata, making India one of the first countries in the world to commercially deploy this cutting-edge wireless technology. As RF and wireless applications expand to become general-purpose, the instrumentation segment may mirror this trend, with RF instrumentation becoming as important on the bench as the digital multimeter.
A common problem that test engineers face with the explosion of different standards is that they have to continuously set up different test platforms for each standard.
Sadaf Arif Siddiqui, technical marketing specialist at Agilent Technologies, provides more insight: “A test engineer working on fast-emerging standards may have to bear the pain of setting up different test instruments and different test platforms or software. Moving to easy-to-use, upgradeable, multi-standard vector signal analysis software and instruments such as X-series analysers will reduce this pain to a large extent, thereby optimising test times and costs.”
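As an illustration of how such upgradeable, multi-standard instruments are typically driven from software, the sketch below uses the open-source PyVISA library to connect to a signal analyser and send SCPI commands. The VISA address and the specific commands shown are assumptions made for illustration; the actual command set depends on the instrument’s programming guide.

# Hedged sketch: driving a multi-standard signal analyser over SCPI with
# PyVISA. The VISA address and the SCPI commands below are illustrative
# assumptions; consult the instrument's programming guide for the commands
# it actually supports.
import pyvisa

VISA_ADDRESS = "TCPIP0::192.168.1.50::inst0::INSTR"  # hypothetical address

def measure_marker_level(address=VISA_ADDRESS):
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource(address)
    try:
        print(inst.query("*IDN?"))                 # identify the instrument
        inst.write(":FREQuency:CENTer 2.4 GHz")    # tune to the band of interest
        inst.write(":BANDwidth:RESolution 1 MHz")  # set resolution bandwidth
        inst.write(":INITiate:IMMediate")          # start a sweep
        return inst.query(":CALCulate:MARKer:Y?")  # read back the marker level
    finally:
        inst.close()

if __name__ == "__main__":
    print(measure_marker_level())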
3. Increased use of wireless devices at the workplace
Tablet computers and smartphones have become so popular that they now have a significant presence at the workplace too, not as devices under test but as part of the test system. While the computing power these devices offer is notable, they cannot replace the PC and related measurement platforms like PXI. Instead, they are well suited to data consumption and report viewing, and to system monitoring and control.
National Instruments’ Automated Test Outlook 2012 explains: “The explosion of mobile devices like tablets and smartphones provides compelling benefits to engineers, technicians and managers involved in automated test who need remote access to test status information and results. While today’s technology offers solutions for monitoring or remote reporting via mobile devices, test organisations will need new expertise to unite the networking, Web services and mobile app portions of the solution.”
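A minimal sketch of the web-services piece of such a setup, built with nothing but Python’s standard library: a small HTTP endpoint that publishes the latest test results as JSON so that a tablet or smartphone on the same network can poll them. The port, path and result fields are arbitrary choices made for this example.

# Minimal sketch of a test-status endpoint built only with the Python
# standard library. A browser or mobile app on the same network can poll
# http://<host>:8080/status to see the latest results. The fields shown
# are illustrative, not a standard schema.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

latest_results = {"station": "ICT-01", "uut_serial": "A1234",
                  "passed": 57, "failed": 2, "running": True}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(latest_results).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), StatusHandler).serve_forever()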
4. Software-defined instrumentation
As the complexity of products continues to increase, their testing becomes much more challenging. Test engineers now require test systems that are flexible enough to support the wide variety of tests that must be performed on a single product while being scalable enough to encompass a larger number of tests as new functionality continues to be added.
“Increasingly, the functionality of complex devices is being defined by the software embedded in them. This is challenging for many test engineers because most standalone instruments cannot change their functionality as fast as changes in the device under test (DUT) due to the fixed user interface and firmware that must be developed and embedded in the instrument. Thus test engineers are turning to a software-defined approach to instrumentation, so that they can quickly customise their equipment to meet specific application needs and integrate testing directly into the design process,” says Eric Starkloff, director of NI Test Product Marketing.
Thakare shares two major advantages of software-defined instrumentation: “First, it can dramatically reduce the number of hardware components in all the mixed-signal designs, which means smaller chip size for system-on-chip implementation. Second, it can provide automatic adjustment or compensation for circuit component variations due to temperature dependence, ageing and manufacturing tolerances.”
Software-defined instrumentation looks set to become an essential component of scalable, high-performance test systems. Singh agrees: “We predict a bright future for software-defined instrumentation. Software-defined instruments, also known as virtual instruments, are modular hardware with user-defined software, giving the flexibility to combine standard and user-defined measurements with custom data processing using common hardware components. This flexibility is useful for electronic devices like advanced navigation systems and communication devices like smartphones to integrate diverse capabilities and adopt new communication standards.”
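To illustrate what ‘modular hardware with user-defined software’ can look like in practice, here is a hedged Python sketch of a virtual instrument that pairs a generic acquisition front end with interchangeable, user-defined measurements. Every name in it is invented for the example and does not correspond to any vendor’s driver API.

# Hedged sketch of a "virtual instrument": the same acquisition hardware is
# reused while the measurement is defined entirely in user software. All
# names here are invented for illustration; they do not correspond to any
# vendor's driver API.
import math

def acquire(num_samples=1000, sample_rate=1e6):
    """Stand-in for a modular digitiser driver call; returns fake samples."""
    return [math.sin(2 * math.pi * 1e3 * n / sample_rate) for n in range(num_samples)]

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def peak(samples):
    return max(abs(s) for s in samples)

class VirtualInstrument:
    """Combines a fixed acquisition front end with user-defined measurements."""
    def __init__(self, measurements):
        self.measurements = measurements  # the user chooses the processing

    def run(self):
        samples = acquire()
        return {name: fn(samples) for name, fn in self.measurements.items()}

if __name__ == "__main__":
    vi = VirtualInstrument({"rms": rms, "peak": peak})
    print(vi.run())   # swap or extend measurements without touching hardware

The point of the design is that the hardware-facing call stays fixed while the measurement set is swapped or extended purely in software.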
5. Use of multicore and parallel test systems
As the complexity and functionality of electronic devices grow exponentially (in sync with Moore’s law), so does the cost of testing them. Minimising the cost of test can be challenging, but one way is to test more with less. The inherent parallelism of the graphical programming paradigm in software like LabVIEW from National Instruments and FlowStone DSP from DSP Robotics helps engineers benefit immediately from multicore processors and overcome the complexity associated with traditional text-based languages.
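As a rough illustration of the parallel-test idea, independent of any particular tool, the Python sketch below fans three independent test steps out across processor cores using the standard-library concurrent.futures module; the test functions are placeholders for real measurements.

# Hedged sketch: running independent test steps in parallel across CPU cores
# using only the Python standard library. The test functions are placeholders
# standing in for real measurements.
import time
from concurrent.futures import ProcessPoolExecutor

def test_power_rail():
    time.sleep(0.5)          # stand-in for a real measurement
    return ("power_rail", "PASS")

def test_rf_output():
    time.sleep(0.5)
    return ("rf_output", "PASS")

def test_audio_path():
    time.sleep(0.5)
    return ("audio_path", "PASS")

if __name__ == "__main__":
    tests = [test_power_rail, test_rf_output, test_audio_path]
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(t) for t in tests]
        results = [f.result() for f in futures]
    print(results, f"elapsed {time.perf_counter() - start:.2f} s")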
The trend of increasing clock speed to get better performance ended back in the early 2000s. Since then, processor manufacturers have used alternative technologies to ramp up performance while keeping clock speeds around 3GHz. These include multiple cores on a single chip, hyperthreading, wider buses and HyperTransport. Moreover, the move to the current 22nm process node, built with 3D transistors, has resulted in significantly faster, leaner and more efficient processors for use in embedded controllers and modular instrumentation.
Denver D’Souza, senior technical consultant at National Instruments India, says, “The reality that transistor density doubles every 18 months has led to significant advances in the performance of electronic devices. This is evident not only in the latest Intel Core i7 processors but in the shrinking of technology such as 64GB solid-state drives, which are now the size of a postage stamp. These technological advances translate into considerable cost reductions.”
6. Merging of EDA tools and hardware test platforms
The extremely competitive environment in which electronics companies now operate is evident from how next-generation communication protocols reach the market almost before they are ratified as standards. For instance, Broadcom has already brought out 802.11ac solutions even though the standard is yet to be ratified. In situations like these, companies go all out to get a jump-start on the competition, and what better way to do this than to merge design and test in order to accelerate time to market.
Adesh Jain, applications consultant at Agilent Technologies, explains why the traditional method is slow: “Traditionally, for any complete electronic product to be ready for the market, each component of the complete system is first designed and verified with EDA tools, then prototypes are fabricated and tested, before the final product is released to the market. If discrepancies are found in the hardware at later stages, the whole cycle has to be repeated, which would result in loss of time as well as money for any organisation.”
Proper verification at earlier stages reduces this time and effort to a great extent. The tests, specifications, algorithms and plots used in the early EDA stages are the same ones measured later on the test bench. The aim is to merge both worlds: streamline the flow so that design engineers save time, productivity improves and the product reaches the market sooner.
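One way to picture this merging is to write the measurement routine once and apply it both to a simulated waveform from the EDA tool and to a waveform captured on the bench. The short Python sketch below does exactly that, with both data sources faked for illustration.

# Hedged sketch of reusing one measurement routine across simulation and
# bench data. Both data sources are faked here; in practice one would come
# from an EDA tool's simulation output and the other from an instrument.
import math

def error_percent(reference, measured):
    """Very simplified error-vector style figure of merit."""
    err = sum((r - m) ** 2 for r, m in zip(reference, measured))
    ref = sum(r ** 2 for r in reference)
    return 100 * math.sqrt(err / ref)

reference = [math.sin(0.1 * n) for n in range(100)]
simulated = [s * 0.98 for s in reference]            # pretend EDA output
captured  = [s * 0.95 + 0.01 for s in reference]     # pretend bench capture

print("simulation :", round(error_percent(reference, simulated), 2), "%")
print("bench      :", round(error_percent(reference, captured), 2), "%")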