The big data era demands more intelligent measurement systems

January 22nd, 2015, Published in Articles: EngineerIT


Elizabeth Dolman and Derrick Snyder, DAQ product managers for National Instruments.

As modern machines, vehicles, and structures continue to grow more complex, engineers require more advanced sensors, measurement systems, and data management infrastructures to stay relevant.

For example, monitoring the Large Hadron Collider or a four-engine jumbo jet requires thousands of sensor channels that generate hundreds of terabytes of data. With such high channel counts, these advanced systems benefit from the latest sensor technologies with plug-and-play operation, which cuts the setup time associated with manual configuration. To keep up with scientific innovation, engineers need smarter sensors and more intelligent, customisable measurement systems to support the massive amounts of data they collect and to integrate with a larger IT ecosystem. These new trends and challenges in embedded measurement require engineers to think beyond the sensor to efficiently produce and distribute meaningful results.

Smarter sensors

For decades, engineers have integrated traditional analogue sensors into their measurement systems, spending countless hours entering sensor configurations by hand. With today's advances in technology, a larger number and wider mix of sensors are required to properly test the functionality or structural integrity of new designs. For example, in-vehicle testing today must validate not only standard safety and drive performance but also automatic parallel parking, driver wakeup, blind-spot detection, and even infotainment systems. This adds a new level of complexity, with more sensors and new measurement types that must be integrated into one reliable, accurate system. As engineers face the challenge of testing these advancing technologies, they are expected to do so with the same or less time and money. To minimise costs, they must reuse measurement hardware from one test article to the next. With such a large number of mixed measurements, configuring and maintaining these sensors is a major pain point that is costly and prone to human error.

To overcome these challenges, engineering companies are adopting smarter sensors that reduce configuration time while increasing reliability and accuracy. Smart sensors are based on the IEEE 1451.4 Transducer Electronic Data Sheet (TEDS) standard, which defines how analogue sensors can inherit self-describing capabilities for simplified plug-and-play operation. The standard outlines a mixed-mode interface that adds a low-cost serial digital link to access a TEDS embedded in the sensor. At a minimum, the manufacturer, model number, and serial number are included, though other important attributes, such as the measurement range, sensitivity, temperature coefficients, and calibration data, are often stored as well. This is essentially everything engineers need to know to take measurements with the sensor. By storing data sheets electronically, engineers build more reliable systems through better sensor tracking and location identification: human error in wiring and data entry no longer affects the overall integrity of the system, because the sensor configuration is pulled directly from the sensor itself. In addition to increased reliability, TEDS systems offer higher accuracy, since critical calibration data can be stored on the sensor. Gain and offset errors due to factors such as temperature drift and system age can be compensated for using the custom calibration figures stored on the sensor. In short, TEDS sensors reduce the setup time associated with manual data entry, eliminate transcription errors that commonly occur during sensor configuration, and provide a more reliable, higher accuracy measurement system.
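To illustrate how self-describing sensors remove manual configuration, the sketch below models a small subset of the fields a TEDS might carry and applies the stored sensitivity to scale a raw reading into engineering units. The field names and values are simplified illustrations; the actual IEEE 1451.4 standard defines compact binary templates, not this structure.

```python
from dataclasses import dataclass

@dataclass
class Teds:
    """Illustrative subset of TEDS fields (simplified names; the real
    IEEE 1451.4 standard encodes these in a compact binary template)."""
    manufacturer: str
    model: str
    serial: str
    sensitivity_mv_per_unit: float  # e.g. mV/g for an accelerometer
    offset_mv: float = 0.0

def scale_reading(raw_mv: float, teds: Teds) -> float:
    """Convert a raw millivolt reading to engineering units using the
    calibration data pulled directly from the sensor itself."""
    return (raw_mv - teds.offset_mv) / teds.sensitivity_mv_per_unit

accel = Teds("AcmeSensors", "ACC-100", "SN1234",
             sensitivity_mv_per_unit=100.0)
print(scale_reading(250.0, accel))  # 2.5 (e.g. 2.5 g)
```

Because the sensitivity and offset travel with the sensor, swapping in a recalibrated unit changes the scaling automatically, with no data entry by the test engineer.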

Contextual data mining from sensors

While TEDS sensors have been around for years, their relevance is growing as tests become more complex and generate larger amounts of data. Data mining is the practice of using the contextual information saved along with data to search through and pare down large data sets into more manageable, applicable volumes. By storing raw data alongside its original context, or "metadata", it becomes easier to accumulate, locate, and later manipulate and understand. For example, consider a seemingly random string of digits: 5126838937. At first glance, it is impossible to make sense of this raw information. However, when given context – (512) 683-8937 – the data is easy to recognise and interpret as a phone number. Descriptive context about measurement data provides the same benefit, and can detail anything from the sensor type, location, manufacturer, or calibration date for a given measurement channel to the revision, designer, or model number of the overall component under test. The more context that is stored with raw data, the more effectively that data can be traced throughout the design life cycle, searched and located, and correlated with other measurements by dedicated data post-processing software. Today's intelligent measurement systems can automatically pull TEDS metadata from a compatible sensor and write it to fully searchable fields within measurement files. This makes it possible to mine large measurement data sets quickly, to find anything from files containing measurements taken with a particular sensor to files containing measurements from sensors that may need calibration.
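As a toy example of this kind of metadata-driven mining (the file names, field names, and dates below are invented for illustration), finding sensors that may need calibration reduces to a simple filter over the stored context rather than a search through the raw samples:

```python
from datetime import date

# Hypothetical measurement files with TEDS-derived metadata fields.
files = [
    {"file": "run_001.tdms", "sensor_model": "ACC-100",
     "calibration_date": date(2013, 6, 1)},
    {"file": "run_002.tdms", "sensor_model": "MIC-7",
     "calibration_date": date(2014, 11, 15)},
]

def needs_calibration(records, cutoff):
    """Mine the metadata: return files whose sensor calibration
    predates the cutoff date."""
    return [r["file"] for r in records if r["calibration_date"] < cutoff]

print(needs_calibration(files, date(2014, 1, 1)))  # ['run_001.tdms']
```

The raw waveforms never need to be opened; the contextual fields alone answer the question.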

More intelligent measurement systems

While sensors are getting smarter, embedded measurement systems are getting more intelligent to support these sensors and keep up with other industry demands. Test articles are growing more complex and yielding massive amounts of data, but engineers still need reliability and quick results. Though it is common to stream test data to a host PC over standard buses like USB and Ethernet, high-channel-count tests with fast sample rates can easily overload the communication bus. An alternative approach is to store data locally and transfer files for post-processing after a test is run, which increases the time to realise valuable results. To overcome these challenges, the latest measurement systems integrate leading technology from ARM, Intel, and Xilinx to offer increased performance and processing capabilities as well as high-end off-the-shelf storage components to provide high-throughput streaming to disk. With high-end onboard processors, the intelligence of measurement systems has become more decentralised by having processing elements closer to the sensor and the measurement itself. Modern data acquisition hardware, like the new stand-alone NI CompactDAQ system from National Instruments, includes high-performance multicore processors that can run acquisition software and processing-intensive analysis algorithms inline with the measurements. These intelligent measurement systems can analyse and deliver results more quickly without having to wait for large amounts of data to transfer. For long-term or high-speed applications, engineers may want to use the onboard intelligence to log data only under certain conditions, which optimises the system to use disk space more efficiently. Turnkey software tools are commonly used for simple applications, but for complete customisation to meet advanced requirements, engineers can use a text-based programming tool like Microsoft Visual Studio or a graphical programming approach like NI LabVIEW system design software.
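The conditional-logging idea mentioned above can be sketched in a few lines: keep only the samples around a threshold crossing and discard the rest, so disk space is spent on events rather than idle signal. This is a generic illustration of the technique, not NI's API; the threshold and window sizes are arbitrary.

```python
def conditional_log(samples, threshold, post_trigger=3):
    """Keep only the triggering sample plus `post_trigger` samples
    after each threshold crossing; discard everything else."""
    logged, remaining = [], 0
    for s in samples:
        if abs(s) >= threshold:
            remaining = post_trigger + 1  # trigger sample + window
        if remaining > 0:
            logged.append(s)
            remaining -= 1
    return logged

data = [0.1, 0.2, 5.0, 0.3, 0.2, 0.1, 0.05, 4.2, 0.4]
print(conditional_log(data, threshold=1.0))
# [5.0, 0.3, 0.2, 0.1, 4.2, 0.4]
```

On an onboard processor, a routine like this runs inline with the acquisition, so only the interesting windows ever reach the disk.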

Fig. 1: The stand-alone NI CompactDAQ system combined with NI LabVIEW software gives engineers the most flexible, intelligent, and accessible data-logging system available.

With intelligence distributed at the measurement location, engineers can automatically run analysis routines to yield test results faster, but the enhancements don’t stop there. The proliferation of mobile devices has led people to expect immediate access to information wherever they are. Engineers are not immune to this “want it now” mindset. Mobile devices help engineers access information such as measurement data and test results more quickly and conveniently than ever before. One example is in-vehicle testing on a proving ground during which the driver needs to see live measurements to generate quick results, know that the test is operating properly, reduce mistakes, and save time. A mobile device mounted on the dash of a vehicle is a convenient and effective way to view data in real time. The next generation of measurement systems provides flexible and powerful software tools that can integrate with mobile devices. Several suppliers have started to offer some type of support for mobile device integration – often with very fixed functionality. National Instruments, a long-time leader in DAQ innovation, is driving the next generation of embedded measurement systems with the stand-alone NI CompactDAQ system combined with LabVIEW software. This combination offers engineers the most flexible, intelligent, and accessible data-logging system available. With LabVIEW, engineers have complete flexibility to remotely visualise, monitor, and interact with measurement data from anywhere and on any device to make informed decisions faster.

Fig. 2: Engineers can integrate a mobile device with stand-alone NI CompactDAQ to monitor embedded measurement systems from anywhere.

Intelligent measurement systems and the cloud

The unification of measurement hardware and onboard intelligence has enabled increasingly embedded and remote systems and, in some industries, has paved the way for entirely new applications. In data acquisition systems that make many measurements, particularly when the measurement systems are geographically distributed, several unique data storage, aggregation, transmission, and system management challenges must be met. Because distributed acquisition and analysis nodes are effectively computer systems, with software drivers and images, often connected to several computer networks in parallel, remote network-based systems management tools are needed to automate their configuration, maintenance, and upgrades. Additionally, the volume of acquired measurement data, compounded by the proliferation of mobile devices and ubiquitous networks, is fuelling a growing need in global companies to offer access to many more data consumers than in the past. This requires network gear and data management systems that can accommodate multiuser access, which in turn drives the need to geographically distribute the data and its access.

A popular approach to providing this distributed system management and data access is cloud technologies, which generally provide the following benefits:

Aggregation of data: If the distance between the elements of a system is measured in kilometres rather than millimetres, engineers may want to consider cloud data storage. For example, if an engineer is monitoring the condition of every gearbox on a wind farm with hundreds of turbines, collecting the data can become extremely costly and cumbersome. With cloud storage, such systems can store data in a common location so that engineers can easily collect, analyse, and compare it.

Access to data: In some cases, the embedded data acquisition or monitoring system is difficult to access physically. For example, if engineers are monitoring the health of a pipeline in a remote stretch of Alaska, they ideally would not need to send a technician to log the information and check the status of the system. If that data is being stored to the cloud, they can access it from anywhere, including connected PCs and mobile devices.

Offloading: The near-infinite computing resources in the cloud give software an opportunity to offload computationally heavy tasks, from sophisticated image or signal processing to compilation and development.
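The offloading pattern itself is straightforward: partition the heavy work and hand it to a pool of workers. In the sketch below, a local thread pool stands in for remote cloud compute, and `heavy_analysis` is a placeholder for a real image- or signal-processing task; neither is part of any NI product.

```python
from concurrent.futures import ThreadPoolExecutor

def heavy_analysis(chunk):
    """Placeholder for a processing-intensive task (e.g. an FFT or
    image-processing step); here, a toy sum-of-squares 'energy'."""
    return sum(x * x for x in chunk)

def offload(chunks):
    """Fan the work out to a pool of workers. In a cloud deployment,
    the local pool would be replaced by requests to remote compute."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(heavy_analysis, chunks))

print(offload([[1, 2], [3, 4]]))  # [5, 25]
```

The embedded node stays responsive because it only partitions and dispatches; the expensive arithmetic happens elsewhere.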

Cloud computing is a new generation of computing that uses remote servers to provide services and storage accessed over the Internet. The NI Technical Data Cloud (TDC) is a high-availability cloud-based service designed to give engineers and scientists the ability to securely consolidate, store, and share measurement data and analysed results. TDC is a full-featured application housed in large, professionally administered third-party cloud data centres that are accessible from anywhere through RESTful or native LabVIEW APIs. By using LabVIEW, engineers can access data acquired with stand-alone NI CompactDAQ from anywhere in the world through mobile devices and cloud computing.
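As a rough sketch of what a RESTful upload might look like, the snippet below builds a JSON payload and prepares an HTTP POST. The endpoint URL and field names are hypothetical stand-ins, not the actual TDC API, which would also require authentication.

```python
import json
from urllib import request

def make_payload(sensor_id, values):
    """Package measurements as JSON (field names are illustrative)."""
    return json.dumps({"sensor": sensor_id, "values": values}).encode("utf-8")

def upload(payload, url="https://example.com/api/measurements"):
    """Prepare and send an HTTP POST. The URL is a placeholder; a
    real service would also need credentials in the headers."""
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

payload = make_payload("SN1234", [2.5, 2.6])
# upload(payload)  # not executed here: requires a live endpoint
```

Once measurements are in a common store like this, any authorised PC or mobile client can retrieve them with an ordinary HTTP GET.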

Contact Stephen Plumb, National Instruments, 011 805-8197,
