The standalone data acquisition system is a construct that has evolved quite a bit over the last thirty-plus years. This blog walks through its main changes over time and proposes an upcoming shift that all indications suggest will happen in the short term: one that favors low-cost solutions.
Before we start our trip back in time, let’s first define what a data acquisition system is. Taking the definition from the National Instruments website: “Data Acquisition is the process of measuring an electrical or physical phenomenon, such as voltage, current, temperature, pressure, or sound, with a computer. A DAQ system consists of sensors, DAQ measurement hardware and a computer with software.”
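The three pieces of that definition can be sketched in a few lines of code. Below is a toy Python sketch of my own, not from the NI definition, in which a simulated sine-wave sensor stands in for real hardware; the function names (`simulated_sensor`, `acquire`, `process`) are purely illustrative.

```python
import math

def simulated_sensor(t):
    """Stand-in for a real sensor: a 1 V, 5 Hz sine riding on a 2 V DC offset."""
    return 2.0 + math.sin(2 * math.pi * 5 * t)

def acquire(sensor, sample_rate_hz, num_samples):
    """Stand-in for the DAQ measurement hardware: sample the sensor at a fixed rate."""
    dt = 1.0 / sample_rate_hz
    return [sensor(i * dt) for i in range(num_samples)]

def process(samples):
    """The 'computer with software' part: reduce raw samples to one measurement."""
    return sum(samples) / len(samples)

# 200 samples at 1 kHz span exactly one 5 Hz period, so the mean recovers the DC offset.
samples = acquire(simulated_sensor, sample_rate_hz=1000, num_samples=200)
print(round(process(samples), 3))  # -> 2.0
```

A real system would replace `simulated_sensor` with a driver call into the acquisition hardware, but the sensor-hardware-software split stays the same.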
Using this definition, we can probably say that an instrument is also a data acquisition system, as it has some sort of sensor (probes), data acquisition hardware (the electronics inside the instrument) and a computer with software (the firmware that processes the acquired data). Using this last statement as a hook, let’s start our analysis by first visiting what I am calling here the era of the box instrument, thirty or so years ago. Standalone box instruments such as oscilloscopes, multimeters and logic analyzers dominated the benchtops of the vast majority of Test Engineers’ labs.
Then National Instruments started to gain momentum with its concept of virtual instrumentation. Being a former NI’er myself back in the late 90s, I was extremely excited to ride the hip wave of “the Software is the Instrument” pushed by NI through that decade. Virtual Instrumentation is what I am calling the second era of the modern standalone data acquisition system movement. In this scenario, the typical box instrument started to yield to the novel concept of a standalone data acquisition system consisting of a personal computer plus a data acquisition card.
As time went by, the increasing demands of data acquisition and control systems drove NI to come up with the CompactRIO platform. The main actor of this platform is the FPGA and all the reconfigurability it brings to a standalone data acquisition system. So much so that I am calling the third era the Reconfigurable IO era of the standalone data acquisition system.
Now, while all this was brewing, a phenomenal concept, initially unrelated to standalone data acquisition systems, was organically growing: the open source movement. Initially, the open source movement was driven by the Linux operating system, to the point that Linux and open source became almost synonymous back in the mid-2000s. More recently, however, the open source concept evolved to such a level that it was also extended to hardware.
Some incredible open source microcontroller-based platforms started to gain momentum, and later on, Linux-based single board computers started to pop up. At the time I am writing this blog (and possibly, literally, while I am writing it) some incredibly powerful single board computers are becoming available to the community. To cite one example, the Raspberry Pi 3 is a $35 single board computer with a quad-core ARM processor running Linux that can actually be used as a personal computer.
The Reign of the Low Cost Standalone Data Acquisition System is Coming
The last paragraph has set the stage for what I will call the fourth era of the standalone data acquisition system: the low cost era. One that takes advantage of the CPU horsepower being made available at lower and lower cost these days.
Two main points need to be made clear, though. First, in today’s world of test, measurement and data acquisition, it is almost impossible to say standalone data acquisition system without saying LabVIEW. LabVIEW has indeed become the de facto programming environment for test and measurement applications. Therefore, until LabVIEW is unlocked to run embedded on these lower cost single board computers, it would be futile to even start this discussion (wink, wink, hint, hint).
The second point to make clear is that, like everything else in the technical industry, there is no one-size-fits-all solution. If your application requires the horsepower and reconfigurability of an FPGA, the third-era platforms above are applicable and what you will want to use. However, it is always important to highlight that a great number of applications flat out don’t require all that power. Therefore, what I am proposing here is that a good chunk of standalone data acquisition system applications could indeed benefit from lower-cost, lower-horsepower options.
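To make the lower-cost option concrete, here is a hypothetical Python sketch of reading a voltage on a Raspberry Pi. It assumes an MCP3008 10-bit ADC wired to the Pi’s SPI bus and the `spidev` package; neither appears in the original text, and the helper names are mine. The SPI transfer is passed in as a callable so the framing and decoding logic can be exercised without hardware.

```python
def mcp3008_frame(channel):
    """Build the 3-byte SPI request for a single-ended read of the given channel (0-7)."""
    return [1, (8 + channel) << 4, 0]

def decode_reply(reply):
    """Extract the 10-bit conversion result from the 3-byte SPI reply."""
    return ((reply[1] & 0x03) << 8) | reply[2]

def read_voltage(xfer, channel, vref=3.3):
    """Read one sample. `xfer` is any callable that performs a full-duplex SPI
    transfer, e.g. spidev.SpiDev().xfer2 on a real Raspberry Pi."""
    raw = decode_reply(xfer(mcp3008_frame(channel)))
    return raw * vref / 1023.0

# On real hardware (assumption: spidev installed, MCP3008 on SPI bus 0, device 0):
#   import spidev
#   spi = spidev.SpiDev()
#   spi.open(0, 0)
#   spi.max_speed_hz = 1_000_000
#   print(read_voltage(spi.xfer2, channel=0))
```

Whether this modest setup is “enough” is exactly the trade-off being argued here: no FPGA, no deterministic timing, but a complete sensor-to-measurement path for a few tens of dollars.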
This concept is what motivated the creation of a compiler that unlocks LabVIEW code to run embedded on lower cost platforms. That compiler, which first targeted the Arduino platform and later the Raspberry Pi, is being leveraged in the creation of low cost hardware products (a non-real-time one and a real-time one) that can be used as standalone data acquisition system platforms fitting this fourth era.
With the ever-growing demand for lower cost products and solutions across all industries, from both consumers and companies, let’s stand on the shoulders of the giants who created, and continue to create, new low cost tools, and transform the test and measurement industry: an industry that has become so pivotal to the overall progress of our modern society.