Data gathering continued to get easier with later developments such as LabVIEW, software launched by U.S. firm National Instruments in the mid-1980s. The software marked a significant shift in data acquisition, moving away from dedicated hardware toward greater use of software, and allowing users to easily compare and analyze data and manipulate graphical results. Because the software ran on a PC, it also reduced the size and cost associated with previous data acquisition methods.
But as impressive as these early systems were, they were limited in the volume of data they could gather, collate and analyze. Today, the IoT has enabled the amount of data we collect to grow by many orders of magnitude. We can sense almost anything, anywhere, and then wirelessly transmit the data to powerful remote cloud servers. This has opened up opportunities for new applications that previously would not have been possible.
Combining the data from multiple sensors provides greater insights and granularity than input from a single device. For example, in building monitoring applications, if the temperature rises while the humidity decreases and the carbon dioxide content in the air increases, it could indicate a fire. Or suppose the sensors on rotating machinery detect more vibration, an increase in energy usage, and a decrease in air quality (due to lubricant fumes); together, those readings are a good indication that a bearing is about to fail.
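Sensor-fusion rules like these reduce to simple cross-checks over several channels. The sketch below illustrates the idea in C; the threshold values and function names are illustrative assumptions, not from the article, and a real deployment would calibrate them per site.

```c
#include <stdbool.h>

/* Illustrative thresholds (assumptions, not from the article) --
 * real systems would tune these per installation. */
#define TEMP_RISE_C        5.0   /* temperature increase, degrees C   */
#define HUMIDITY_DROP_PCT 10.0   /* relative humidity drop, percent   */
#define CO2_RISE_PPM     400.0   /* CO2 increase, parts per million   */

/* Fire heuristic from the text: temperature up, humidity down,
 * CO2 up -- all three at once. Inputs are deltas from baseline. */
bool possible_fire(double d_temp, double d_humidity, double d_co2)
{
    return d_temp > TEMP_RISE_C
        && d_humidity < -HUMIDITY_DROP_PCT
        && d_co2 > CO2_RISE_PPM;
}

/* Bearing-wear heuristic from the text: vibration up, energy use up,
 * air quality down (lubricant fumes). */
bool possible_bearing_wear(double d_vibration, double d_energy,
                           double d_air_quality)
{
    return d_vibration > 0.0
        && d_energy > 0.0
        && d_air_quality < 0.0;
}
```

The point is that no single channel is conclusive on its own; only the combination of deltas triggers an alert, which keeps false positives down.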
However, more sensors mean more data to deal with, and more data to send to a central server if the sensors are located remotely. That challenge is becoming ever more demanding as the IoT spreads its influence.
The sheer volume of data the IoT generates is both a blessing and a curse. That data contains vital information, but the hard part is finding that information in a seemingly endless stream of bits. Previously, everything would be sent to a cloud server for remote analysis, but engineers quickly realized that the IoT's large volume of data would consume significant bandwidth and cost a great deal of energy and money (particularly if throughput was metered).
The solution is to distribute the computing intelligence so that data can be analyzed locally and only passed over the wireless link if deemed important. Engineers call it “edge processing.” Nordic’s nRF9160 SiP is a perfect example of this in action. Using its powerful Arm Cortex-M33 application processor and large memory allocation, the SiP can collect and store data from many sensors, sift and analyze for the key information, and then transmit those details across a kilometer-range cellular IoT link. Moreover, the nRF9160 is engineered for low-power operation to minimize energy costs and extend battery life.
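One common way to implement "send only if important" is to track a running estimate of each sensor channel and transmit only when a reading deviates significantly from it. The following C sketch shows this pattern under assumed names and parameters; it is an illustration of the technique, not code from the nRF Connect SDK.

```c
#include <math.h>
#include <stdbool.h>

/* Hypothetical edge filter (names and parameters are assumptions):
 * keep an exponential moving average of a sensor channel and flag
 * only readings that deviate beyond a band, so the radio carries a
 * small fraction of the raw samples. */
struct edge_filter {
    double mean;   /* running estimate of the channel        */
    double alpha;  /* smoothing factor, 0..1                 */
    double band;   /* deviation that counts as "important"   */
    bool primed;   /* has the estimate been seeded yet?      */
};

/* Returns true if this sample should be sent over the link. */
bool edge_filter_should_send(struct edge_filter *f, double sample)
{
    if (!f->primed) {          /* first sample seeds the estimate */
        f->mean = sample;
        f->primed = true;
        return true;           /* always report the initial state */
    }
    bool important = fabs(sample - f->mean) > f->band;
    f->mean += f->alpha * (sample - f->mean);  /* update estimate */
    return important;
}
```

With, say, a 2-degree band on a temperature channel, small drift is absorbed locally while a sudden jump is reported immediately, which is exactly the bandwidth-and-energy trade the article describes.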
Powerful cores like the nRF9160’s Arm Cortex-M33 processor enable machine learning (ML) models to be deployed on the edge device. ML will be key to tomorrow’s IoT because it allows edge processing devices to continually improve their analytical performance as models are refined. Nordic has collaborated with Edge Impulse, a U.S.-based ‘tinyML’ specialist, and offers ML reference samples in its nRF Connect SDK (Software Development Kit). These samples can be run on the nRF9160, allowing developers to experiment with and test embedded ML models.
The nRF9160’s multimode modem with GNSS provides developers with a high level of edge processing flexibility. The device offers NB-IoT and LTE-M cellular connectivity. While both are supported by established cellular infrastructure, each offers different advantages. NB-IoT provides greater range and penetration into buildings and is best suited for stationary applications such as smart meters. LTE-M offers up to ten times greater throughput and seamless cell handover, but with a shorter range, meaning it is better suited to moving devices, such as asset trackers or wearables.
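In practice, the choice between the two modes is a one-line configuration. On the nRF9160 the system mode is set with the AT%XSYSTEMMODE command (enable flags for LTE-M, NB-IoT and GNSS, plus a preference); the helper below is a minimal sketch of picking a mode from the mobility of the device, and the exact parameter semantics should be confirmed against Nordic's AT command reference.

```c
#include <stdbool.h>

/* Minimal sketch (helper name is an assumption): choose the modem
 * system mode for an nRF9160 based on whether the device moves.
 * AT%XSYSTEMMODE takes <LTE-M>,<NB-IoT>,<GNSS>,<preference> flags;
 * verify details in Nordic's AT command documentation. */
const char *select_system_mode(bool device_is_mobile)
{
    return device_is_mobile
        ? "AT%XSYSTEMMODE=1,0,1,0"   /* LTE-M + GNSS: trackers, wearables */
        : "AT%XSYSTEMMODE=0,1,0,0";  /* NB-IoT: stationary smart meters   */
}
```

A mobile asset tracker gets LTE-M's higher throughput and cell handover along with GNSS positioning, while a fixed smart meter gets NB-IoT's superior range and building penetration.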
Ashridge Engineering’s CharIoT data logger, for example, is designed for use in the water industry, handling leak detection, flow monitoring and pressure management. It uses the nRF9160 to collect data, which is then securely transmitted to a proprietary cloud portal via a cellular IoT network. Thanks to the low power consumption enabled by the SiP, the product’s battery can last up to ten years. Ashridge noted the benefit of the nRF9160’s integrated form factor, which combines the application processor and modem in the same package, unlike competing modems that require a separate processor.
Similarly, TYMIQ’s Prylada IoT Gateway uses the nRF9160 SiP for applications including sensor monitoring, asset tracking and smart building automation. It collects telemetry data from a wide variety of sensors, such as temperature, humidity, light level, air quality, movement detection and more, then sends it to the cloud using LTE-M connectivity. This device measures only 171 by 121 by 55 mm, so it can easily fit in a wide variety of locations.
The gateway is suitable for applications from smart cities to big data centers, and delivers the advantage of cost-effective, energy-efficient wireless data transmission, says Dzmitry Tsybulka, CEO of the company. He added that the solution brings value to any company needing a reliable wireless solution for remote asset monitoring.
And the SiP’s cellular IoT connectivity ensures the vital information distilled from that data can be reliably and securely sent anywhere in the world using established infrastructure.