
AI and machine learning go mainstream

Written by Kjetil Holstad | December 4, 2024

In my last blog on AI and Machine Learning (ML), I emphasized that AI and ML would transform the IoT not in ten or even five years from now, but in products on roadmaps coming to market within the next couple of years. I'd like to 'up' the urgency on that prediction. AI and ML aren't a couple of years away; they are here. Not a tool of the future, but a dynamic force driving innovation and change now.

Any question about whether AI and 'Edge AI' (where AI and ML are performed locally rather than sending data up to the Cloud) will play a big role in the IoT has been dispelled.

The ML and TinyML difference for Edge AI

When we talk about AI in the IoT, I want to make it clear that, for the foreseeable future, we are really talking about ML and the stripped-down TinyML variant. 

ML adopts a model-based approach that relies on learning by trial and error from captured data. There's a learning component ('training') that happens where compute power is abundant, such as in the Cloud, and an on-the-fly predictive component ('inferencing') that in many cases is so lightweight it can run on small, embedded devices.

As more data is gathered, the training can be repeated and the deployed predictive model updated, making it more accurate over time. In this way, any attached sensor continuously gets better at monitoring whatever it is supposed to be monitoring.
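
To make that split a little more concrete, here is a minimal sketch of the workflow, assuming TensorFlow as the toolchain; the dataset, model size, and file name are placeholders rather than Nordic-specific code. Training runs on a desktop or in the Cloud, and the output is a small, quantized model that an inference-only runtime such as TensorFlow Lite for Microcontrollers can execute on a constrained device.

```python
# Minimal sketch of the training/inferencing split (TensorFlow assumed).
# Training happens where compute is plentiful; only the small, quantized
# model produced at the end needs to run on the battery-powered edge device.
import numpy as np
import tensorflow as tf

# Placeholder training data: windows of sensor samples plus labels.
x_train = np.random.rand(1000, 128).astype(np.float32)
y_train = np.random.randint(0, 2, size=(1000,))

# 'Training': a deliberately tiny model that an MCU-class device can handle.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x_train, y_train, epochs=5, verbose=0)

# 'Inferencing': convert to a quantized TensorFlow Lite flatbuffer that an
# inference-only runtime on the device can execute.
def representative_data():
    for sample in x_train[:100]:
        yield [sample.reshape(1, -1)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
tflite_model = converter.convert()

# Re-running this script on freshly gathered data produces an updated model
# that can be pushed to deployed devices, so predictions keep improving.
with open("sensor_model.tflite", "wb") as f:
    f.write(tflite_model)
```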

AI in the IoT also needs to be performed as close to the sensors as possible, all the way out to the edge, on devices that are increasingly capable in terms of compute power, but that will still, by and large, be battery powered, and therefore far more resource-constrained than anything plugged into the mains power grid.

Three types of AI learning for the IoT 

We are also primarily talking about three types of IoT AI: supervised learning, reinforcement learning, and unsupervised learning. And the three main use cases are the 'three Vs', as Arm coined the term: vibration (anomaly detection and movement classification), voice (using audio recognition as a sensor), and vision (using image recognition as a sensor).

The only difference between them is how much the ML model can learn on its own, ranging from very little (supervised) all the way through to autonomous learning (unsupervised) when given the right model to start with. Right now, ML can be run locally at the edge on IoT platform devices like Nordic's latest nRF54 Series. You can optimize ML on a constrained device by using ML accelerators, achieving performance that can rival much more powerful devices and far outperform traditional MCUs. Once you have trained the basic model, which will typically take a day on a desktop PC, the inferencing side can be accelerated hugely using AI-optimized algorithms.
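
To make the 'vibration' use case and the idea of unsupervised learning a little more concrete, below is an illustrative sketch that learns only what 'normal' vibration looks like from unlabeled accelerometer windows and flags windows that deviate from that profile. The features, statistical model, and threshold are assumptions chosen for clarity, not a Nordic algorithm; a production design would typically use a trained neural network running on the device's ML accelerator.

```python
# Illustrative sketch of unsupervised vibration anomaly detection.
# Features, model, and threshold are assumptions, not a Nordic algorithm.
import numpy as np

def features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of accelerometer magnitudes as a small feature vector."""
    return np.array([
        window.mean(),                   # DC level
        window.std(),                    # overall vibration energy
        np.abs(np.diff(window)).mean(),  # roughness / high-frequency content
    ])

class VibrationAnomalyDetector:
    """Unsupervised: learns only what 'normal' looks like; no labels needed."""

    def fit(self, normal_windows):
        feats = np.array([features(w) for w in normal_windows])
        self.mean = feats.mean(axis=0)
        self.std = feats.std(axis=0) + 1e-9
        return self

    def score(self, window) -> float:
        # Distance, in standard deviations, from the learned 'normal' profile.
        z = (features(window) - self.mean) / self.std
        return float(np.sqrt((z ** 2).mean()))

# Learn from windows captured while the machine behaves normally...
rng = np.random.default_rng(0)
normal_windows = [rng.normal(0.0, 1.0, 256) for _ in range(200)]
detector = VibrationAnomalyDetector().fit(normal_windows)

# ...then flag new windows that deviate (the threshold is a tuning choice).
faulty_window = rng.normal(0.0, 3.0, 256)
print("normal score:", detector.score(normal_windows[0]))  # around 1
print("faulty score:", detector.score(faulty_window))      # much higher
print("anomaly detected:", detector.score(faulty_window) > 3.0)
```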

Nordic’s nRF54 Series was built for running optimized ML on constrained edge devices in just this way. It is around 25 percent more powerful than market-leading microprocessors and consumes 20 percent less energy than the lowest-energy microprocessors in AI and ML edge computing applications.

This means Nordic has made the advantages of edge computing, AI, and ML accessible to the broadest range of customers and applications.

The edge computing, AI and ML advantage 

There are five primary advantages for customers. The first is latency. If you need real-time or near real-time responsiveness, you must compute and make decisions locally; a round trip to the Cloud simply takes too long. Keeping AI and ML at the edge also makes decision making quicker.

The next benefit is bandwidth. Edge computing reduces the reliance on continuous network or Cloud connectivity, which incurs huge data overheads and data costs.
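
As a rough, back-of-the-envelope illustration (the sample rate, payload sizes, and event count below are assumptions, not measurements), compare streaming raw sensor data to the Cloud with transmitting only the results of inference performed at the edge:

```python
# Back-of-the-envelope: stream raw samples vs. send edge-inference results.
# All figures are illustrative assumptions, not measurements.
SAMPLE_RATE_HZ = 1_000   # accelerometer sampled at 1 kHz
BYTES_PER_SAMPLE = 6     # three axes x 16-bit values
EVENTS_PER_DAY = 50      # anomalies actually worth reporting
BYTES_PER_EVENT = 32     # timestamp + classification + confidence score

raw_bytes_per_day = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * 60 * 60 * 24
edge_bytes_per_day = EVENTS_PER_DAY * BYTES_PER_EVENT

print(f"raw streaming : {raw_bytes_per_day / 1e6:.0f} MB/day")    # ~518 MB/day
print(f"edge inference: {edge_bytes_per_day:,} bytes/day")        # 1,600 bytes/day
print(f"reduction     : ~{raw_bytes_per_day / edge_bytes_per_day:,.0f}x")  # ~324,000x
```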

The third benefit is privacy. Transmitting sensitive or personal data to the Cloud raises privacy concerns. Local processing and storage at the edge mean better control over data confidentiality, which helps minimize the potential for security breaches.

The fourth benefit of operating at the edge is cost. Cloud-based AI is significantly more expensive because you pay to transmit large volumes of data to and from the Cloud, and then pay for all that Cloud computing capability on top. Processing at the edge, in comparison, is free of such recurring operating costs.

The fifth, and perhaps most important, benefit is energy efficiency. The world simply does not have enough resources for everything to be run in the Cloud. Smarter edge devices minimize power consumption, making them more environmentally friendly and beneficial in combating climate change.

Ultra-low power connectivity leadership

No company in the world does ultra-low power connectivity like Nordic. And no SoC does energy-efficient edge AI computing like the Nordic nRF54 Series. The latest advances in AI and ML acceleration that Nordic’s Atlazo acquisition has brought to our internal R&D capabilities will only further extend our leadership in edge AI and ML. 

In my view, this development signifies a pivotal transformation for Nordic, elevating its identity far beyond that of just a wireless connectivity or Bluetooth company within the IoT landscape to that of a seamless integration platform for AI in the IoT.