An Introduction to Tiny Machine Learning
Learn about TinyML, its applications and benefits, and how you can get started with this
emerging field of machine learning.
Machine learning models play a prominent role in our daily lives, whether we know it or not.
Over the course of a typical day, the odds are that you will interact with at least one machine
learning model, since they have permeated almost all the digital products we use: social media
services, virtual personal assistants, search engines, and the spam filtering provided by your
email hosting service.
Despite the many instances of machine learning in daily life, there are still several areas
the technology has yet to reach. The cause? Many machine learning models, especially
state-of-the-art (SOTA) architectures, require significant resources. This demand for high-
performance computing power has confined many machine learning applications to the
cloud, a provider of on-demand computing resources.
One example of such resource-hungry models is neural networks. Neural networks belong to a
subfield of machine learning known as deep learning, whose models are typically more
expensive to train than traditional machine learning models. You can learn more
about building neural network models in R in a separate tutorial.
TinyML’s growth in recent years has largely been attributed to the development of the
hardware and software ecosystems that support it. Because its techniques can be implemented
in low-energy systems (e.g., sensors and microcontrollers), machine learning can be brought
to the extreme edge, enabling such applications to respond in real time. In essence, the idea
is to enable machine learning practitioners to do more with less.
But why is this so important? Let’s take a look at why TinyML is appealing.
1. Low latency: Because the model runs on the edge device itself, data does not have to travel to a
server and back for inference, which reduces the latency of the output.
2. Energy savings: Microcontrollers need a very small amount of power, which enables them to operate
for long periods without needing to be charged. On top of that, extensive server infrastructure is not
required because no information transfer occurs; the result is energy, resource, and cost savings.
3. Reduced bandwidth: Little to no internet connectivity is required for inference. On-device sensors
capture data and process it locally, so raw sensor data is not constantly streamed to a server.
4. Data privacy: Because the model runs on the edge, your data is not kept on servers. Not transferring
information to servers strengthens the guarantee of data privacy.
Computer vision, visual wake words, keyword spotting, predictive maintenance, gesture
recognition, and industrial machine maintenance are all common TinyML use cases. Let’s
also take a look at some industries where TinyML has been used to power applications:
Agriculture
Real-time agriculture and livestock data can be monitored and collected using TinyML
devices. Imagimob, a Swedish edge AI company, has created a development platform for
machine learning on edge devices. Fifty-five organizations from across the European Union
have collaborated with Imagimob to explore how TinyML can support efficient management
of crops and livestock.
Customer Experience
Personalization is a key marketing tool that customers increasingly demand as their expectations
rise. The idea is for businesses to understand their customers better and target them with ads and
messages that resonate with their behavior. Deploying TinyML applications at the edge enables
businesses to comprehend user contexts, including their behavior.
Workflow Requirements
Many tools and architectures deployed in traditional machine learning workflows are used
when building edge-device applications. The main difference is that TinyML allows these
models to perform various functions on smaller devices.
TensorFlow Lite for Microcontrollers (TF Lite Micro) is one of the most popular
frameworks for machine learning on edge devices; it was specifically designed to run
machine learning on embedded systems with only a few kilobytes of memory.
Python is often the preferred language for building machine learning models. With
TensorFlow Lite, a model trained in Python can be converted to a compact format and run
through runtimes such as C++ or Java, so it can be deployed without connecting to the internet.
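As a minimal sketch of that workflow, the Python snippet below trains a deliberately tiny Keras model and converts it to a fully int8-quantized .tflite file of the kind TF Lite Micro consumes. The toy dataset, model architecture, and file name are placeholders chosen for the example, not details from the article.

```python
import numpy as np
import tensorflow as tf

# Toy data standing in for sensor readings (placeholder for a real dataset).
x_train = np.random.rand(1000, 8).astype(np.float32)
y_train = (x_train.sum(axis=1) > 4.0).astype(np.float32)

# A deliberately tiny model so the exported file fits in a few kilobytes.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)

# Convert to TensorFlow Lite with full integer (int8) quantization,
# which most microcontroller targets expect.
def representative_dataset():
    for sample in x_train[:100]:
        yield [sample.reshape(1, 8)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Exported model size: {len(tflite_model)} bytes")
```

On the device side, the resulting file is typically embedded as a C array (for example with `xxd -i model.tflite`) and executed by the TF Lite Micro interpreter in C++.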
From a hardware perspective, a supported microcontroller board is required to get started with
TinyML in TF Lite; the library currently supports the following microcontrollers:
Arduino Nano 33 BLE Sense
SparkFun Edge
Espressif ESP32-DevKitC
Espressif ESP-EYE
Wio Terminal: ATSAMD51
Himax WE-I Plus EVB Endpoint AI Development Board
With the support of TinyML, it is possible to increase the intelligence of the billions of devices we
use every day, like home appliances and IoT gadgets, without spending a fortune on
expensive hardware or relying on internet connections, which are frequently constrained by
bandwidth and power and introduce significant latency.
Learning Resources
TinyML Foundation
TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power
Microcontrollers (Book)
Embedded Machine Learning on Edge Devices (Podcast)
Understanding Machine Learning
Introduction to Deep Learning in Python
Deep Learning Tutorial
Wrap-up
TinyML has been gaining traction across various industries in recent years thanks to the
development of hardware and software ecosystems that support it. The approach makes it
possible to implement machine learning models in low-energy systems, like microcontrollers,
which opens the door to a variety of new opportunities. Low latency, energy savings, data privacy,
and no dependence on connectivity are some of the factors that make TinyML so appealing to
developers building applications for Internet of Things (IoT) devices.
TinyML And Its ‘Great’ Application in IoT Technology
Tiny machine learning (TinyML) is an embedded software technology used to build
low-power devices that run machine learning models. It is often referred to as the
missing link between device intelligence and edge hardware. It makes computing at the
edge cheaper and more stable, and it also facilitates improved response times, privacy,
and low energy costs.
TinyML is growing massively in popularity with every passing year. As per ABI Research,
a global tech market advisory firm, about 230 billion devices will be shipped with a
TinyML chipset by 2030.
TinyML can power a range of applications, from micro-satellite imagery and
wildfire detection to identifying crop ailments and animal illness. Another area of
application that is drawing great attention is IoT devices.
There are currently around 250 billion microcontrollers in the world, and that number is
growing by 30 billion annually. TinyML is becoming pervasive because, first, it gives
these small devices the ability to make smart decisions without needing to send data to the
cloud, and second, TinyML models are small enough to fit into almost any environment.
Take the example of an imagery micro-satellite, which must capture high-resolution
images but is restricted by the size and number of photos it can transmit back to Earth.
With TinyML, the micro-satellite captures an image only when it detects an object of
interest, such as a ship or a weather pattern.
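To make that on-device filtering idea concrete, here is a minimal Python sketch using the TensorFlow Lite interpreter. The model file name (ship_detector.tflite), the simulated camera frames, and the score threshold are hypothetical placeholders rather than details from the article.

```python
import numpy as np
import tensorflow as tf

# Load a previously converted TinyML model (hypothetical file name).
interpreter = tf.lite.Interpreter(model_path="ship_detector.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

def frame_is_interesting(frame: np.ndarray, threshold: float = 0.8) -> bool:
    """Run on-device inference and decide whether the frame is worth transmitting."""
    interpreter.set_tensor(input_details["index"], frame.astype(input_details["dtype"]))
    interpreter.invoke()
    score = float(interpreter.get_tensor(output_details["index"]).squeeze())
    return score >= threshold

# Simulated capture loop: only "transmit" frames the model flags as interesting.
for _ in range(10):
    frame = np.random.rand(*input_details["shape"]).astype(np.float32)  # stand-in for a captured image
    if frame_is_interesting(frame):
        print("Object of interest detected - queueing image for downlink")
```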
TinyML has the potential to transform the way we deal with IoT data, as billions of
tiny devices are already used to provide greater efficiency in fields such as medicine,
automation, and manufacturing.
It is very important to make a clear distinction between ‘serving’ machine learning to IoT
devices and ‘developing’ machine learning inside them. In the former, machine learning
tasks are outsourced to the cloud while the IoT device waits for the intelligent service to
be executed; in the latter, TinyML-as-a-service is employed and the IoT device itself takes
part in executing the service. TinyML thus represents a connecting point between IoT
devices and machine learning.
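That contrast can be sketched in a few lines of Python. The endpoint URL and the local model file below are hypothetical, and the cloud path is included only to highlight the network round trip that the TinyML path avoids.

```python
import numpy as np
import requests
import tensorflow as tf

sensor_reading = np.random.rand(1, 8).astype(np.float32)  # stand-in for real sensor data

# "Serving" ML to IoT: the device ships raw data to a cloud endpoint and waits for the result.
def cloud_inference(data: np.ndarray) -> float:
    response = requests.post("https://example.com/predict",  # hypothetical endpoint
                             json={"features": data.tolist()})
    return response.json()["score"]

# "Developing" ML inside the device: inference runs locally on a TinyML model, no network needed.
local_interpreter = tf.lite.Interpreter(model_path="model.tflite")  # hypothetical file
local_interpreter.allocate_tensors()

def on_device_inference(data: np.ndarray) -> float:
    inp = local_interpreter.get_input_details()[0]
    out = local_interpreter.get_output_details()[0]
    local_interpreter.set_tensor(inp["index"], data.astype(inp["dtype"]))
    local_interpreter.invoke()
    return float(local_interpreter.get_tensor(out["index"]).squeeze())
```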
The hardware requirements for machine learning in larger systems have analogues in TinyML
for smaller IoT devices. As the number of IoT devices hitting the market increases, we could
see even higher investment in TinyML research, exploring concepts such as
deep neural networks, model compression, and deep reinforcement learning.
The Challenges
Integrating TinyML into IoT devices still presents a few challenges.
TinyML is gaining ground but is still at a very nascent stage. It is expected to expand
rapidly into cross-sector applications very soon.