Get Started With Machine Learning

Learn how to train and use machine learning models with the Arduino Nano 33 BLE Rev2

This post was originally published by Sandeep Mistry and Dominic Pajak on the TensorFlow blog and uses the Arduino Nano 33 BLE Sense Rev2. This revision of the tutorial is adapted to the Arduino Nano 33 BLE Rev2 which comes without a microphone and therefore leaves out the speech recognition.

Introduction

Important notice! The TensorFlow Lite Micro Library is no longer available in the Arduino Library Manager. This library will need to be manually downloaded and included in your IDE.

In this article, we’ll show you how to install and run several TensorFlow Lite Micro examples.

We’ll talk about how you can train your custom gesture recognition model for Arduino using TensorFlow in Colab. This material is based on a practical workshop held by Sandeep Mistry and Don Coleman, an updated version of which is online here.

If you have previous experience with Arduino, you may be able to get these tutorials working within a couple of hours. If you’re entirely new to microcontrollers, it may take a bit longer.

Training your own gesture classification model.

We’re excited to share some examples and tutorials, and to see what you will build from here. Let’s get started!

Note: The following projects are based on TensorFlow Lite for Microcontrollers.

Goals

  • Capture gesture training data from the board's onboard IMU and log it as CSV over USB
  • Train a gesture classification model in TensorFlow using Google Colab
  • Convert the trained model to TensorFlow Lite and deploy it to the board to classify gestures on-device

Hardware & Software Needed

  • Arduino Nano 33 BLE Rev2 board
  • USB cable to connect the board to your computer
  • Arduino IDE with the Nano 33 BLE board package installed
  • TensorFlow Lite Micro library (manually downloaded; see the notice above)
  • Google Colab (runs in the browser) for training the model

The Arduino Nano 33 BLE Rev2 has a 9-axis IMU (accelerometer, gyroscope, magnetometer) onboard, which gives it some potential for cool TinyML applications.

Unlike the classic Arduino UNO, the board combines a microcontroller with onboard sensors, which means you can address many use cases without additional hardware or wiring. The board is also small enough to be used in end applications like wearables. As the name suggests, it has Bluetooth® Low Energy connectivity so you can send data (or inference results) to a laptop, mobile app or other Bluetooth® Low Energy boards and peripherals.

Tip: Sensors on a USB stick – Connecting the BLE board over USB is an easy way to capture data and add multiple sensors to single-board computers without the need for additional wiring or hardware – a nice addition to a Raspberry Pi, for example.
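For example, a few lines of Python on the host are enough to log whatever the board prints over serial. This is a sketch only: it assumes the pyserial package is installed, and the port name is an example that will differ per system.

import serial

# open the board's serial port (example name; on Linux it is often
# /dev/ttyACM0, on macOS something like /dev/cu.usbmodem14101)
with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
    with open("sensorlog.csv", "a") as log:
        while True:
            line = port.readline().decode("utf-8", errors="ignore")
            if line:
                log.write(line)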

Microcontrollers and TinyML

Microcontrollers, such as those used on Arduino boards, are low-cost, single-chip, self-contained computer systems. They’re the invisible computers embedded inside billions of everyday gadgets like wearables, drones, 3D printers, toys, rice cookers, smart plugs, e-scooters, and washing machines. The trend to connect these devices is part of what is referred to as the Internet of Things.

Arduino is an open-source platform and community focused on making microcontroller application development accessible to everyone. The board we’re using here has an Arm® Cortex®-M4 microcontroller running at 64 MHz with 1 MB Flash memory and 256 kB of RAM. This is tiny in comparison to cloud, PC, or mobile but reasonable by microcontroller standards.

The Arduino Nano 33 BLE Rev2 board comes in a tiny form factor.

There are practical reasons you might want to squeeze ML on microcontrollers, including:

  • Function – wanting a smart device to act quickly and locally (independent of the Internet).
  • Cost – accomplishing this with simple, lower-cost hardware.
  • Privacy – not wanting to share all sensor data externally.
  • Efficiency – smaller device form factor, energy-harvesting or longer battery life.

There’s a final goal that we’re building towards that is very important:

  • Machine learning can make microcontrollers accessible to developers who don't have a background in embedded development.

On the machine learning side, there are techniques you can use to fit neural network models into memory-constrained devices like microcontrollers. One of the key steps is the quantization of the weights from floating point to 8-bit integers. This also has the effect of making inference quicker to calculate and more applicable to lower clock-rate devices.
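As a rough illustration of that step (assuming a trained Keras model named model – this is not code from this tutorial's notebook), post-training quantization in TensorFlow is a converter option:

import tensorflow as tf

# convert a trained Keras model to TensorFlow Lite; Optimize.DEFAULT
# lets the converter quantize the float32 weights down to 8-bit integers
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# write the flatbuffer to disk
with open("model.tflite", "wb") as f:
    f.write(tflite_model)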

TinyML is an emerging field and there is still work to do – but what’s exciting is there’s a vast unexplored application space out there. Billions of microcontrollers combined with all sorts of sensors in all sorts of places can lead to some seriously creative and valuable TinyML applications in the future.

TensorFlow Lite for Microcontrollers Examples

The inference examples for TensorFlow Lite for Microcontrollers are packaged with the library, so once the library is added to your IDE (see the notice above) you can include and run them in a few clicks. In this section, we'll show you how to run them. The examples are:

  • magic_wand – gesture recognition using the onboard IMU
  • person_detection – person detection using an external ArduCam® camera

For more background on the examples, you can take a look at the source in the TensorFlow repository. The models in these examples were previously trained. The tutorials below show you how to deploy and run them on an Arduino board. In the next section, we’ll discuss training.

How To Run The Examples Using the Arduino IDE

Follow the instructions in the next section to set up the Arduino IDE.

In the Arduino IDE, you will see the examples available via the File > Examples > Arduino_TensorFlowLite menu.

Select an example and the sketch will open. To compile, upload, and run the example on the board, click the arrow icon:

For advanced users who prefer a command line, there is also the arduino-cli.

Training a TensorFlow Lite Micro Model For Arduino

Gesture classification on Arduino Nano 33 BLE Rev2, output as emojis.

Next, we will use ML to enable the Arduino board to recognize gestures. We’ll capture motion data from the Arduino Nano 33 BLE Rev2 board, import it into TensorFlow to train a model, and deploy the resulting classifier onto the board.

The idea for this tutorial was based on Charlie Gerard's awesome Play Street Fighter with body movements using Arduino and Tensorflow.js. In Charlie's example, the Arduino board streams all of its sensor data to another machine, which performs the gesture classification in TensorFlow.js. We take this further and "TinyML-ify" it by performing gesture classification on the Arduino board itself. This is made easier in our case as the Arduino Nano 33 BLE Rev2 board we're using has a more powerful Arm® Cortex®-M4 processor and an onboard IMU.

We’ve adapted the tutorial below, so no additional hardware is needed – the sampling starts once movement is detected. The original version of the tutorial adds a breadboard and a hardware button to press to trigger sampling. If you want to get into a little hardware, you can follow that version instead.

IDE Setup

1. First, let's make sure you have the Nano 33 BLE board package installed.

2. Also, let's make sure we have all the libraries we need installed.

Important notice! The TensorFlow Lite Micro Library is no longer available in the Arduino Library Manager. This library will need to be manually downloaded and included in your IDE.

There are more detailed Getting Started and Troubleshooting guides on the Arduino site if you need help.

Streaming Sensor Data From the Arduino Board

First, we need to capture some training data. You can capture sensor data logs from the Arduino board over the same USB cable you use to program the board from your laptop or PC.

Arduino boards run small applications (also called sketches) which are compiled from .ino format Arduino source code, and programmed onto the board using the IDE or Cloud Editor.

The sketch we are creating will do the following:

  • Monitor the board’s accelerometer and gyroscope
  • Trigger a sample window on detecting significant linear acceleration of the board
  • Sample for one second at 119 Hz, outputting CSV format data over USB
  • Loop back and monitor for the next gesture

The sensors we choose to read from the board, the sample rate, the trigger threshold, and whether we stream the data as CSV, JSON, binary, or some other format are all customizable in the sketch running on the Arduino board. There is also scope to perform signal preprocessing and filtering on the device before the data is output to the log – we can cover this in another post. For now, just upload the sketch and start sampling.

The complete sketch can be found below:

/*
  IMU Capture
  This example uses the onboard IMU to read acceleration and gyroscope
  data and prints it to the Serial Monitor for one second when
  significant motion is detected.
  You can also use the Serial Plotter to graph the data.
  The circuit:
  - Arduino Nano 33 BLE Rev2 board.
  Created by Don Coleman, Sandeep Mistry
  Modified by Dominic Pajak, Sandeep Mistry, Hannes Siebeneicher
  This example code is in the public domain.
*/

#include <Arduino_BMI270_BMM150.h>

const float accelerationThreshold = 2.5; // threshold of significant motion, in G's
const int numSamples = 119;

int samplesRead = numSamples;

void setup() {
  Serial.begin(9600);
  while (!Serial);

  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (1);
  }

  // print the CSV header
  Serial.println("aX,aY,aZ,gX,gY,gZ");
}

void loop() {
  float aX, aY, aZ, gX, gY, gZ;

  // wait for significant motion
  while (samplesRead == numSamples) {
    if (IMU.accelerationAvailable()) {
      // read the acceleration data
      IMU.readAcceleration(aX, aY, aZ);

      // sum up the absolute values
      float aSum = fabs(aX) + fabs(aY) + fabs(aZ);

      // check if it's above the threshold
      if (aSum >= accelerationThreshold) {
        // reset the sample read count
        samplesRead = 0;
        break;
      }
    }
  }

  // read samples until all the required samples have been read since
  // the last time significant motion was detected
  while (samplesRead < numSamples) {
    // check if both new acceleration and gyroscope data are available
    if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
      // read the acceleration and gyroscope data
      IMU.readAcceleration(aX, aY, aZ);
      IMU.readGyroscope(gX, gY, gZ);

      samplesRead++;

      // print the data in CSV format
      Serial.print(aX, 3);
      Serial.print(',');
      Serial.print(aY, 3);
      Serial.print(',');
      Serial.print(aZ, 3);
      Serial.print(',');
      Serial.print(gX, 3);
      Serial.print(',');
      Serial.print(gY, 3);
      Serial.print(',');
      Serial.print(gZ, 3);
      Serial.println();

      if (samplesRead == numSamples) {
        // add an empty line after the last sample of a recording
        Serial.println();
      }
    }
  }
}

Visualizing Live Sensor Data Log From the Arduino Board

With that done, we can now visualize the data coming off the board. We're not capturing data yet; this step is just to give you a feel for how the sensor data capture is triggered and how long a sample window is. This will help when it comes to collecting training samples.

  • In the Arduino IDE, open the Serial Plotter: Tools > Serial Plotter
  • If you get an error that the board is not available, reselect the port: Tools > Port > portname (Arduino Nano 33 BLE)
  • Pick up the board and practice your punch and flex gestures
  • You'll see it only samples for a one-second window, then waits for the next gesture
  • You should see a live graph of the sensor data capture (see GIF below)

Arduino IDE Serial Plotter will show a live graph of CSV data output from your board.

When you're done, be sure to close the Serial Plotter window – the serial port can only be used by one tool at a time, so the next step won't work otherwise.

Capturing Gesture Training Data

To capture data as a CSV log to upload to TensorFlow, you can use Arduino IDE > Tools > Serial Monitor to view the data and export it to your desktop machine:

  • Reset the board by pressing the small white button on the top
  • Pick up the board in one hand (picking it up later will trigger sampling)
  • In the Arduino IDE, open the Serial Monitor: Tools > Serial Monitor
  • If you get an error that the board is not available, reselect the port: Tools > Port > portname (Arduino Nano 33 BLE)
  • Make a punch gesture with the board in your hand (Be careful whilst doing this!)
  • Make the outward punch quickly enough to trigger the capture
  • Return to a neutral position slowly so as not to trigger the capture again
  • Repeat the gesture capture step 10 or more times to gather more data
  • Copy and paste the data from the Serial Monitor into a new text file called punch.csv
  • Clear the Serial Monitor output and repeat all the steps above, this time with a flex gesture, saving the data in a file called flex.csv
  • Make the inward flex fast enough to trigger the capture, returning slowly each time

Note: the first line of your two CSV files should contain the fields aX,aY,aZ,gX,gY,gZ.

Data recorded by your movements.
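If you want to sanity-check a capture outside the IDE, a few lines of Python (illustrative, using pandas and matplotlib) will plot one of the logs:

import pandas as pd
import matplotlib.pyplot as plt

# load one of the captured logs and plot its two sensor groups
df = pd.read_csv("punch.csv")
df[["aX", "aY", "aZ"]].plot(title="acceleration (G)")
df[["gX", "gY", "gZ"]].plot(title="gyroscope (dps)")
plt.show()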

Linux tip: If you prefer, you can redirect the sensor log output from the Arduino board straight to a .csv file on the command line. With the Serial Plotter / Serial Monitor windows closed, use:

$ cat /dev/cu.usbmodem[nnnnn] > sensorlog.csv

Training in TensorFlow

We're going to use Google Colab to train our machine learning model using the data we collected from the Arduino board in the previous section. Colab provides a Jupyter notebook that allows us to run our TensorFlow training in a web browser.

Arduino gesture recognition training in Colab.

The Colab guides you through the following steps:

  • Set up Python environment
  • Upload the punch.csv and flex.csv data
  • Parse and prepare the data
  • Build and train the model
  • Convert the trained model to TensorFlow Lite
  • Encode the model in an Arduino header file

The final step of the Colab is to generate the model.h file to download and include in our Arduino IDE gesture classifier project in the next section:

Gesture classifier.

Let’s open the notebook in Colab and run through the steps in the cells – arduino_tinyml_workshop.ipynb
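If you are curious what those cells boil down to, here is a heavily condensed sketch of the workflow – the notebook is the authoritative version, and the layer sizes and training settings below are illustrative rather than a copy of it:

import numpy as np
import pandas as pd
import tensorflow as tf

SAMPLES_PER_GESTURE = 119  # one second at 119 Hz, as captured by the sketch

# parse and prepare the data: each CSV holds a series of 119-sample recordings
inputs, labels = [], []
for label, filename in enumerate(["punch.csv", "flex.csv"]):
    df = pd.read_csv(filename)
    # normalize to 0..1 using the same ranges as the Arduino sketches
    df[["aX", "aY", "aZ"]] = (df[["aX", "aY", "aZ"]] + 4.0) / 8.0
    df[["gX", "gY", "gZ"]] = (df[["gX", "gY", "gZ"]] + 2000.0) / 4000.0
    for i in range(len(df) // SAMPLES_PER_GESTURE):
        window = df.iloc[i * SAMPLES_PER_GESTURE:(i + 1) * SAMPLES_PER_GESTURE]
        inputs.append(window.to_numpy().flatten())  # 119 * 6 = 714 values
        labels.append(label)

X = np.array(inputs, dtype=np.float32)
y = tf.keras.utils.to_categorical(labels)

# build and train a small fully connected model
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(y.shape[1], activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=300, validation_split=0.2)

# convert the trained model to TensorFlow Lite
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)

# the notebook's final cell encodes the .tflite bytes as a C array
# (e.g. with xxd -i gesture_model.tflite > model.h) for the Arduino sketch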

Classifying IMU Data

Next, we will use the model.h file we just trained and downloaded from Colab in the previous section in our Arduino IDE project:

We will be starting a new sketch; you will find the complete code below:

/*
  IMU Classifier
  This example uses the onboard IMU to read acceleration and gyroscope
  data; once enough samples are read, it uses a TensorFlow Lite (Micro)
  model to try to classify the movement as a known gesture.
  Note: The direct use of C/C++ pointers, namespaces, and dynamic memory
  is generally discouraged in Arduino examples, and in the future the
  TensorFlowLite library might change to make the sketch simpler.
  The circuit:
  - Arduino Nano 33 BLE Rev2 board.
  Created by Don Coleman, Sandeep Mistry
  Modified by Dominic Pajak, Sandeep Mistry
  This example code is in the public domain.
*/

#include "Arduino_BMI270_BMM150.h"

#include <TensorFlowLite.h>
#include <tensorflow/lite/micro/all_ops_resolver.h>
#include <tensorflow/lite/micro/micro_interpreter.h>
#include <tensorflow/lite/schema/schema_generated.h>

#include "model.h"

const float accelerationThreshold = 2.5; // threshold of significant motion, in G's
const int numSamples = 119;

int samplesRead = numSamples;

// global variables used for TensorFlow Lite (Micro)

// pull in all the TFLM ops; you can remove this line and
// only pull in the TFLM ops you need if you would like to
// reduce the compiled size of the sketch
tflite::AllOpsResolver tflOpsResolver;

const tflite::Model* tflModel = nullptr;
tflite::MicroInterpreter* tflInterpreter = nullptr;
TfLiteTensor* tflInputTensor = nullptr;
TfLiteTensor* tflOutputTensor = nullptr;

// create a static memory buffer for TFLM; the size may need to
// be adjusted based on the model you are using
constexpr int tensorArenaSize = 8 * 1024;
byte tensorArena[tensorArenaSize] __attribute__((aligned(16)));

// array to map gesture index to a name
const char* GESTURES[] = {
  "punch",
  "flex"
};

#define NUM_GESTURES (sizeof(GESTURES) / sizeof(GESTURES[0]))

void setup() {
  Serial.begin(9600);
  while (!Serial);

  // initialize the IMU
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (1);
  }

  // print out the sample rates of the IMU
  Serial.print("Accelerometer sample rate = ");
  Serial.print(IMU.accelerationSampleRate());
  Serial.println(" Hz");
  Serial.print("Gyroscope sample rate = ");
  Serial.print(IMU.gyroscopeSampleRate());
  Serial.println(" Hz");

  Serial.println();

  // get the TFL representation of the model byte array
  tflModel = tflite::GetModel(model);
  if (tflModel->version() != TFLITE_SCHEMA_VERSION) {
    Serial.println("Model schema mismatch!");
    while (1);
  }

  // create an interpreter to run the model
  tflInterpreter = new tflite::MicroInterpreter(tflModel, tflOpsResolver, tensorArena, tensorArenaSize);

  // allocate memory for the model's input and output tensors
  tflInterpreter->AllocateTensors();

  // get pointers for the model's input and output tensors
  tflInputTensor = tflInterpreter->input(0);
  tflOutputTensor = tflInterpreter->output(0);
}

void loop() {
  float aX, aY, aZ, gX, gY, gZ;

  // wait for significant motion
  while (samplesRead == numSamples) {
    if (IMU.accelerationAvailable()) {
      // read the acceleration data
      IMU.readAcceleration(aX, aY, aZ);

      // sum up the absolute values
      float aSum = fabs(aX) + fabs(aY) + fabs(aZ);

      // check if it's above the threshold
      if (aSum >= accelerationThreshold) {
        // reset the sample read count
        samplesRead = 0;
        break;
      }
    }
  }

  // read samples until all the required samples have been read since
  // the last time significant motion was detected
  while (samplesRead < numSamples) {
    // check if both new acceleration and gyroscope data are available
    if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
      // read the acceleration and gyroscope data
      IMU.readAcceleration(aX, aY, aZ);
      IMU.readGyroscope(gX, gY, gZ);

      // normalize the IMU data between 0 and 1 and store it in the
      // model's input tensor (maps the +/-4 G accelerometer range and
      // the +/-2000 dps gyroscope range to 0..1)
      tflInputTensor->data.f[samplesRead * 6 + 0] = (aX + 4.0) / 8.0;
      tflInputTensor->data.f[samplesRead * 6 + 1] = (aY + 4.0) / 8.0;
      tflInputTensor->data.f[samplesRead * 6 + 2] = (aZ + 4.0) / 8.0;
      tflInputTensor->data.f[samplesRead * 6 + 3] = (gX + 2000.0) / 4000.0;
      tflInputTensor->data.f[samplesRead * 6 + 4] = (gY + 2000.0) / 4000.0;
      tflInputTensor->data.f[samplesRead * 6 + 5] = (gZ + 2000.0) / 4000.0;

      samplesRead++;

      if (samplesRead == numSamples) {
        // run inference
        TfLiteStatus invokeStatus = tflInterpreter->Invoke();
        if (invokeStatus != kTfLiteOk) {
          Serial.println("Invoke failed!");
          while (1);
        }

        // loop through the output tensor values from the model
        for (int i = 0; i < NUM_GESTURES; i++) {
          Serial.print(GESTURES[i]);
          Serial.print(": ");
          Serial.println(tflOutputTensor->data.f[i], 6);
        }
        Serial.println();
      }
    }
  }
}

  • Create a new tab in the IDE. When asked, name it model.h
  • Open the model.h tab and paste in the version you downloaded from Colab

Opening a new tab in your sketch.

  • Upload the sketch: Sketch > Upload
  • Open the Serial Monitor: Tools > Serial Monitor
  • Perform some gestures
  • The confidence of each gesture will be printed to the Serial Monitor (0 = low confidence, 1 = high confidence)
  • Congratulations, you've just trained your first ML application for Arduino!

Guessing the gesture with a confidence score.

For added fun, the Emoji_Button.ino example shows how to create a USB keyboard that prints an emoji character on Linux and macOS. Try combining the Emoji_Button.ino example with the IMU_Classifier.ino sketch to create a gesture-controlled emoji keyboard.
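If you would rather keep the emoji logic on the host machine instead, a small script along these lines (hypothetical, using pyserial) can watch the "gesture: confidence" lines the classifier prints and react to high-confidence detections:

# gesture_emoji.py - hypothetical host-side companion to the IMU classifier sketch
import serial

EMOJIS = {"punch": "👊", "flex": "💪"}

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:  # example port name
    while True:
        line = port.readline().decode("utf-8", errors="ignore").strip()
        if ":" not in line:
            continue  # skip blank lines and the startup banner
        gesture, _, confidence = line.partition(":")
        try:
            if gesture in EMOJIS and float(confidence) > 0.8:
                print(EMOJIS[gesture])
        except ValueError:
            pass  # line wasn't a gesture/confidence pair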

Conclusion

It's an exciting time with a lot to learn and explore in TinyML. We hope this post has given you some idea of the potential and a starting point for applying it in your own projects. Be sure to let us know what you build and share it with the Arduino community.

For a comprehensive background on TinyML and the example applications in this article, we recommend Pete Warden and Daniel Situnayake's O'Reilly book "TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers."

License

The Arduino documentation is licensed under the Creative Commons Attribution-Share Alike 4.0 license.