The Swan from Blues Wireless is a powerful MCU. Built around a 120 MHz Arm Cortex-M4 from STMicroelectronics with 2 MB of Flash and 640 KB of RAM, it's the perfect board for deploying edge machine learning models. In this guide, you'll learn how to build and run ML models using the Swan and Edge Impulse, a leading development platform for edge machine learning.
You can find the complete source for this guide on GitHub. Specifically, we provide an accelerometer data collector sketch, an example edge ML model, and a data classification sketch.
This tutorial and the associated sample code provide instructions for building an edge ML application using the Swan and Edge Impulse. The instructions detail the creation of a gesture classification model using an accelerometer, but can be adapted to fit any sensor or ML application.
If you wish to follow along with this guide, you'll need the following:
- A Swan MCU.
- A triple axis accelerometer. We used the Adafruit LIS3DH for this example.
- An Edge Impulse account and a new project set up in Edge Impulse Studio.
You'll also need to install the following:
- Node.js v12 or later.
- The `edge-impulse-cli`, which you can install with the following command from a terminal:

```bash
npm install -g edge-impulse-cli
```
For this example, we will create a simple classification model with an accelerometer, designed to analyze movement over a brief period (two seconds) and infer which of the following three states the motion corresponds to:
- Idle (no motion)
- Chop (motioning away and back towards your body)
- Wave (motioning side-to-side across your body)
The first step in building any ML model is data acquisition and labeling, and Edge Impulse makes this simple with the data forwarder and Edge Impulse Studio.
To get started collecting data from your Swan and accelerometer, you'll load a program onto the device that streams X, Y, and Z values from the accelerometer to the serial port on the Swan. Edge Impulse's data forwarder can then read the output of your device and forward it to Edge Impulse Studio, where you can label the data and build up a dataset for training.
If you've not yet configured your Swan, visit the quick start for instructions.
In your preferred IDE, create a new sketch that will serve as the data acquisition program:
```cpp
#include <Adafruit_LIS3DH.h>

Adafruit_LIS3DH lis = Adafruit_LIS3DH();

void setup(void) {
  Serial.begin(115200);

  if (!lis.begin(0x18)) { // 0x18 is the default I2C address for the LIS3DH
    Serial.println("Could not find the LIS3DH. Check your wiring.");
    while (1);
  }
}

void loop() {
  lis.read(); // get x, y, and z data at once

  // Emit tab-separated readings, one sample per line, for the data forwarder
  Serial.print(lis.x);
  Serial.print("\t");
  Serial.print(lis.y);
  Serial.print("\t");
  Serial.print(lis.z);
  Serial.println();
}
```
After initializing the accelerometer, this example sketch will output raw accelerometer data to the serial console.
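The data forwarder infers your sampling frequency from how quickly lines arrive over serial, so a steady pace helps. As a minimal sketch of one way to do this (assuming ~100 Hz is adequate for gesture data), you could pace the loop like so:

```cpp
void loop() {
  // Pace samples at roughly 100 Hz so the data forwarder detects a
  // consistent frequency (an assumed rate; adjust for your application)
  static unsigned long lastSample = 0;
  if (micros() - lastSample < 10000) return; // 10 ms period
  lastSample = micros();

  lis.read();
  Serial.print(lis.x);
  Serial.print("\t");
  Serial.print(lis.y);
  Serial.print("\t");
  Serial.print(lis.z);
  Serial.println();
}
```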
Run this program on your Swan, using the Upload button in the Arduino IDE or the Upload function in PlatformIO. Once running, you can connect to the serial monitor to verify that the program is running and outputting a stream of accelerometer values, like so:
```text
-224	-528	16192
-320	-528	16192
112	-624	15792
112	-624	15792
304	-496	16160
304	-496	16160
80	-640	16384
80	-640	16384
-112	-640	16384
```
Once your Swan is emitting accelerometer values, the next step is to capture readings while performing the gestures you want to classify, and forward these to Edge Impulse Studio.
To start the forwarder, make sure you've closed your serial monitor and run the following from a terminal:
```bash
edge-impulse-data-forwarder
```
Follow the prompts to log into your Edge Impulse account, select a project, and assign names to the X, Y, and Z values from your accelerometer.
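If you've previously signed in and want to switch accounts or projects, the Edge Impulse CLI tools accept a `--clean` flag that clears stored credentials:

```bash
edge-impulse-data-forwarder --clean
```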
Once instructed to do so, use the URL provided by the forwarder to open Edge Impulse Studio and start capturing data.
In the "Record new data" form, make sure your device is selected, set a label for the gesture you plan to perform, and set the sample length to 10000 ms (10 seconds). Then, click "Start sampling".
At this point, Edge Impulse Studio will send a message to the forwarder running on your computer and instruct it to capture a ten-second sample from the accelerometer. Once the capture starts, perform a gesture you wish to classify repeatedly for the entire ten seconds.
After the sample is captured, it's uploaded to Edge Impulse Studio, where you can see a visual view of the accelerometer data captured by the forwarder.
Now comes the fun part! To get a solid cross-section of data for your model, you'll want to repeat this process multiple times for each gesture you want to classify in your application.
The more data you capture, the better your model will be, so take the time to record at least a few minutes' worth of data for each gesture.
Finally, allocate 10-20% of your captured data to your test set. This data will be set aside and not used to train your model, allowing you to more accurately evaluate your model before deploying it.
Once you've captured enough data from the Swan, you're ready to move on to designing and building your model. In the left-side menu of Edge Impulse Studio, click "Create impulse" under the Impulse Design menu. Based on the data you collected, Edge Impulse Studio will recognize that your source data set is a time series and will recommend a window size and window increase, which you are free to adjust as you test and iterate on the model.
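As a rough illustration (assuming the common defaults of a 2,000 ms window and an 80 ms window increase), each 10-second sample is sliced into ⌊(10000 − 2000) / 80⌋ + 1 = 101 overlapping windows, so even a few minutes of recordings yields a sizable set of training examples.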
Next, click on the "Add a processing block" button and add a "Spectral Analysis" block.
Then, click the "Add a learning block" button and select the "Classification (Keras)" block.
Finally, click "Save Impulse".
On the next screen, "Spectral features", you can adjust and tune parameters if you wish, or keep the defaults.
Click "Save parameters" when done and you'll be taken to the "Generate features" screen.
Click the "Generate features" button and, once the job completes, you'll get some estimates of on-device inference performance and a feature explorer that shows how your gestures are clustering on their x, y, and z axes.
You can pan, zoom and scroll around in that view to drill into your sensor data.
Once you're done playing with the feature explorer, click on the "NN Classifier" item in the left nav, and adjust the number of training cycles to 300 and the learning rate to 0.0005. Then, click "Start training" and grab a cup of coffee while Edge Impulse trains your model.
After training is complete, Edge Impulse Studio will display the performance of the model, a confusion matrix, feature explorer, and on-device performance details.
There are two different ways to test your model in Edge Impulse Studio before you deploy it to the Swan: "Live Classification" and "Model Testing".
You can use the Edge Impulse data forwarder to perform live classification against the model you've just trained. First, make sure the Swan is connected to your computer and start the Edge Impulse data forwarder:
edge-impulse-data-forwarder
Next, in Edge Impulse Studio, click the "Live classification" menu item in the left nav. Make sure your device is selected and click "Start sampling."
After the initialization delay, perform the gesture you want to classify. Once captured from the device, Edge Impulse runs the raw data against the model and displays the classification results. Using the UI, you can set the expected outcome and even move results into your training set for model refinement.
Remember the small percentage of samples we set aside earlier as a test set? Click on the "Model testing" menu item to see all of your test samples, then click "Classify all" to run each sample through the classifier.
Once you're happy with the performance of your model, it's time to deploy it to your device!
Click on the "Deployment" menu item and then click the "Arduino library" button.
Under the "Select optimizations" section, you can select either a quantized model or unoptimized, depending on your needs. The Swan has plenty of RAM and Flash, so you can choose the unoptimized model if it provides better performance without sacrificing too much speed.
Once you've selected a model, click "Build". Edge Impulse Studio will build a model package and deliver a zip archive for you to download.
Before including the downloaded library in your project, we need to make a minor tweak to the Edge Impulse SDK so that it works with our STM32-based Swan.
Unzip the file you downloaded from Edge Impulse Studio and open the `config.hpp` file under `src/edge-impulse-sdk/dsp` in a text editor. Then add the following two lines around line 25:
```cpp
#define EIDSP_USE_CMSIS_DSP 1
#define EIDSP_LOAD_CMSIS_DSP_SOURCES 1
```
Re-zip the modified library and add it to your project (for example, via Sketch > Include Library > Add .ZIP Library... in the Arduino IDE, or by extracting it into your PlatformIO project's lib folder). With the library installed in your project, you're ready to perform edge inferencing with the Swan!
You can clone this example sketch and customize it for your sensor. Note that this sketch uses NeoPixels for a light show when an inference is generated (simply comment out the NeoPixels code if you don't need it).
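For reference, here's a minimal skeleton that the snippet below assumes; note that the inferencing header's name depends on your Edge Impulse project name (`swan_gestures_inferencing.h` here is a hypothetical placeholder), and `debug_nn` toggles verbose classifier output:

```cpp
// Hypothetical header name; Edge Impulse generates <your-project-name>_inferencing.h
#include <swan_gestures_inferencing.h>
#include <Adafruit_LIS3DH.h>

Adafruit_LIS3DH lis = Adafruit_LIS3DH();
static bool debug_nn = false; // set true to print raw features during classification

void setup() {
  Serial.begin(115200);

  if (!lis.begin(0x18)) { // 0x18 is the default I2C address for the LIS3DH
    Serial.println("Could not find the LIS3DH. Check your wiring.");
    while (1);
  }
}
```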
The important piece of code reads from the sensor and runs the Edge Impulse classifier with the captured data:
```cpp
float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE] = {0};

// Fill the buffer with one reading (x, y, z) per classifier interval
for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += 3) {
  // Determine the next tick (and then sleep later)
  uint64_t next_tick = micros() + (EI_CLASSIFIER_INTERVAL_MS * 1000);

  lis.read();
  buffer[ix] = lis.x;
  buffer[ix + 1] = lis.y;
  buffer[ix + 2] = lis.z;

  delayMicroseconds(next_tick - micros());
}

// Turn the raw buffer into a signal which we can then classify
signal_t signal;
int err = numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
if (err != 0) {
  ei_printf("Failed to create signal from buffer (%d)\n", err);
  return;
}

// Run the classifier
ei_impulse_result_t result = {0};
err = run_classifier(&signal, &result, debug_nn);
if (err != EI_IMPULSE_OK) {
  ei_printf("ERR: Failed to run classifier (%d)\n", err);
  return;
}

ei_printf("(DSP: %d ms., Classification: %d ms., Anomaly: %d ms.)\n",
          result.timing.dsp, result.timing.classification, result.timing.anomaly);

// Print each label's score and track the highest-scoring label
uint8_t predictionLabel = 0;
for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
  Serial.print(" ");
  Serial.print(result.classification[ix].label);
  Serial.print(": ");
  Serial.println(result.classification[ix].value);
  if (result.classification[ix].value > result.classification[predictionLabel].value)
    predictionLabel = ix;
}

// Print the prediction
String label = result.classification[predictionLabel].label;
Serial.print("\nPrediction: ");
Serial.println(label);
```
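In a real application, you'll likely want to act only on confident predictions. As a small sketch (the 0.8 threshold is an arbitrary assumption you should tune against your own model's results), you could gate on the winning score:

```cpp
// Only act when the top score clears a confidence threshold
// (0.8 is an arbitrary starting point; tune it for your model)
const float kConfidenceThreshold = 0.8f;
if (result.classification[predictionLabel].value >= kConfidenceThreshold) {
  Serial.print("Confident prediction: ");
  Serial.println(label);
}
```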
The final step is to upload the program to your Swan, open the serial monitor, perform some gestures, and see some inferences. The sample above performs continuous classification every few seconds, so you will see a steady stream of output in the monitor.
```text
(DSP: 37 ms., Classification: 0 ms., Anomaly: 0 ms.)
 chop: 0.00
 idle: 0.98
 wave: 0.02
Prediction: idle
Starting inferencing in 2 seconds...
Sampling...
(DSP: 36 ms., Classification: 0 ms., Anomaly: 0 ms.)
 chop: 0.00
 idle: 0.00
 wave: 1.00
Prediction: wave
Starting inferencing in 2 seconds...
Sampling...
(DSP: 36 ms., Classification: 0 ms., Anomaly: 0 ms.)
 chop: 1.00
 idle: 0.00
 wave: 0.00
Prediction: chop
```
Congratulations! You've built an edge ML model with Edge Impulse and the Swan!