
Machine Learning with Swan & Edge Impulse

The Swan from Blues Wireless is a powerful MCU. Built around a 120 MHz Arm Cortex-M4 from STMicroelectronics with 2 MB of Flash and 640 KB of RAM, it's the perfect board for deploying edge machine learning models. In this guide, you'll learn how to build and run ML models using the Swan and Edge Impulse, a leading development platform for edge machine learning.

note

You can find the complete source for this guide on GitHub. Specifically, we provide an accelerometer data collector sketch, an example edge ML model, and a data classification sketch.

Before You Start

This tutorial and the associated sample code provide instructions for building an edge ML application using the Swan and Edge Impulse. The instructions detail the creation of a gesture classification model using an accelerometer, but can be adapted to fit any sensor or ML application.

If you wish to follow along with this guide, you'll need the following:

  • A Swan MCU.
  • A triple-axis accelerometer. We used the Adafruit LIS3DH for this example.
  • An Edge Impulse account with a new project created.

You'll also need to install the following:

  • Node.js v12 or later.

  • The edge-impulse-cli, which you can install with the following command from a terminal:

    npm install -g edge-impulse-cli

Using Arduino IDE?

You'll need to install the appropriate library for your accelerometer from the Arduino Library Manager. From the Tools > Manage Libraries... menu, search for Adafruit LIS3DH and click "Install".

Image of the Arduino library manager UI with the LIS3DH accelerometer library selected

Using PlatformIO?

Add adafruit/Adafruit LIS3DH@^1.2.3 to the lib_deps section of your platformio.ini file.
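For reference, a minimal platformio.ini for a Swan project might look like the following. The platform and board identifiers here are assumptions based on PlatformIO's ST STM32 support; verify them against PlatformIO's board list for your setup:

```ini
; Hypothetical minimal PlatformIO configuration for a Swan project.
; The board identifier (bw_swan_r5) is an assumption -- check the
; PlatformIO documentation for your hardware before using.
[env:swan]
platform = ststm32
board = bw_swan_r5
framework = arduino
lib_deps =
    adafruit/Adafruit LIS3DH@^1.2.3
```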

Forwarding Data to Edge Impulse

For this example, we will create a simple classification model with an accelerometer, designed to analyze movement over a brief period of time (2 seconds) and infer how the motion correlates to one of the following three states:

1. Idle (no motion)
2. Chop (motioning away from and back towards your body)
3. Wave (motioning side-to-side across your body)

The first step in building any ML model is data acquisition and labeling, and Edge Impulse makes this simple with the data forwarder and Edge Impulse Studio.

To get started collecting data from your Swan and accelerometer, you'll load a program to the device that streams X, Y, and Z values from the accelerometer to the serial port on the Swan. Edge Impulse's data forwarder can then read the output of your device and forward it to Edge Impulse Studio, where you can label the data and build up a dataset for training.

note

If you've not yet configured your Swan, visit the quick start for instructions.

In your preferred IDE, create a new sketch that will serve as the data acquisition program:

    #include <Adafruit_LIS3DH.h>

    Adafruit_LIS3DH lis = Adafruit_LIS3DH();

    void setup(void)
    {
      Serial.begin(115200);
      while (!Serial)
        delay(10); // wait for the serial port to open

      if (!lis.begin(0x18)) // 0x18 is the LIS3DH's default I2C address
      {
        Serial.println("Could not find the LIS3DH. Check your wiring.");
        while (1)
          delay(10);
      }
    }

    void loop()
    {
      lis.read(); // get X, Y, and Z data at once
      Serial.print(lis.x);
      Serial.print("\t");
      Serial.print(lis.y);
      Serial.print("\t");
      Serial.print(lis.z);
      Serial.println();
    }

After initializing the accelerometer, this example sketch outputs raw accelerometer data to the serial console.

Run this program on your Swan, using the Upload button in the Arduino IDE or the Upload function in PlatformIO. Once it's running, you can connect to the serial monitor to verify that the program is outputting a stream of accelerometer values, like so:

    -224    -528    16192
    -320    -528    16192
    112     -624    15792
    112     -624    15792
    304     -496    16160
    304     -496    16160
    80      -640    16384
    80      -640    16384
    -112    -640    16384

Once your Swan is emitting accelerometer values, the next step is to capture readings while performing the gestures you want to classify, and forward these to Edge Impulse Studio.

To start the forwarder, make sure you've closed your serial monitor and run the following from a terminal:

    edge-impulse-data-forwarder

Follow the prompts to log into your Edge Impulse account, select a project, and assign names to the X, Y, and Z values from your accelerometer.

SDK data forwarder console UI

Once instructed to do so, use the URL provided by the forwarder to open Edge Impulse Studio and start capturing data.

In the "Record new data" form, make sure your device is selected, set a label for the gesture you plan to perform, and set the sample length to 10000 ms (10 seconds). Then, click "Start sampling".

Edge Impulse Studio data capture UI

At this point, Edge Impulse Studio will send a message to the forwarder running on your computer and instruct it to capture a ten-second sample from the accelerometer. Once the capture starts, perform a gesture you wish to classify repeatedly for the entire ten seconds.

SDK terminal view

After the sample is captured, it's uploaded to Edge Impulse Studio, where you can see a visual view of the accelerometer data captured by the forwarder.

Edge Impulse Studio single data point capture

Now comes the fun part! To get a solid cross-section of data for your model, you'll want to repeat this process multiple times for each gesture you want to classify in your application.

The more data you capture, the better your model will be, so take the time to record at least a few minutes' worth of data for each gesture.

Edge Impulse Studio final data captured

Finally, allocate 10-20% of your captured data to your test set. This data will be set aside and not used to train your model, which allows you to more accurately evaluate the model before deploying it.

Edge Impulse Studio move data to test set

Build and Train an ML Model

Once you've captured enough data from the Swan, you're ready to move on to designing and building your model. In the left-side menu of Edge Impulse Studio, click "Create impulse" under the Impulse Design menu. Based on the data you collected, Edge Impulse Studio will recognize that your source data set is a time series and will recommend values for the window size and window increase, which you are free to adjust as you test and iterate on the model.

Edge Impulse Studio time series data source

Next, click the "Add a processing block" button and add a "Spectral Analysis" block.

Edge Impulse Studio selecting processing block

Then, click the "Add a learning block" button and select the "Classification (Keras)" block.

Edge Impulse Studio selecting classification block

Finally, click "Save Impulse".

Edge Impulse Studio impulse design screen

On the next screen, "Spectral features", you can adjust and tune parameters if you wish, or keep the defaults.

Click "Save parameters" when done and you'll be taken to the "Generate features" screen.

Click the "Generate features" button and, once the job completes, you'll see estimates of on-device inference performance and a feature explorer that shows how your gestures cluster on their X, Y, and Z axes.

You can pan, zoom, and scroll around in that view to drill into your sensor data.

Edge Impulse Studio feature generation

Once you're done exploring, click the "NN Classifier" item in the left nav, set the number of training cycles to 300 and the learning rate to 0.0005, then click "Start training" and grab a cup of coffee while Edge Impulse trains your model.

Edge Impulse Studio training in progress

After training completes, Edge Impulse Studio will display the performance of the model, including a confusion matrix, feature explorer, and on-device performance details.

Edge Impulse Studio training complete screen

Test the Model

There are two ways to test your model in Edge Impulse Studio before you deploy it to the Swan: "Live Classification" and "Model Testing".

Live Classification

You can use the Edge Impulse data forwarder to perform live classification against the model you've just trained. First, make sure the Swan is connected to your computer and start the Edge Impulse data forwarder:

    edge-impulse-data-forwarder

Next, in Edge Impulse Studio, click the "Live classification" menu item in the left nav. Make sure your device is selected and click "Start sampling".

Live classification UI

After the initialization delay, perform the gesture you want to classify. Once the data is captured from the device, Edge Impulse runs it against the model and displays the classification results. Using the UI, you can set the expected outcome and even move results into your training set for model refinement.

Results of a live test

Model Testing

Recall that earlier we set aside a small percentage of data samples as the test set. Click the "Model testing" menu item to see all of your test samples, then click "Classify all" to run each sample through the classifier.

Results of model testing

Deploy the Model to Swan

Once you're happy with the performance of your model, it's time to deploy it to your device!

Click on the "Deployment" menu item and then click the "Arduino library" button.

Edge Impulse model download screen with the Arduino option selected

Under the "Select optimizations" section, you can select either a quantized or an unoptimized model, depending on your needs. The Swan has plenty of RAM and Flash, so you can choose the unoptimized model if it provides better accuracy without sacrificing too much inference speed.

Edge Impulse model optimization screen

Once you've selected a model, click Build. Edge Impulse Studio will build a model package and deliver a zip archive for you to download.

Edge Impulse Studio Model download screen

Alter the Edge Impulse SDK

Before including the downloaded library in your project, we need to make a minor tweak to the Edge Impulse SDK so it works with the STM32-based Swan.

Unzip the file you downloaded from Edge Impulse Studio and open the config.hpp file under src/edge-impulse-sdk/dsp in a text editor. Then add the following two lines around line 25:

    #define EIDSP_USE_CMSIS_DSP             1
    #define EIDSP_LOAD_CMSIS_DSP_SOURCES    1

Using Arduino IDE?

You'll need to include this folder as a library in the Arduino IDE so that your program can include the model and SDK. Open a new Arduino sketch, click the Sketch menu, then Include Library > Add .ZIP Library...

Arduino add Zip library UI

Navigate to the unzipped folder you just modified, select it, and click the "Choose" button.

Now, head back to Sketch > Include Library and, under Contributed libraries, choose the name of the folder you selected. This will add the appropriate include statement to your sketch.

Using PlatformIO?

You'll need to include this folder in your PlatformIO project's lib directory. Do so by dragging and dropping the folder onto your lib directory.

PlatformIO add model

Write Your Sketch

With the library installed in your project, you're ready to perform edge inferencing with the Swan!

You can clone this example sketch and customize it for your sensor. Note that this sketch uses NeoPixels for a light show when an inference is generated (simply comment out the NeoPixel code if you don't need it).

The important piece of code reads from the sensor and runs the Edge Impulse classifier on the captured data:

    float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE] = {0};

    for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += 3)
    {
        // Determine the next tick (and then sleep later)
        uint64_t next_tick = micros() + (EI_CLASSIFIER_INTERVAL_MS * 1000);

        lis.read();
        buffer[ix] = lis.x;
        buffer[ix + 1] = lis.y;
        buffer[ix + 2] = lis.z;

        // Sleep until the next sample is due (skip if we're already late)
        uint64_t now = micros();
        if (next_tick > now)
            delayMicroseconds(next_tick - now);
    }

    // Turn the raw buffer into a signal that we can then classify
    signal_t signal;
    int err = numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
    if (err != 0)
    {
        ei_printf("Failed to create signal from buffer (%d)\n", err);
        return;
    }

    // Run the classifier
    ei_impulse_result_t result = {0};

    err = run_classifier(&signal, &result, debug_nn);
    if (err != EI_IMPULSE_OK)
    {
        ei_printf("ERR: Failed to run classifier (%d)\n", err);
        return;
    }

    ei_printf("(DSP: %d ms., Classification: %d ms., Anomaly: %d ms.)\n",
                result.timing.dsp, result.timing.classification, result.timing.anomaly);

    // Print each label's score, tracking the highest-scoring label
    uint8_t predictionLabel = 0;
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++)
    {
        Serial.print("    ");
        Serial.print(result.classification[ix].label);
        Serial.print(": ");
        Serial.println(result.classification[ix].value);

        if (result.classification[ix].value > result.classification[predictionLabel].value)
            predictionLabel = ix;
    }

    // Print the prediction
    String label = result.classification[predictionLabel].label;

    Serial.print("\nPrediction: ");
    Serial.println(label);

Deploy to Swan

The final step is to upload the program to your Swan, open the serial monitor, perform some gestures, and watch the inferences arrive. The sample above performs continuous classification every few seconds, so you will see a steady stream of output in the monitor:

    (DSP: 37 ms., Classification: 0 ms., Anomaly: 0 ms.)
        chop: 0.00
        idle: 0.98
        wave: 0.02

    Prediction: idle

    Starting inferencing in 2 seconds...
    Sampling...
    (DSP: 36 ms., Classification: 0 ms., Anomaly: 0 ms.)
        chop: 0.00
        idle: 0.00
        wave: 1.00

    Prediction: wave

    Starting inferencing in 2 seconds...
    Sampling...
    (DSP: 36 ms., Classification: 0 ms., Anomaly: 0 ms.)
        chop: 1.00
        idle: 0.00
        wave: 0.00

    Prediction: chop

Congratulations! You've built an edge ML model with Edge Impulse and the Swan!

Additional Resources

• Edge Impulse Docs
• Swan Quickstart Guide