
Routing Data to Cloud: AWS

Watch a video of this tutorial

In previous tutorials you've learned about the Blues Notecard, and used it to collect data and send it to Notehub, the Blues cloud service.

One powerful feature of Notehub is routes, which allow you to forward your data from Notehub to a public cloud like AWS, Azure, or Google Cloud, to a data cloud like Snowflake, to dashboarding services like Datacake or Ubidots, or to a custom HTTP or MQTT-based endpoint. The tutorial below guides you through sending data to several popular services, and teaches you how to build visualizations using that data.

Get started with any of the following destinations: AWS, Azure IoT Central, Blynk, Datacake, General HTTP/HTTPS, Google Cloud Platform, Initial State, MQTT, Qubitro, Snowflake, ThingSpeak, ThingWorx, and Ubidots. This page covers AWS.

Don't see a cloud or backend that you need? Notehub is able to route data to virtually any provider. If you're having trouble setting up a route, reach out in our forum and we will help you out.

Introduction

This tutorial should take approximately 30-40 minutes to complete.

In this tutorial, you'll learn how to connect your Notecard-powered app to AWS and start creating simple visualizations with sensor data.

This tutorial assumes you've already completed the initial Sensor Tutorial to capture sensor data, saved it in a Notefile called sensors.qo, and sent that data through the Notecard to Notehub (or that you've already created your own app with sensor data and are ready to connect your app to external services).
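For reference, the JSONata expressions later in this tutorial assume each Note in sensors.qo has a body containing temp and humidity fields, similar to this illustrative example (your values will differ):

{"temp": 21.0, "humidity": 34.2}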

Create a Route

A Route is an external API, or server location, where Notes can be forwarded upon receipt.

Routes are defined in Notehub for a Project and can target Notes from one or more Fleets or all Devices. A Project can have multiple routes defined and active at any one time.

Before you create a Route, ensure the data you want to route is available in Notehub by navigating to the Events view.

We'll start with a simple route that will pass Notecard events through to webhook.site, where you can view the full payload sent by Notehub. Using this service is a useful way to debug routes, add a simple health-check endpoint to your app, and familiarize yourself with Notehub's routing capabilities.

  1. Navigate to webhook.site. When the page loads, you'll be presented with a unique URL that you can use as a Route destination. Copy that URL for the next step.

    webhook.site service

  2. Navigate to the Notehub.io project for which you want to create a route and click on the Routes menu item in the left nav.

  3. Click Create Route.

    create a route button in notehub

  4. Select the General HTTP/HTTPS Request/Response route type.

    general https request route

  5. Give the route a name (for example, "Health").

    name notehub route

  6. For the Route URL, use the unique URL you obtained from webhook.site.

    url for notehub route

  7. In the Notefiles dropdown, choose Select Notefiles and enter the name of the Notefile to monitor. For example, we used sensors.qo for the sensor tutorial.

    name of the notefile

  8. Make sure the Enabled switch remains selected, and click Create Route.

  9. Return to webhook.site. This page will update automatically with data from your Notecard as it is received in Notehub. The data from your sensor is contained within the body attribute. Notice that Notehub provides you with a lot of information, by default. In the next section, we'll look at using transformations to customize what Notehub sends in a Route.

    notehub data in webhook.site
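For reference, an abbreviated and purely illustrative version of the routed event JSON is shown below; the real payload contains additional metadata about the device, session, and cell tower, and your field values will differ:

{
  "device": "dev:xxxxxxxxxxxxxxx",
  "file": "sensors.qo",
  "when": 1754491388,
  "body": { "temp": 21.0, "humidity": 34.2 },
  "tower_location": "Detroit MI",
  "tower_country": "US"
}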

Use JSONata to Transform JSON

Before moving on to routing data to another external service, let's briefly explore using JSONata to transform the data Notehub routes.

As mentioned above, Notehub provides a lot of information in each Route request. You may want to trim down what you send to the external service, or you might need to transform the payload to adhere to a format expected by that service. Either way, Notehub supports shaping the data sent to a Route using JSONata.

More About JSONata

To learn more about JSONata, have a look at the Blues JSONata Guide.

Transform Your Data

Let's try a simple query to the webhook.site route created in the last section.

  1. Navigate to the Routes page in Notehub and click View next to the Route you wish to edit.

    view a route

  2. In the Transform JSON drop-down, select JSONata Expression.

    jsonata expression

  3. In the JSONata expression text area, add the following query to select the temp and humidity from the body, create a location field that concatenates the tower_location and tower_country fields, and create a time field.

    {
       "temp": body.temp,
       "humidity": body.humidity,
       "location": tower_location & ', ' & tower_country,
       "time": when
    }
  4. Click Save Route. Then, navigate back to your webhook.site url. As requests come in, you'll see your custom, JSONata-transformed payload in the Raw Content section.

    transformed data routed to webhook.site
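With that expression in place, the payload routed to webhook.site is reduced to something like this (values illustrative):

{"temp": 21.0, "humidity": 34.2, "location": "Detroit MI, US", "time": 1754491388}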

JSONata is simple, powerful, and flexible, and will come in handy as you create Routes for your external services. To explore JSONata further, visit our JSON Fundamentals guide.

Route to an External Service

Now that you've created your first Route and learned how to use JSONata to shape the data sent by a Route, you'll connect Notehub to an external service.

In this section of the tutorial, you'll connect your app to AWS (Amazon Web Services). AWS is a comprehensive cloud computing platform that provides on-demand infrastructure, storage, machine learning, and developer tools to build, deploy, and scale applications globally.

Given the wide range of tools and services in the AWS ecosystem, there are many ways to route and visualize data. This tutorial takes a relatively straightforward approach:

  1. Route events from Notehub to an AWS Lambda function.
  2. Use the Lambda function to store events in an S3 bucket.
  3. Use QuickSight to visualize the events stored in S3.

Create an AWS Account

If you don't have one already, create an AWS account. Then, log in to your AWS Management Console.

Create an S3 Bucket

The first step is to create a repository (an S3 bucket) where you'll store JSON files (events) routed from Notehub.

  1. Navigate to the Amazon S3 section of the AWS Management Console and click the Create bucket button.

    create s3 bucket

  2. Leave the default selection of "general purpose" bucket type and provide a globally-unique name for the bucket.

    s3 bucket name

  3. When done configuring your S3 bucket, click the Create bucket button.
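If you'd rather script bucket creation than use the console, here is a minimal sketch using boto3; the bucket name and region are placeholders, and the result is the same as clicking through the console:

import boto3

# Bucket names must be globally unique; replace with your own.
s3 = boto3.client("s3", region_name="us-east-1")
s3.create_bucket(Bucket="your-s3-bucket-id")

# Regions other than us-east-1 require an explicit LocationConstraint, e.g.:
# s3.create_bucket(
#     Bucket="your-s3-bucket-id",
#     CreateBucketConfiguration={"LocationConstraint": "eu-west-1"}
# )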

Create a Lambda Function

With your S3 bucket created and configured, it's time to write an AWS Lambda function that will receive events POSTed from Notehub and store them in S3.

  1. Navigate to the Lambda section of the AWS Management Console and click on the Create function button.

  2. Provide a function name and choose the appropriate runtime. This tutorial provides code for a Lambda function written in Python, but there are other runtime options available.

    lambda function setup

  3. Under Additional Configurations, check the box to enable the creation of a unique Function URL and (optionally) choose NONE as the Auth type.

    note

    A production-ready version of this Lambda would have you use AWS_IAM as the Auth type. That does require additional configuration in setting up IAM user(s) and role(s) and is outside the scope of this tutorial.

    lambda function additional configurations

  4. After the Lambda function is initialized, navigate to the Code tab and paste in the following Python code, replacing your-s3-bucket-id with your S3 bucket name.

    Please note that this Lambda also converts JSON to NDJSON (Newline Delimited JSON), which is useful if you later choose to use AWS Athena as a query service. An example of the conversion is shown after this list.

    import os
    import json
    import time
    import base64
    import boto3
    from typing import Iterable
    
    s3 = boto3.client("s3")
    BUCKET = "your-s3-bucket-id"
    
    def to_ndjson(payload) -> str:
        def _dump(obj) -> str:
            return json.dumps(obj, ensure_ascii=False, separators=(',', ':'))
    
        if isinstance(payload, (list, tuple)):
            lines: Iterable[str] = (_dump(item) for item in payload)
            return "\n".join(lines) + "\n" if payload else ""
        else:
            return _dump(payload) + "\n"
    
    def lambda_handler(event, context):
        if "body" not in event:
            return {
                "statusCode": 400,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"error": "Missing body"})
            }
    
        raw_body = event["body"]
        if event.get("isBase64Encoded"):
            raw_body = base64.b64decode(raw_body).decode("utf-8")
    
        try:
            payload = json.loads(raw_body)
        except json.JSONDecodeError:
            ndjson = raw_body if raw_body.endswith("\n") else raw_body + "\n"
        else:
            ndjson = to_ndjson(payload)
    
        key = f"{int(time.time())}.ndjson"
    
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=ndjson.encode("utf-8"),
            ContentType="application/x-ndjson"
        )
    
        public_url = f"https://{BUCKET}.s3.amazonaws.com/{key}"
    
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({
                "message": "Saved",
                "bucket": BUCKET,
                "key": key,
                "url": public_url
            })
        }
  5. Next, click to Deploy the Lambda and copy the provided Function URL.

    lambda function deploy
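To illustrate the NDJSON conversion mentioned in step 4: if a request body contains an array of events, the function writes one compact JSON object per line, so a payload of [{"temp": 21.0, "humidity": 34.2}, {"temp": 21.4, "humidity": 33.9}] is stored in S3 as:

{"temp":21.0,"humidity":34.2}
{"temp":21.4,"humidity":33.9}

A single object is simply written out followed by a newline.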

Configure IAM Role for Lambda

With the Lambda function deployed, you need to make one small edit to the default role created for the Lambda so that it can PUT objects into the S3 bucket.

  1. Navigate to the IAM section of the AWS Management Console and edit the Role that was created when deploying the Lambda function. The name of the role likely begins with the name of the Lambda function itself (e.g. bluesLambdaFunction-role-xxxxx).

  2. In the Permissions policies section, click on the name of the policy you would like to edit (there is likely only one listed).

  3. Click the Edit button. In the provided Policy editor, add the following JSON to the Statement array (being sure to replace your-s3-bucket-id with your S3 bucket name). A full example of the edited policy document appears after this list.

    {
      "Sid": "PutNDJSONToBucket",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::your-s3-bucket-id/*"
    }
  4. Click the Next and Save changes buttons to save your changes.
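For reference, the edited policy document will end up looking something like the following. The logging statement shown here is only illustrative; keep whatever statements AWS generated for your role and append the new statement alongside them, replacing your-s3-bucket-id with your S3 bucket name.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Sid": "PutNDJSONToBucket",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::your-s3-bucket-id/*"
    }
  ]
}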

Create a Route in Notehub

  1. Back in Notehub, navigate to the Routes menu option and click the Create Route button.

    create a notehub route

  2. Scroll until you see the AWS route type and click the Select button.

    create an aws route

  3. In the Configuration section, choose Lambda as the Type, and paste in the Lambda Function URL in the URL field.

    aws route configuration

  4. Under the Filters section, choose to only sync the sensors.qo Notefile. Under Data, choose JSONata Transform and paste in the following JSONata expression:

    {
       "temp": body.temp,
       "humidity": body.humidity,
       "time": when
    }

    complete notehub route configuration

  5. When you're done configuring your Notehub Route, click the Apply Changes button to save the Route.

Test the Route and View Data in S3

Before proceeding to create visualizations based on this data, it's a good idea to test the Route and view the raw JSON saved in the S3 bucket.

Assuming sensor data is still being sent from your Notecard to Notehub in the sensors.qo Notefile, you should see individual JSON (NDJSON) files created in the S3 bucket. Each file is named with the UNIX Epoch timestamp when it was created in S3.

view json file in s3

The data should look something like this:

{"humidity":27.524,"temp":21.003,"time":1754491388}
note

If you do not see any files created in S3, there are two places to look to debug the Route:

  1. Check the Logs tab of your Notehub Route to see if AWS returned any error messages.

  2. Navigate to CloudWatch in the AWS Management Console to find any error logs related to your Lambda function.
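You can also exercise the Lambda function directly, without waiting for a new sensor reading, by POSTing a sample payload to its Function URL. Below is a minimal sketch using Python's requests library; the URL is a placeholder for your own Function URL, and it assumes the Auth type is NONE as configured earlier:

import requests

# Placeholder: replace with the Function URL copied from the Lambda console.
FUNCTION_URL = "https://your-function-url.lambda-url.us-east-1.on.aws/"

# Mimic the JSONata-transformed payload that Notehub sends to the Route.
sample = {"temp": 21.0, "humidity": 27.5, "time": 1754491388}

response = requests.post(FUNCTION_URL, json=sample, timeout=10)
print(response.status_code, response.json())

A successful response includes the S3 key of the newly created object, and a matching .ndjson file should appear in your bucket.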

As new sensor readings are delivered to Notehub, they will be actively routed to your AWS Lambda function and saved in S3. Your next step is to create a visualization of your data using AWS QuickSight.

Build Data Visualizations

Now that you have connected your Notehub Route to AWS, you can leverage the power of the AWS ecosystem and set up an AWS QuickSight dashboard to visualize your data.

Create an AWS QuickSight Account

  1. Navigate to AWS QuickSight and register for a QuickSight account.

    create quicksight account

    warning

    Unlike most other AWS services, QuickSight is not billed based on usage. You will be charged a fee after 30 days whether or not you continue to use QuickSight!

  2. When setting up your account, use the default values provided. However, be sure to select the S3 bucket created earlier so QuickSight has access to it.

    quicksight s3 bucket

  3. Next, you'll need to create a manifest file, which tells QuickSight where to find the data it needs to populate the dashboard. Save the following locally as quicksight-manifest.json, replacing your-s3-bucket-id with your S3 bucket name.

    {
      "fileLocations":[
        { "URIPrefixes":[ "s3://your-s3-bucket-id/" ] }
      ],
      "globalUploadSettings":{ "format":"JSON" }
    }
  4. Navigate to the Datasets menu and click the NEW DATASET button.

    quicksight new dataset

  5. Choose the Amazon S3 tile, provide a name for the Data source and upload the manifest file created in the previous step. Click Connect to proceed.

    quicksight upload manifest

  6. Next, click the Visualize button and you'll be provided with a New Sheet on which to create your visualizations. Click the CREATE button to begin.

    quicksight create new sheet

  7. Before assembling the chart, you need to convert the time field, which QuickSight interprets as a plain number, into a date based on its UNIX Epoch value. Under the Data menu, choose Datasets and edit your dataset.

    Then, add a new calculated field called timestamp.

    quicksight create calculated field

    The formula for the calculated field is this:

    epochDate({time})
  8. Click on the Publish & Visualize button when you are done.

  9. Finally, in the QuickSight sheet provided, add your calculated field timestamp to the x-axis of the Line Chart. Set the aggregation option to "second" (or whatever works best for your data). Add the temp and humidity values to the value section and set both aggregation options to "average".

    From there, you should see a basic visualization of your data in QuickSight!

    quicksight line chart configuration

    note

    By default, your QuickSight dashboard will need its data refreshed from S3 manually from the Data > Dataset menu. However, you can schedule automatic refresh of data from S3 using these instructions.

Next Steps

Congratulations! You've created your first Route and connected your Notecard app to an external service.

If you're following the Blues Quickstart, you're done! But we do have some suggestions for next steps:

  1. Browse the Blues Example Apps to find open-source example applications, code snippets, and best practices.
  2. Bookmark the Notecard API for quick reference.
  3. Follow the Notehub Walkthrough to get more out of using Notehub.

At any time, if you find yourself stuck, please reach out on the community forum.
