This application continuously grabs images from an OpenMV camera board running MicroPython and runs them through an object detection model built using Edge Impulse to test for the presence of a moving car. If a moving car is detected, the speed of that car is measured using an OmniPreSense Doppler radar sensor. If that speed exceeds a configurable speed limit, the speed and current time are recorded and synced to Notehub via a Notecard.
A 5V, 1A power supply is sufficient to power all the hardware used in this project. Plug the power supply into an outlet and plug the other end into the adapter. Insert two male-to-male jumper wires into the screw terminal block on the other end of the adapter. Connect the free ends of the jumpers to the + and - rails of the breadboard, respectively:
The Notecarrier, radar, and camera all need to be secured using some sort of mounting hardware. It's especially important that the camera and radar are mounted on the same plane. That way, they're both pointed in the same direction and the radar's reported speed will correspond to the cars seen by the camera.
The best mounting strategy will vary from situation to situation. For our prototype, we used an acrylic mounting plate from this Sixfab product and a helping hand to allow us to freely position the camera and radar. We used an f-clamp to secure the helping hand to a railing.
We did not bother weatherproofing our prototype, but you will of course want to protect your hardware from the elements if you plan to deploy this project unattended in the field.
The resolution, 120x120, is deliberately low in order to keep the model small enough to run on the board's MCU, but feel free to tune it to your liking.
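On the OpenMV side, a 120x120 capture is typically achieved by windowing a larger frame. Here's a minimal sketch of the sensor setup (the base frame size and pixel format are assumptions — match them to whatever your capture script actually uses):

```python
import sensor

sensor.reset()                       # initialize the camera sensor
sensor.set_pixformat(sensor.RGB565)  # color images for the model
sensor.set_framesize(sensor.QVGA)    # 320x240 base frame
sensor.set_windowing((120, 120))     # crop to the 120x120 model input
sensor.skip_frames(time=2000)        # let the sensor settle after config
```

Using `set_windowing` rather than a tiny native frame size keeps the field of view centered while still handing the model a small input.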
Click the Start button in the bottom left corner, below the Connect button. At this point, you should see a live stream of images coming into the Frame Buffer window.
Using the Frame Buffer, position your camera so that it's looking at the roadway where you want to detect cars:
In this image, we've used the Zoom button above the Frame Buffer pane to blow up the image, and we've resized the various panes to maximize the Frame Buffer. This view is more pleasant to look at while collecting data than the actual 120x120 image.
Click the New Class Folder button on the left-hand toolbar and enter "car" for the class name.
In the Dataset Editor pane, click car.class. This will enable the Capture Data button in the left-hand toolbar.
When a car enters the shot, click the Capture Data button or use the hotkey Ctrl+Shift+s. This will capture whatever's in the Frame Buffer and save it as a .jpg file in the car.class folder:
We recommend collecting data like this until you've got around 100 images. Using less data than that is likely to result in poor model performance.
Once you've signed in, click "Create new project", enter a project name, and click "Create new project" again to confirm.
From here, follow Edge Impulse's "Detect objects with centroids" tutorial. This tutorial will guide you through labeling your data, specifying the deep learning model, training it, and testing its performance.
If you're unhappy with the performance of the model, we have two suggestions:
Collect more training data.
Try out some of the "Expert mode tips" from this Edge Impulse article. Specifically, we found that increasing the object_weight parameter from 100 to 1000 was extremely helpful in getting better model performance.
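To see why object_weight helps: FOMO-style models classify every cell of an image grid, and the vast majority of cells are background, so an unweighted loss lets the model do well by predicting "background" everywhere. Here's a rough pure-Python illustration of the idea (not Edge Impulse's actual training code):

```python
import math

def weighted_bce(y_true, y_pred, object_weight=100.0, eps=1e-7):
    """Binary cross-entropy where object cells (label 1) count
    object_weight times more than background cells (label 0)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)       # avoid log(0)
        w = object_weight if t == 1 else 1.0  # up-weight object cells
        total += -w * (t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

# Missing a car in one cell while nailing 99 background cells:
labels = [1] + [0] * 99
preds  = [0.1] + [0.01] * 99
# With a larger object_weight, that single miss dominates the loss,
# pushing training to fix it instead of ignoring it.
```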
To get the note-python files onto the MCU, use the setup_board.py script. First, you must identify the MCU's serial port. On Linux, it'll typically be something like /dev/ttyACM0. You can run ls /dev/ttyACM* before and after plugging the board in to figure out the serial port's associated file. Once you have that, run python setup_board.py <serial port>, replacing <serial port> with your serial port. This script does a few things:
Clones note-python from GitHub.
Creates the /lib and /lib/notecard directories on the MCU.
Copies the .py files from note-python/notecard on your development machine to /lib/notecard on the MCU.
Lists the contents of /lib/notecard so you can verify that everything was copied over.
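The "check before and after plugging in" step can be scripted. Here's a small Linux-only helper mirroring the `ls /dev/ttyACM*` check (the glob pattern is an assumption — adjust it for your OS):

```python
import glob

def find_acm_ports(pattern="/dev/ttyACM*"):
    # Equivalent of `ls /dev/ttyACM*`: list candidate serial devices.
    return sorted(glob.glob(pattern))

# Run once with the board unplugged and once plugged in; the port that
# appears only in the second listing is the one to pass to setup_board.py.
```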
main.py loops infinitely, grabbing an image from the camera and running it through your model. The output of the model is a probability value in the range [0, 1], with 0 corresponding to a 0% probability of a car being in the image and 1 corresponding to a 100% probability. If this probability exceeds a configurable threshold, the speed of the car is checked using the radar. If the speed exceeds a configurable speed limit, a Note is added to the speeders.qo Notefile in this format (the speed value here is just an example):

{
  "speed": 45.2,
  "time": 1700072210
}
speed is the detected speed, in miles per hour. time is a Unix timestamp indicating when the event occurred. The OpenMV cam board has an RTC, which main.py initializes by fetching the time from the Notecard with card.time. This allows main.py to timestamp events. Note that on power up, the Notecard will take some time to sync with Notehub. Until this sync happens, the time won't be valid, and main.py will keep trying to set the time before proceeding to its detection loop.
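The per-frame decision, the resulting Note, and the timestamp conversion can be sketched as plain functions. The threshold and speed-limit defaults, the "sync" flag, and the RTC tuple layout below are assumptions based on the description above, not the project's exact code:

```python
import time

def should_report(car_probability, speed_mph,
                  detection_threshold=0.5, speed_limit_mph=25.0):
    # A Note is only sent when the model is confident a car is present
    # AND the radar-reported speed exceeds the configured limit.
    return car_probability > detection_threshold and speed_mph > speed_limit_mph

def build_speeder_note(speed_mph, unix_time):
    # note.add request for the Notecard; the body matches the format above.
    # "sync": True (an assumption) pushes the Note to Notehub immediately.
    return {
        "req": "note.add",
        "file": "speeders.qo",
        "sync": True,
        "body": {"speed": speed_mph, "time": unix_time},
    }

def rtc_tuple_from_unix(ts):
    # Convert a Unix timestamp (e.g. from the Notecard's card.time request)
    # into the 8-tuple that MicroPython's machine.RTC().datetime() expects:
    # (year, month, day, weekday, hours, minutes, seconds, subseconds).
    t = time.gmtime(ts)
    return (t.tm_year, t.tm_mon, t.tm_mday, t.tm_wday + 1,
            t.tm_hour, t.tm_min, t.tm_sec, 0)
```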
At this point, you could copy main.py onto the MCU, reboot the MCU, and the code would start running. However, you won't be able to see the detections (or mis-detections). Instead, use OpenMV IDE to run the code.
Open OpenMV IDE.
Click the Open File icon in the left-hand toolbar and select main.py.
Click the Connect button in the bottom left corner.
Click the Start button below the Connect button.
The code is now running on the MCU, and you'll be able to see what the camera sees in the Frame Buffer. When the model detects a car, the code will draw a circle at the point of the detection in the Frame Buffer.
This is super valuable for evaluating the performance of the model in real time. Additionally, you can monitor serial logs by clicking Serial Terminal at the bottom of the IDE window. If a car is detected and it's going over the speed limit, you'll see logging indicating that a speeders.qo Note was added:
Note that for pyboard.py to work, you'll need pyserial installed; run pip install pyserial if you don't have it already. Make sure to replace <serial port> with your serial port. Unplug the camera board's micro USB cable and plug it back in to reboot the device. main.py will start running after boot up.
At this point, the development PC is no longer required, and you can unplug the micro USB cable.
To see the speeders.qo Notes on Notehub, navigate to your project page and click the Events tab on the left-hand side. Here, you should see speeders.qo Notes with their speeds and timestamps in the Body column.