TinyML: Building an Intelligent Bird Identification Recording Device with Edge Impulse
1. Project Introduction
1.1 Project Overview
A bird flashes by while you are strolling through the mountains and forests, and you wish you knew its name but can only sigh in frustration. Traditional birdwatching relies on flipping through field guides, which is inefficient and makes it easy to miss wonderful moments.

Now, here comes the intelligent bird identification recording device built on the UNIHIKER K10! Using an object detection model that you train yourself on the Edge Impulse platform, combined with TinyML technology, the device automatically identifies birds in real time and frames their positions on screen. You can quickly check statistics on the number and species of birds you have observed, making birdwatching smarter and more convenient and opening up a brand-new experience for every birdwatching enthusiast!
1.2 Project Functional Diagrams

1.3 Project Video
2. Materials List
2.1 Hardware list
2.2 Software
(1) Arduino IDE 1.8.19
Download link: https://www.unihiker.com/wiki/K10/GettingStarted/gettingstarted_arduinoide/
Note: UNIHIKER K10 currently only supports Arduino IDE 1.8.19 or below. We will add support for Arduino IDE V2.x in subsequent releases.
(2) Mind+ Graphical Programming Software (Minimum Version Requirement: V1.8.1 RC1.0)
2.3 Basic Software Usage: UNIHIKER K10 with Arduino IDE
First, we need to install the UNIHIKER SDK in the Arduino IDE.
Open the Arduino IDE, go to "File -> Preferences", and set "Compiler warnings" to "None".
As shown below, click the button to add the SDK URL ("Additional Boards Manager URLs"): copy and paste the URL into the box, then click "OK" to save.
SDK URL Link: https://downloadcd.dfrobot.com.cn/UNIHIKER/package_unihiker_index.json
Then, open "Tools -> Board -> Boards Manager".
Search for "unihiker" and click the "Install" button to install the UNIHIKER K10 SDK.
UNIHIKER K10 will then appear in your Arduino IDE.
Note: For more instructions on using the Unihiker K10 via the Arduino IDE, please refer to the official wiki: https://www.unihiker.com/wiki/K10/GettingStarted/gettingstarted_arduinoide/
Note: The Arduino IDE SDK for UNIHIKER K10 is still in beta, so there may be some API bugs. If you find a bug, you can send an email to the UNIHIKER team at [email protected] and we will fix it as soon as possible. We will release the official version in the near future.
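To confirm that the SDK is installed correctly, you can compile and upload a short test sketch. The example below is a minimal sketch that only uses API calls which also appear in the full program later in this tutorial; if it compiles and shows a line of text on the K10 screen, the board package is working.

#include "unihiker_k10.h"

UNIHIKER_K10 k10;

void setup() {
  k10.begin();            // Initialize the UNIHIKER K10 board
  k10.initScreen(2);      // Set the screen orientation (same value used later in this project)
  k10.creatCanvas();      // Create a drawing canvas
  // Draw a short message; the argument list follows the canvasText calls used later in this tutorial
  k10.canvas->canvasText("UNIHIKER K10 ready", 10, 10, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
  k10.canvas->updateCanvas();
}

void loop() {
}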
3. Construction Steps
Now, let's start building this project. It consists of three main tasks:
(1) Connect the computer and the UNIHIKER K10 using Mind+ and the webcam library, then collect and save a bird image dataset.
(2) Annotate the data and train the object detection model on the Edge Impulse platform.
(3) Deploy the trained model to the UNIHIKER K10 via the Arduino IDE to complete the intelligent bird identification device.
3.1 Task 1: Collect and save bird image datasets via UNIHIKER K10
(1) Software and Hardware Preparation
Double-click to open Mind+; the following screen will appear.

Click to switch to offline mode.

Next, click "Extensions", find the "UNIHIKER K10" module under the "Board" category, and click to add it. After clicking "Back", you will find the UNIHIKER K10 blocks in the Command Area, which completes the loading of the UNIHIKER K10.

Then, use a USB cable to connect the UNIHIKER K10 to the computer.

Then click "Connect Device" and select "COM7-UNIHIKER K10" to connect.

Note: The device name may vary between UNIHIKER K10 boards, but it always ends with "K10".
(2) Turn on the webcam through graphical programming
In the programming interface of Mind+, switch to the "Graphical Programming" mode. First, click the "Extensions" button at the bottom of the software. In the pop-up extension library list, find the "Internet" category, expand it, locate the "WIFI" library, and click to load it. This will enable network connectivity for Unihiker K10.

Next, find the "User-Ext" (User Extensions) option. Click to enter the custom extension library addition interface. In the input box, type "https://github.com/YeezB/K10webCam", then click the "Load" button. The system will automatically download and load the K10webcam library, which enables the webcam network transmission function of Unihiker K10.

After the library is loaded, start building the program. Drag the following blocks from the left module area to the programming workspace, as shown below. The camera on the Unihiker K10, combined with its networking capabilities, can stream the camera feed to the local area network in real time, allowing any computer within the LAN to access the footage conveniently. This provides great flexibility for collecting bird images.

After completing the program, click the "Upload" button on the toolbar to transfer the program to Unihiker K10. Please wait patiently during the upload process until the software indicates completion.

Once uploaded, the screen of Unihiker K10 will display an IP address similar to "192.168.9.177", which is the device's address on the local network. Ensure your computer is connected to the same WiFi network as Unihiker K10. Open a web browser (such as Chrome or Firefox) on your computer, type the displayed IP address into the address bar, and press Enter.

The browser will open a web interface. On this interface there is a "Capture" button. When you see an image worth collecting in the frame, click this button and the system will automatically save the current frame to your computer.
(3) Collect and save bird images
Hold up the Unihiker K10 so that the camera is aimed at the bird, ensuring the bird is fully visible and clear in the frame. On the webcam interface you will see a "Take a photo and download it" button. When you spot a bird to be collected, press this button; the system will capture the current image and save it to your computer. Take approximately 30-50 photos for each bird species.

Organize all images into a folder named "bird_dataset" for subsequent model training.

3.2 Task 2: Annotate data and train the object detection model
(1) Upload the image dataset to the Edge Impulse platform.
Log in to the Edge Impulse official website at: https://edgeimpulse.com/
Note: You will need to register an Edge Impulse account on the website if you do not already have one.
Click "Create a new project" and enter the project name, such as "birds_detection".

Click "Add existing data", then select "Upload data".

Click "select a folder", find and click the "choose files"button, and find the "bird_dataset" folder that we collected and organized earlier., then click "Upload"ć

Select "Automatically split between training and testing.Finally, click "Upload data" to start uploading the dataset.

(2) Data annotation
After the data upload is complete, click "Data acquisition -> Labeling queue" to enter the data annotation page. On the annotation page, use the mouse to draw a rectangular box around the bird in the image so that it completely encloses the bird. After drawing the box, enter the name of the bird (such as "blackbird", "pigeon", etc.) in the pop-up label input box, then click "Add" to add the label. If there are multiple birds or different bird species in one image, repeat this process to label each bird separately.

After completing the annotation, click the "Next" button to switch to the next image and continue annotating until all images are annotated.

(3) Train the bird identification model.
After completing the dataset annotation, click "Impulse design -> Create impulse", then click the "Save Impulse" button.

For the meaning of each processing block, refer to https://docs.edgeimpulse.com/docs/edge-impulse-studio/processing-blocks
After saving, click the "Image" processing block to enter its parameters page, then click "Save parameters".

Enter the "Generate features" page and click the "Generate features" button to generate image features.

Click "Object detection" to enter the model training page. On the training page, we can set model training-related hyperparameters such as the number of training epochs, learning rate, etc. Here, set the number of training epochs to 100 and the learning rate to 0.001. Then, select the model framework ā here we choose the FOMO MobileNetV2 framework ā and click the "Save & train" button to start training the model. When training is complete, the training results and related data can be viewed on the right side.

After training, you can view the model performance. If the performance is unsatisfactory, go to the "Retrain model" page, select "Train model", adjust the parameters, and retrain the model.

3.3 Task 3: Deploy and apply the bird identification model on UNIHIKER K10
(1) Deploy the bird identification model
After the model training is complete, click "Deployment" to enter the model download page. Click "DEFAULT DEPLOYMENT", select the "Arduino library" option, choose "TensorFlow Lite" for "MODEL OPTIMIZATIONS", and then click "build".

When the model build is complete, the packaged model compressed file will be downloaded automatically.

Unzip the downloaded model library into the arduino-1.8.19\libraries folder of the Arduino IDE, as shown in the figure below.

Open the birds_detection_inferencing folder and replace the depthwise_conv.cpp and conv.cpp files under src\edge-impulse-sdk\tensorflow\lite\micro\kernels with the versions provided in the appendix at the end of this article.

Click this link: https://github.com/cdjq/edgeImpulse_vision to download the edgeImpulse_vision library adapted for UNIHIKER K10.

Unzip the edgeImpulse_vision-main folder into the arduino-1.8.19\libraries directory.

(2) Import the bird identification model
Locate the "user_include" file under the path \arduino-1.8.19\libraries\edgeImpulse_vision-main\src and double-click to open it.

In the "user_include" file, replace the existing model #include line with the inferencing header exported for this project, as shown in the example below.
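The Arduino library exported by Edge Impulse names its header after the project, so for the "birds_detection" project used in this tutorial (matching the birds_detection_inferencing folder above) the file would contain a line like the following; adjust the name if your project is called something else.

// user_include: pull in the inferencing header exported with your Edge Impulse Arduino library
#include <birds_detection_inferencing.h>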

(3) Write inference code with Arduino IDE
Use a USB data cable to connect the computer to the UNIHIKER K10.

Double-click Arduino IDE. Click "File" in the top-left corner, select "Open", and locate the edgeimpulse.ino file under the path arduino-1.8.19\libraries\edgeImpulse_vision-main\example\edgeimpulse.

After opening, the page displays as follows.

Copy the following code into the file. Please refer to the comments to understand the code functionality.
#include "unihiker_k10.h"
#include "edgeImpulse_vision.h"
// Initialize hardware and vision processing objects
UNIHIKER_K10 k10;
edgelmpulse_vision edfelmpulse;
// Volatile variables to store counts of detected bird species
volatile float mind_n_blackbird, mind_n_turtledove, mind_n_pigeon;
#define MAX_DETECTIONS 10 // Maximum number of objects to detect per frame
sEdgeData detections[MAX_DETECTIONS]; // Array to store detection results
uint8_t screen_dir = 2; // Screen orientation setting
// Function prototype for button B press callback
void onButtonBPressed();
/**
* Setup function - Initializes hardware components and variables
*/
void setup() {
k10.begin(); // Initialize UNIHIKER_K10 board
Serial.begin(9600); // Initialize serial communication
k10.initScreen(screen_dir); // Set screen orientation
k10.initBgCamerImage(); // Initialize background camera image
k10.setBgCamerImage(false); // Disable background camera image initially
k10.creatCanvas(); // Create drawing canvas
//k10.setBgCamerImage(true); // Enable background camera image (commented out)
// Register callback function for button B press event
k10.buttonB->setPressedCallback(onButtonBPressed);
k10.setBgCamerImage(true); // Enable background camera image
// Initialize bird counters
mind_n_blackbird = 0;
mind_n_turtledove = 0;
mind_n_pigeon = 0;
}
/**
* Main loop - Continuously processes camera input and handles user interactions
*/
void loop() {
edfelmpulse.request(); // Request object detection from Edge Impulse
k10.canvas->canvasClear(); // Clear canvas to avoid overlapping drawings
// Retrieve detected objects
int num_detected = edfelmpulse.getAllDetections(detections, MAX_DETECTIONS);
if (num_detected > 0) {
// Process each detected object
for (int i = 0; i < num_detected; i++) {
sEdgeData& carData = detections[i];
// Print detection details to serial monitor
Serial.print(carData.label);
Serial.print(", ");
Serial.print(carData.value);
Serial.print(", x:");
Serial.print(carData.x);
Serial.print(", y:");
Serial.print(carData.y);
Serial.print(", width:");
Serial.print(carData.width);
Serial.print(", height:");
Serial.println(carData.height);
// Draw bounding box (red border, blue interior)
// Note: Coordinates adjusted by +80, potential offset correction needed
k10.canvas->canvasRectangle(carData.x, carData.y + 80, carData.width + 80, carData.height + 80, 0xFF6666, 0x0000FF, false);
// Draw object label (blue text)
k10.canvas->canvasText(carData.label, carData.x, carData.y -20, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
}
} else {
// No objects detected
Serial.println("No objects found");
k10.canvas->canvasText("No objects", 10, 10, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
}
k10.canvas->updateCanvas(); // Update display with new drawings
// Handle button A press
if ((k10.buttonA->isPressed())) {
// Recheck detections for counting
int num_detected = edfelmpulse.getAllDetections(detections, MAX_DETECTIONS);
bool updated = false;
// Count detected bird species
for (int i = 0; i < num_detected; i++) {
sEdgeData& obj = detections[i];
if (String(obj.label) == "blackbird") {
mind_n_blackbird += 1;
updated = true;
}
if (String(obj.label) == "turtledove") {
mind_n_turtledove += 1;
updated = true;
}
if (String(obj.label) == "pigeon") {
mind_n_pigeon += 1;
updated = true;
}
}
// Show confirmation message if counts were updated
if (updated) {
k10.canvas->canvasClear();
k10.canvas->canvasText("recorded", 10, 10, 0x00AA00, k10.canvas->eCNAndENFont24, 50, false);
k10.canvas->updateCanvas();
delay(1000); // Display message for 1 second
k10.canvas->canvasClear();
k10.canvas->updateCanvas();
}
}
delay(100); // Control refresh rate (10 FPS)
}
/**
* Callback function for button B press event
* Displays counted bird species for 5 seconds
*/
void onButtonBPressed() {
// Display counts for each bird species
k10.canvas->canvasText((String("pigeonļ¼") + String(mind_n_pigeon)), 0, 0, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
k10.canvas->canvasText((String("turtledove") + String(mind_n_turtledove)), 0, 30, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
k10.canvas->canvasText((String("blackbird") + String(mind_n_blackbird)), 0, 60, 0x0000FF, k10.canvas->eCNAndENFont24, 50, false);
k10.canvas->updateCanvas();
delay(5000); // Display for 5 seconds
k10.canvas->canvasClear(); // Clear canvas after timeout
}
Note: For more instructions on using the Unihiker K10 via the Arduino IDE, please refer to the official wiki: https://www.unihiker.com/wiki/K10/GettingStarted/gettingstarted_arduinoide/
4. Upload the Program and Observe the Effect
Keep the computer connected to the UNIHIKER K10. Click "Tools", set "Board" to "unihiker k10", "Upload Speed" to "921600", and enable "USB CDC on Boot".

Then click the "Upload" button to compile and upload the code to the UNIHIKER K10.


Note: The first compilation of Edge Impulse-related programs on the UNIHIKER K10 may take longer. Please wait patiently.
When the upload is successful, the page will display as follows.

Pick up the UNIHIKER K10 and point its onboard camera at bird images. Observe the recognition results on the UNIHIKER K10 screen, which will display the bird names and outline the birds' positions in the image with bounding boxes.

Press and hold the A button on the UNIHIKER K10 for 1 second to record the observed bird name into the device. When you release the button, the screen will display "recorded" to indicate successful recording.

Press the B button to view all identified bird species and their corresponding counts.

Click "Serial Monitor" in "Tool" to view the bird recognition results of the UNIHIKER K10 in real time.

5. Knowledge Hub
5.1 What is Object Detection?
Object detection is a core technology in the field of computer vision. It can identify the positions of specific objects in images or videos, mark them with bounding boxes, and assign corresponding category labels to each object.

Different from simple image classification, object detection can answer not only "What is this?" but also "Where is it?". In our bird identification project, object detection technology enables the device to automatically locate and distinguish different birds in the frame.
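In this project, each detected object is handed to the program as a record that answers both questions at once. As a conceptual sketch (not the actual sEdgeData definition from the edgeImpulse_vision library), such a detection result can be pictured like this:

// Conceptual sketch of one object detection result (illustration only)
struct Detection {
  const char* label;    // "What is this?" - the class name, e.g. "blackbird"
  float value;          // confidence score of the prediction
  int x, y;             // "Where is it?" - top-left corner of the bounding box
  int width, height;    // size of the bounding box
};
// A plain image classifier would return only label and value;
// a detector returns one such record per object found in the frame.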
5.2 What is TinyML?
TinyML (Tiny Machine Learning) is a technology that deploys machine learning models to small resource-constrained devices such as microcontrollers and sensors.

Different from traditional AI that requires powerful computing resources, TinyML enables lightweight models to run efficiently on edge devices through model compression, quantization, and optimization, achieving local real-time processing and significantly reducing latency and power consumption. In our project, TinyML technology allows Unihiker K10 to quickly identify birds without internet connection, truly realizing "edge intelligence".
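As a small illustration of one of these optimizations, 8-bit quantization maps each 32-bit floating-point weight to an 8-bit integer using a scale and a zero point, which shrinks the model roughly four-fold and lets the microcontroller use fast integer arithmetic. The sketch below shows the idea; the scale and zero point values in the example are made up, not taken from this project's model.

#include <cstdint>
#include <cmath>

// Affine 8-bit quantization: real value is approximately (q - zero_point) * scale
int8_t quantize(float x, float scale, int zero_point) {
  int q = (int)std::lround(x / scale) + zero_point;
  if (q < -128) q = -128;   // clamp to the int8 range
  if (q > 127) q = 127;
  return (int8_t)q;
}

float dequantize(int8_t q, float scale, int zero_point) {
  return ((int)q - zero_point) * scale;
}

// Example: with scale = 0.02 and zero_point = 0, the weight 0.37 is stored as 19
// (0.37 / 0.02 = 18.5, rounded to 19) and recovered as 19 * 0.02 = 0.38,
// which is close to the original value.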









