1. Project Introduction
Driving is full of small distracting moments. You merge onto the highway and your phone suddenly rings: reaching for it risks drifting out of your lane, but ignoring it might mean missing an emergency. You enter a tunnel in the evening and want to brighten the interior lights, but the adjustment button is out of reach, and fumbling for it while driving feels unsafe. This project builds an in-car assistant that solves these little troubles without you touching any buttons: everything is operated through hand gestures. When your phone rings, make an "OK" sign in front of the dashboard instead of looking down for the phone, and the call is answered immediately, keeping things safe and convenient. At night, to create a softer atmosphere, simply move your hand to the left and the lights dim, so you never have to fumble for the knob.
Even better, this little assistant can be extended to suit your own needs. For the air conditioner, map an extended index finger to raising the temperature by 1 degree and a clenched fist to lowering the fan speed, and climate control becomes a matter of gestures. For music, map a palm swipe to the left or right to the previous or next song and you never have to press the playback buttons again. From collecting data on target gestures, through incremental model training and inference, to data processing and command execution, the complete workflow shows how easily AI technology can be brought into daily life, making gesture recognition tangible, interactive, and fun!
Demo Video
2. Project Implementation Principle
The core logic works as follows. First, the HuskyLens 2 camera collects and learns gesture data, recording the features of target gestures such as "OK" through the "data updating → incremental learning" process. When you need to answer a call while driving, HuskyLens 2 captures your gesture in real time and runs model inference to recognize the gesture type. The UNIHIKER K10 then receives the recognition result from HuskyLens 2 and acts on it: if it is an "OK" gesture, it turns the LED strip green (imitating answering the phone); if the gesture is for adjusting the light, it calculates a light intensity from the X position of your hand and sets the strip's brightness accordingly.
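As a rough illustration of that last step, the brightness can be obtained by linearly mapping the hand's X coordinate onto the LED brightness range. The snippet below is a minimal sketch only, assuming the camera reports X within a 320-pixel-wide frame and the strip accepts a 0-255 brightness value; neither number comes from the project files.

```cpp
// Minimal sketch of the X-position-to-brightness mapping.
// Assumed values: 320-pixel frame width, 0-255 brightness range.
int xToBrightness(int x, int frameWidth = 320) {
  if (x < 0) x = 0;                    // clamp to the visible frame
  if (x > frameWidth) x = frameWidth;
  return (long)x * 255 / frameWidth;   // far left -> dim, far right -> bright
}
```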

3. Hardware and Software Preparation
3.1 Equipment List
Note: The HuskyLens used in this project must be the HuskyLens 2 version.
3.2 Hardware Connection
Make connections by referring to the diagram below.

3.3 Software Preparation
Download the Mind+ installation package (Version 2 or above) from the official website and install it. Double-click the Mind+ icon to open it after installation.

4. Project Making
4.1 HuskyLens 2
First, select the protocol type for HuskyLens 2.
Tap System Settings -> Protocol Type -> Select I2C communication mode, then return to the main menu interface.

Second, swipe the screen to find the "Hand Recognition" function.

When HuskyLens 2 is pointed at a scene containing a hand, it detects the palm and displays a white box around every palm in the image, with 21 key points marked on each palm as white dots.
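To give a sense of the data behind that display, each detected hand can be thought of as a bounding box plus a list of 21 (x, y) landmark coordinates. The structure below is purely illustrative; the field names are assumptions, and in this project the values are read through the HuskyLens 2 extension blocks rather than a custom type.

```cpp
// Illustration only: how one detected hand could be represented in code.
// Field names are assumptions, not the actual HuskyLens 2 library types.
struct Point {
  int x;   // pixel coordinates of one landmark
  int y;
};

struct HandResult {
  int   boxX, boxY, boxW, boxH;  // the white bounding box around the palm
  Point keypoints[21];           // the 21 landmarks drawn as white dots
};
```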

After configuring the parameters and selecting the mode in HuskyLens 2, the next step is to train the model to recognize the "OK" gesture for triggering phone answer operations. The steps are as follows:
Align HuskyLens 2 with the palm to be learned, adjust the viewing angle so that the plus sign in the center of the screen is within the white box, then press the top-right Button A to learn this gesture.

Once learned, whenever the pre-learned gesture is detected, the screen encloses it in a colored box and displays the gesture name, ID number, and confidence level at the top, for example "Gesture: ID1 77%".

For more detailed usage of HuskyLens 2, please refer to the following URL:
https://wiki.dfrobot.com/_SKU_SEN0638_Gravity_HUSKYLENS_2_AI_Camera_Vision_Sensor
After learning the hand gesture in HuskyLens 2, you can start writing code to implement a vehicle smart assistant based on hand recognition.
4.2 Programming
Open the programming software Mind+, choose "Coding" mode, then click "Upload" to create a new project.

Next, add the required extensions in Mind+, including the UNIHIKER K10 board, HuskyLens 2, and the LED Strip.
Enter the "Extensions" page, switch to "Board" tab, search for"K10", and click "UNIHIKER K10".



Load the "HuskyLens 2 AI Camera" and "LED Strip" library by the same way from "Module" page.


Click the "Back" button to return to the programming interface.

Click the "Connect Devices", choose your device and "Connect".




After the device is successfully connected, write the program as follows:

The analysis of the core code is as follows:

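If you only want the gist of the program while studying the blocks above, the C++ sketch below mirrors the same decision flow. It is a hedged sketch, not the code Mind+ generates: GESTURE_OK_ID, FRAME_WIDTH, and the four helper functions are assumptions standing in for the HuskyLens 2 and LED strip extension blocks.

```cpp
// Hedged C++ sketch of the loop logic only; NOT the generated Mind+ code.
// The constants and helpers below are assumptions that stand in for the
// HuskyLens 2 and LED strip extension blocks.
#include <Arduino.h>

const int GESTURE_OK_ID = 1;    // assumption: the "OK" gesture was learned as ID 1
const int FRAME_WIDTH   = 320;  // assumption: HuskyLens 2 frame width in pixels

// --- placeholders for the extension calls (stubbed so the sketch compiles) ---
int  readGestureId()  { return -1; }              // learned gesture ID, -1 if none
int  readGestureX()   { return FRAME_WIDTH / 2; } // X of the detected hand's box
void setStripColor(uint8_t r, uint8_t g, uint8_t b) { /* drive the LED strip */ }
void setStripBrightness(uint8_t level)               { /* brightness 0-255 */ }
// -----------------------------------------------------------------------------

void setup() {
  // Initialization of the K10 board, HuskyLens 2 (I2C), and LED strip goes here.
}

void loop() {
  int id = readGestureId();
  if (id == GESTURE_OK_ID) {
    setStripColor(0, 255, 0);                 // "OK" -> green, imitating call answered
  } else if (id != -1) {
    int x = readGestureX();                   // any other learned gesture adjusts light
    int level = (long)x * 255 / FRAME_WIDTH;  // left = dimmer, right = brighter
    setStripBrightness(level);
  }
  delay(100);                                 // poll about 10 times per second
}
```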
There is a complete program file for this project in the attachment. (Note: The .mp file is compatible with both Mind+ v1.x (e.g., v1.8.1) and v2.x (e.g., v2.0), while the .mpcode file only works with Mind+ v2.0)
Click "Project" -> "Open Local File" to load the project.


Select the project in the attachment and click "Open".

Click "Upload" to run the program.

The effect is as follows:

5. Attachment









