This project demonstrates how to control a NeoPixel LED ring using hand gestures detected through a camera. By combining the DFRobot Unihiker SBC, a WS2812-16 RGB LED ring, and a PC/Laptop running Python, you will build a system in which hand movements control the LED color and the number of lit LEDs.
This project aims to:
1. Teach how to integrate hand gesture recognition with hardware.
2. Illustrate the use of UDP communication between a PC/Laptop and Unihiker.
3. Inspire creative extensions using Python and hardware.
Connect the Unihiker to the WS2812-16 RGB LED Ring:
1. Ensure the Unihiker is powered off before beginning assembly.
2. Locate the PH2.0-3P interfaces on both the Unihiker (P23) and the LED ring (IN).
3. Use the PH2.0-3P cable to connect the P23 interface on the Unihiker to the IN interface on the LED ring.
4. Ensure a secure connection for stable data transmission.
5. Connect Unihiker (via USB cable) to your PC/Laptop.
Create a folder named HandGestureNeoPixel on your PC/Laptop. Inside this folder, create the following three files:
- requirements.txt
- hand_gesture.py
- main.py
Copy the code provided below into the respective files.
Content of file: requirements.txt
setuptools==75.8.0
wheel==0.45.1
numpy==2.2.1
opencv-python==4.10.0.84
mediapipe==0.10.20
Content of file: main.py
Note: Have a look at the constants NEOPIXEL_PIN, NEOPIXEL_NUMBER, INTERFACE_IP4 and UDP_PORT. Adjust the respective values as needed.
from signal import signal, SIGINT
from socket import socket, AF_INET, SOCK_DGRAM
from re import fullmatch
from types import FrameType
from typing import Optional
from pinpong.board import Board, Pin, NeoPixel
NEOPIXEL_PIN: int = Pin.P23
NEOPIXEL_NUMBER: int = 16
INTERFACE_IP4: str = '0.0.0.0'
UDP_PORT: int = 12345
MIN_PX_DISTANCE: int = 50
MAX_PX_DISTANCE: int = 160
DEFAULT_BRIGHTNESS: int = 75
def signal_handler(sig: int, frame: Optional[FrameType]) -> None:
"""
Handles incoming signals and performs necessary cleanup before exiting.
:param sig: Signal number received.
:type sig: int
:param frame: Current stack frame when the signal was caught.
:type frame: Optional[FrameType]
:return: None
"""
_ = frame
print(f'[INFO] Signal handler: {sig} triggered. Exiting...')
raise KeyboardInterrupt
def set_brightness(neopixel: NeoPixel, brightness: int) -> None:
"""
Sets the brightness level of a NeoPixel object.
:param neopixel: An instance of NeoPixel object.
:type neopixel: NeoPixel
:param brightness: An integer representing the brightness, within the range of 0 to 100.
:type brightness: int
:return: None
"""
if 0 <= int(brightness) <= 100:
neopixel.brightness(int(brightness))
def set_led(neopixel: NeoPixel, number: int, led_color: tuple) -> None:
"""
Light up the specified number of LEDs in a NeoPixel object.
:param neopixel: An instance of NeoPixel object.
:type neopixel: NeoPixel
:param number: The number of LEDs to light up.
:type number: int
:param led_color: A tuple representing the RGB color of the LEDs.
:type led_color: tuple
:return: None
"""
if 0 <= number <= NEOPIXEL_NUMBER:
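        # light the first `number` LEDs in the chosen color and blank the rest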
for i in range(NEOPIXEL_NUMBER):
neopixel[i] = led_color if i < number else (0, 0, 0)
def validate_distance_message(message: str) -> Optional[int]:
"""
Validates and extracts the distance value from a message.
:param message: The received message.
:type message: str
:return: The extracted distance as an integer if valid, else None.
:rtype: Optional[int]
"""
match = fullmatch(pattern=r"Distance:(\d+)", string=message)
if match:
return int(match.group(1))
return None
def validate_color_message(message: str) -> Optional[tuple]:
"""
Validates and extracts the RGB color values from a message.
:param message: The received message.
:type message: str
    :return: The extracted color as a tuple of integers (reversed to B, G, R) if valid, else None.
:rtype: Optional[tuple]
"""
match = fullmatch(pattern=r"Color:\(\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\)", string=message)
if match:
r, g, b = map(int, match.groups())
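        # note: the channel order is reversed here; OpenCV on the sender draws in BGR,
        # so reversing makes the LED color match the on-screen swatch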
return b, g, r
return None
def translate_value(number: int) -> int:
"""
Translate the input number into a scaled value based on predefined constants.
:param number: The input value to be translated.
:type number: int
:return: The scaled value after translating the input value using a predefined formula.
:rtype: int
"""
value = max(MIN_PX_DISTANCE, min(MAX_PX_DISTANCE, int(number)))
return int((value - MIN_PX_DISTANCE) * NEOPIXEL_NUMBER / (MAX_PX_DISTANCE - MIN_PX_DISTANCE))
def set_default(neopixel: NeoPixel) -> None:
"""
Sets the default configuration for a NeoPixel object.
:param neopixel: An instance of NeoPixel object.
:type neopixel: NeoPixel
:return: None
"""
set_brightness(neopixel=neopixel, brightness=DEFAULT_BRIGHTNESS)
set_led(neopixel=neopixel, number=0, led_color=(200, 200, 200))
if __name__ == '__main__':
Board("UNIHIKER").begin()
np = NeoPixel(pin_obj=Pin(NEOPIXEL_PIN), num=NEOPIXEL_NUMBER)
set_default(neopixel=np)
distance_number = 0
selected_color = (200, 200, 200)
signal(SIGINT, signal_handler)
sock = socket(AF_INET, SOCK_DGRAM)
sock.bind((INTERFACE_IP4, UDP_PORT))
print(f'[INFO] Listening for UDP messages on interface: {INTERFACE_IP4} port: {UDP_PORT}.')
try:
while True:
try:
data, addr = sock.recvfrom(1024)
udp_message = data.decode('utf-8')
print(f'[INFO] Received message: {udp_message} from IP: {addr[0]} port: {addr[1]}.')
if udp_message.startswith("Distance:"):
distance_value = validate_distance_message(udp_message)
if distance_value is not None:
distance_number = distance_value
else:
print('[WARNING] Invalid Distance message. Skipping...')
continue
elif udp_message.startswith("Color:"):
color_value = validate_color_message(udp_message)
if color_value is not None:
selected_color = color_value
else:
print('[WARNING] Invalid Color message. Skipping...')
continue
else:
print('[WARNING] Invalid message received. Skipping...')
continue
set_led(neopixel=np, number=translate_value(distance_number), led_color=selected_color)
except TimeoutError:
                print('[WARNING] No data received. Continue looping...')
except ValueError:
print('[ERROR] Malformed message received. Skipping...')
continue
except KeyboardInterrupt:
print('[INFO] Keyboard interrupt detected.')
except Exception as err:
print(f'[ERROR] An unexpected error occurred: {err}')
finally:
print('[INFO] Stopping application...')
set_default(neopixel=np)
sock.close()
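To get a feel for the distance-to-LED mapping, here is a quick standalone check of translate_value (a minimal sketch with the constants copied from main.py; no Unihiker or pinpong required):

MIN_PX_DISTANCE = 50
MAX_PX_DISTANCE = 160
NEOPIXEL_NUMBER = 16

def translate_value(number: int) -> int:
    # clamp to the supported pixel range, then rescale linearly to the LED count
    value = max(MIN_PX_DISTANCE, min(MAX_PX_DISTANCE, int(number)))
    return int((value - MIN_PX_DISTANCE) * NEOPIXEL_NUMBER / (MAX_PX_DISTANCE - MIN_PX_DISTANCE))

for px in (30, 50, 100, 160, 200):
    print(f'Distance {px:>3} px -> {translate_value(px):>2} LEDs')

For example, a distance of 100 px maps to int((100 - 50) * 16 / 110) = 7 lit LEDs; anything at or below 50 px yields 0, and anything at or above 160 px lights all 16.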
Content of file: hand_gesture.py
Note: Have a look at the constants TARGET_IP and TARGET_PORT. Adjust the respective values as needed.
from sys import exit
from math import sqrt
from socket import socket, AF_INET, SOCK_DGRAM
from typing import Any, Optional
import numpy as np
import mediapipe as mp
import cv2
TARGET_IP: str = '10.1.2.3'
TARGET_PORT: int = 12345
WINDOW_NAME: str = 'Hand Gesture Detection'
VIDEO_FPS: int = 30
BLACK: tuple = (0, 0, 0)
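# each entry: [color tuple (drawn by OpenCV, which uses BGR order), circle center (x, y), radius in px]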
COLORS: dict = {
'white': [(255, 255, 255), (100, 100), 30],
'red': [(255, 0, 0), (200, 100), 30],
'green': [(0, 255, 0), (300, 100), 30],
'blue': [(0, 0, 255), (400, 100), 30]
}
def send_udp_message(message: str) -> None:
"""
Sends a UDP message to a specified target IP and port.
:param message: The string message to be sent over UDP.
:type message: str
:return: None
"""
sock = socket(AF_INET, SOCK_DGRAM)
sock.sendto(message.encode('utf-8'), (TARGET_IP, TARGET_PORT))
sock.close()
def calculate_distance(point_1: tuple, point_2: tuple) -> float:
"""
Calculates the Euclidean distance between two points in a 2D space.
:param point_1: A tuple representing the (x, y) coordinates of the first point.
:type point_1: tuple
:param point_2: A tuple representing the (x, y) coordinates of the second point.
:type point_2: tuple
    :return: The Euclidean distance between point_1 and point_2 as a float.
:rtype: float
"""
return sqrt((point_2[0] - point_1[0]) ** 2 + (point_2[1] - point_1[1]) ** 2)
def draw_color_selection(video_frame: np.ndarray, color: str, selected: tuple) -> None:
"""
Draws a color selection indicator on the given video frame by placing a circle
of the specified color at a predefined location.
:param video_frame: The video frame on which the circle is drawn.
:type video_frame: np.ndarray
:param color: The key representing the desired color from the COLORS dictionary.
:type color: str
:param selected: A tuple representing the currently selected color.
:type selected: tuple
:return: None
"""
cv2.circle(video_frame, COLORS[color][1], COLORS[color][2], COLORS[color][0], -1)
if selected == COLORS[color][0]:
cv2.circle(video_frame, COLORS[color][1], COLORS[color][2], BLACK, 2)
def is_finger_touching_circle(finger_coords: tuple[int, int], center: tuple[int, int], radius: int) -> bool:
"""
Determines whether a given point, representing finger coordinates, lies within
or on the boundary of a circle with a specified center and radius.
:param finger_coords: A tuple of integers representing the x and y coordinates of the finger.
:type finger_coords: tuple[int, int]
:param center: A tuple of integers representing the x and y coordinates of the circle's center.
:type center: tuple[int, int]
:param radius: An integer representing the radius of the circle.
:type radius: int
:return: A boolean value indicating whether the finger coordinates are within or on the boundary of the circle.
:rtype: bool
"""
distance = sqrt((finger_coords[0] - center[0]) ** 2 + (finger_coords[1] - center[1]) ** 2)
return distance <= radius
def finger_detect(video_frame: np.ndarray, detection: Optional[Any], selected: tuple) -> tuple:
"""
Detect and process finger interactions in a video frame using Mediapipe hand landmarks.
:param video_frame: A video frame represented as a numpy array.
:type video_frame: np.ndarray
:param detection: Object containing the result of a Mediapipe hands detection.
:type detection: Optional[Any]
:param selected: A tuple representing the currently selected color.
:type selected: tuple
:return: Updated selected color after detecting interactions with the left-hand index finger.
:rtype: tuple
"""
    if detection and detection.multi_hand_landmarks and detection.multi_handedness:
for idx, hand_landmarks in enumerate(detection.multi_hand_landmarks):
hand_label = detection.multi_handedness[idx].classification[0].label
if hand_label == "Right":
thumb_tip = hand_landmarks.landmark[mp_hands.HandLandmark.THUMB_TIP]
index_finger_tip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
h, w, _ = video_frame.shape
thumb_tip_coords = (int(thumb_tip.x * w), int(thumb_tip.y * h))
index_finger_tip_coords = (int(index_finger_tip.x * w), int(index_finger_tip.y * h))
distance = calculate_distance(thumb_tip_coords, index_finger_tip_coords)
cv2.circle(video_frame, thumb_tip_coords, 5, selected, -1)
cv2.circle(video_frame, index_finger_tip_coords, 5, selected, -1)
cv2.line(video_frame, thumb_tip_coords, index_finger_tip_coords, BLACK, 2)
text = f'Distance: {int(distance)} px'
cv2.putText(video_frame, text, (thumb_tip_coords[0], thumb_tip_coords[1] - 10),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, COLORS['white'][0], 2)
send_udp_message(message=f'Distance:{int(distance)}')
if hand_label == "Left":
index_finger_tip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
h, w, _ = video_frame.shape
index_finger_tip_coords = (int(index_finger_tip.x * w), int(index_finger_tip.y * h))
cv2.circle(video_frame, index_finger_tip_coords, 10, selected, -1)
for i in COLORS.keys():
if is_finger_touching_circle(index_finger_tip_coords, COLORS[i][1], COLORS[i][2]):
selected = COLORS[i][0]
send_udp_message(message=f'Color:{selected}')
return selected
if __name__ == "__main__":
camera = cv2.VideoCapture(0)
camera.set(cv2.CAP_PROP_FPS, VIDEO_FPS)
if not camera.isOpened():
print("[ERROR] Unable to open camera.")
exit(1)
selected_color = (255, 255, 255)
mp_hands = mp.solutions.hands
hands = mp_hands.Hands(static_image_mode=False,
max_num_hands=2,
min_detection_confidence=0.75,
min_tracking_confidence=0.5)
mp_drawing = mp.solutions.drawing_utils
print("[INFO] Press key q or ESC to quit.")
while True:
ret, frame = camera.read()
if not ret:
print("[ERROR] Cannot read frame")
break
if frame is None or frame.size == 0:
print("[WARNING] Empty frame. Skipping...")
continue
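        # mirror the frame so the preview behaves like a mirror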
frame = cv2.flip(frame, flipCode=1)
rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
results = hands.process(rgb_frame)
for item in COLORS.keys():
draw_color_selection(frame, item, selected_color)
selected_color = finger_detect(frame, results, selected_color)
cv2.imshow(WINDOW_NAME, frame)
key = cv2.waitKey(1) & 0xFF
if key == ord('q') or key == 27:
break
camera.release()
cv2.destroyAllWindows()
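If you want to test hand_gesture.py without the Unihiker attached, a minimal stand-in receiver can run locally instead of main.py (a sketch, assuming you temporarily set TARGET_IP in hand_gesture.py to 127.0.0.1):

# minimal stand-in UDP receiver for local testing (no Unihiker required)
from socket import socket, AF_INET, SOCK_DGRAM

sock = socket(AF_INET, SOCK_DGRAM)
sock.bind(('127.0.0.1', 12345))
print('Waiting for messages... press Ctrl+C to stop.')
try:
    while True:
        data, addr = sock.recvfrom(1024)
        print(f'{addr[0]}:{addr[1]} -> {data.decode("utf-8")}')
except KeyboardInterrupt:
    sock.close()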
Explanation of Python Code Components
UDP Socket Communication
Why UDP?: UDP is used for fast, connectionless communication between the PC/Laptop and the Unihiker. It is well-suited for transmitting real-time data such as distances and colors.
Implementation: In main.py, the Python socket library is used to create a UDP socket on the Unihiker that listens for incoming messages. Messages are decoded and parsed to control the LED ring.
Example: the call sock.recvfrom(1024) receives incoming data on the Unihiker, while the send_udp_message function in hand_gesture.py sends UDP packets containing the distance or color to the Unihiker.
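For illustration, here is a minimal sketch of what goes over the wire (plain UTF-8 text; the IP and port match the constants used in the scripts):

# the two message formats produced by hand_gesture.py:
#   'Distance:<pixels>'  e.g. 'Distance:120'
#   'Color:(R, G, B)'    e.g. 'Color:(255, 0, 0)'
from socket import socket, AF_INET, SOCK_DGRAM

sock = socket(AF_INET, SOCK_DGRAM)
sock.sendto(b'Distance:120', ('10.1.2.3', 12345))
sock.sendto(b'Color:(255, 0, 0)', ('10.1.2.3', 12345))
sock.close()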
MediaPipe and OpenCV for Computer Vision
MediaPipe Hands: This library detects hand landmarks and classifies whether the hand is left or right. It provides precise coordinates for key points like fingertips.
OpenCV: Used to capture video frames from the camera, process them for gesture detection, and display results in a real-time window.
Integration: MediaPipe processes each video frame to identify hand gestures, while OpenCV handles drawing and visual feedback. Detected gestures are mapped to specific actions like sending distance or color updates.
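As a minimal sketch of that division of labor (a stripped-down version of the loop in hand_gesture.py):

# MediaPipe finds the hand landmarks; OpenCV captures, draws and displays.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
camera = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2) as hands:
    while camera.isOpened():
        ret, frame = camera.read()
        if not ret:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            h, w, _ = frame.shape
            for hand in results.multi_hand_landmarks:
                tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                # landmark coordinates are normalized (0..1); scale to pixels
                cv2.circle(frame, (int(tip.x * w), int(tip.y * h)), 8, (0, 255, 0), -1)
        cv2.imshow('Sketch', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
camera.release()
cv2.destroyAllWindows()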
NeoPixel LED Ring Control
Purpose: The NeoPixel class from the pinpong library allows precise control of the LED ring, including brightness and individual LED colors.
Functions: Functions like set_brightness and set_led in main.py are used to adjust LED properties dynamically based on received messages.
Message Parsing: Messages such as Distance:100 or Color:(255,0,0) are validated, parsed, and translated into LED patterns using helper functions.
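A quick standalone check of the Color parsing (regex copied from main.py; no hardware needed):

from re import fullmatch

message = 'Color:(255, 0, 0)'
match = fullmatch(r"Color:\(\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\)", message)
if match:
    r, g, b = map(int, match.groups())
    print((b, g, r))  # prints (0, 0, 255) - channels reversed as in main.py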
Python modules and packages:
- socket: Facilitates communication between the PC/Laptop and the Unihiker via UDP.
- mediapipe: Provides machine learning solutions for real-time gesture detection.
- opencv-python: Handles video capture and frame processing.
- numpy: Used for efficient numerical computations (e.g., color and coordinate processing).
- math: Computes distances between points (e.g., fingertips) for gesture-based actions.
Install Python 3.8 or later on your PC/Laptop. The operating system on the PC/Laptop does not matter. You can download Python from https://www.python.org.
Ensure the requirements.txt file is in the same directory and all dependencies are installed. It is good practice to use a Python virtual environment to avoid conflicts between dependencies; this keeps the project dependencies isolated from the global Python environment. To set up and activate a virtual environment (on Windows, activate with venv\Scripts\activate instead of the source command):
# create Python virtualenv
$ python3 -m venv venv
# activate virtualenv
$ source venv/bin/activate
# install required modules and packages
$ pip3 install -r requirements.txt
# verify Python packages (optional)
$ pip3 freeze
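Optionally, a quick sanity check that the key packages import correctly (output depends on the installed versions):

# verify imports (optional)
$ python3 -c 'import cv2, mediapipe, numpy; print(cv2.__version__, mediapipe.__version__, numpy.__version__)'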
The good news: the Unihiker ships with the required Python packages pre-installed, so no additional setup is needed on the device. You can choose any protocol to upload the folder (HandGestureNeoPixel) and file (main.py) onto the Unihiker, for example FTP, SMB or SCP.
Note: The default username is root, and the default password is dfrobot. For more details on uploading code to the Unihiker, refer to the Online Documents.
Here is an example using SCP:
# copy folder and files to Unihiker
$ scp -r HandGestureNeoPixel/ root@10.1.2.3:/root/
# verify upload (optional)
$ ssh root@10.1.2.3 -C 'ls -la /root/HandGestureNeoPixel/'
# delete unused files from Unihiker (optional)
$ ssh root@10.1.2.3 -C 'rm /root/HandGestureNeoPixel/requirements.txt && rm /root/HandGestureNeoPixel/hand_gesture.py'
Running the Code on the Unihiker
You can start the Python code via the Touch Display or from the command line. Here is the command-line example:
# SSH into Unihiker
$ ssh root@10.1.2.3
# change directory (optional)
$ cd /root/HandGestureNeoPixel/
# execute Python script
$ python3 /root/HandGestureNeoPixel/main.py
Running the Code on the PC/Laptop
This step assumes the virtual environment is activated, all Python modules/packages are installed, and you are in the correct directory!
# execute Python script
$ python3 hand_gesture.py
If you have done everything successfully, you should see a new window on your PC/Laptop and be able to control the LEDs with your hands. The LED ring changes the number of lit LEDs (based on the distance between the thumb and index fingertip of your right hand) and the color (based on the swatch you select with your left index finger).
Here are some ideas for extending the project:
1. Use MediaPipe to detect facial expressions and trigger corresponding LED patterns or other outputs.
2. Utilize hand gestures to control multiple sensors and actuators on the Unihiker, such as temperature sensors, motors, or servo mechanisms.
3. Create an interactive LED wall that responds to hand gestures with animations and colors.
4. Use gesture recognition to direct collaborative robots in industrial or service tasks.
This tutorial has outlined the steps to create and deploy a gesture-controlled LED ring project. With these foundations, you can explore creative ways to use Python and hardware in tandem.