DIY Smart Interactive Eyes: Human Tracking & Blink Animation with ESP32-C5 (Arduino Tutorial)

Discover how to build an engaging Interactive Eyes project with this complete DIY electronics tutorial. This guide provides step-by-step instructions for creating a smart display where animated eyes realistically track human movement, blink, and react to proximity. Learn to integrate the powerful FireBeetle ESP32-C5 microcontroller with a 3.5" TFT Capacitive Touch Display and an advanced Matrix Laser ToF Ranging Sensor. We cover the full hardware connection, Arduino code setup, and libraries (DFRobot_matrixLidarDistanceSensor, DFRobot_GDL) needed to bring this advanced human-computer interaction (HCI) project to life. This is the perfect Arduino project for anyone interested in object tracking, custom animations, and smart hardware.

 

Interactive Eyes Program and User Guide

 

A pair of eyes is displayed on a screen. A matrix laser sensor detects the position and distance of a person (or object). The eyes move left and right, following the person's position, and can also open or close based on the person's distance.

 

 

Hardware Platform

HARDWARE LIST
1 FireBeetle ESP32-C5
1 FireBeetle ESP32-C5-Expansion Board
1 3.5" TFT Capacitive Touch Display
1 Matrix Laser ToF Ranging Sensor

Programming Platform

- Arduino

STEP 1. Hardware Connection

Connection between Main Controller (ESP32-C5) and 3.5" Screen

Connection Method 1 (Recommended: convenient and takes up less space)

- Use an FPC 0.5-18PIN reverse connection cable.

Connection Method 2

- Use DuPont wires.

Main Controller (ESP32-C5 + Expansion Board) ----------------- 3.5" Screen

MOSI (24) ------------------------------------ MISO

MISO (25) ------------------------------------ MOSI

SCK (23) ------------------------------------- SCK

GND ------------------------------------------ GND

3V3 ------------------------------------------ VCC

IO8 ------------------------------------------ DC

IO27 ----------------------------------------- CS

IO26 ----------------------------------------- RST

IO15 ----------------------------------------- BL

Connection between Main Controller (ESP32-C5 + Expansion Board) and Matrix Laser Ranging Sensor

Main Controller (ESP32-C5 + Expansion Board) ----------------- Matrix Laser Ranging Sensor

SCL ------------------------------------------ SCL

SDA ------------------------------------------ SDA

3V3 ------------------------------------------ VCC

GND ------------------------------------------ GND

Make sure to double-check the pin numbers and wire orientation when using DuPont wires to avoid reversed connections.
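As an illustration, the screen wiring above maps onto a DFRobot_GDL display object roughly as follows. Note this is a sketch under assumptions: the driver class name (`DFRobot_ILI9488_320x480_HW_SPI`) is assumed based on the ILI9488 controller commonly used on 3.5" 480x320 panels; check the screen's wiki for the exact class for your hardware revision.

```cpp
#include "DFRobot_GDL.h"

// Pin mapping from the wiring table above (ESP32-C5 expansion board side)
#define TFT_DC  8   // IO8  -> DC
#define TFT_CS  27  // IO27 -> CS
#define TFT_RST 26  // IO26 -> RST
#define TFT_BL  15  // IO15 -> BL (backlight)

// MOSI/MISO/SCK use the hardware SPI pins (24/25/23), so they are not
// passed to the constructor. Class name is an assumption; see lead-in.
DFRobot_ILI9488_320x480_HW_SPI display(/*dc=*/TFT_DC, /*cs=*/TFT_CS,
                                       /*rst=*/TFT_RST, /*bl=*/TFT_BL);
```

If the display stays blank after `display.begin()`, the DC/CS/RST pin assignments are the first thing to re-check against the wiring table.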

STEP 2. Interactive Eyes Program and Configuration

Developed using Arduino. The configuration is as follows:

1. Open the Arduino IDE. First, install the DFRobot_matrixLidarDistanceSensor library.

- Link: https://github.com/DFRobot/DFRobot_matrixLidarDistanceSensor.git

2. After downloading the .zip file, rename it to remove the -main suffix.

3. In the Arduino IDE, go to Sketch > Include Library > Add .ZIP Library.... Select the renamed .zip file to add it.

4. In the Arduino IDE, go to File > Examples > DFRobot_matrixLidarDistanceSensor. If you can see the examples, the library was imported successfully.

- At this point, you can plug the sensor into the I2C port of the FireBeetle ESP32-C5 expansion board, upload an example sketch, and verify the library is working correctly.

5. Next, install the DFRobot_GDL library. The verification process is the same.

Program Files

- figure.h: Pixel data header file

- screenInteractiveEye.ino: Main program file (.ino)

Program Functionality

- When the distance measured by the matrix sensor is < 30 mm: The character is in a "closed eyes" state.

- When the measured distance is between 30 and 1500 mm: The character's eyes are in a "tracking" state.

- When the measured distance is > 1500 mm: The character is in a "blinking" state.

STEP 3. Scene Adaptation and Notes

1. Rapid Blinking Issue: If you observe rapid blinking during use, it may be because an obstacle sits right at the 1500 mm threshold. Sensor readings fluctuate around this value, causing the state to switch rapidly between "blinking" and "tracking". This is normal behavior.

- Solution: Use the device in a more open area, or make sure obstacles are clearly inside or outside the 1500 mm range.

2. Poor Tracking at a Distance: When interacting from far away, remember the sensor is an 8x8 pixel array. At greater distances, the physical space covered by each pixel "gap" becomes larger. If you use a small object (like your hand) for tracking, you may experience poor tracking performance.

- Solution: Use a larger object for interaction at a distance.

STEP 4. Implementation Principle

Note: The following image assets were generated using AI tools for demonstration purposes.

The full-face image acts as the base canvas. When the program is in the "blinking" state, the eyebrows and eyes are continuously updated by refreshing local pixel data on top of this base canvas, which creates the blink animation.

The eye movement is primarily achieved by continuously erasing and re-writing on a "white of the eye" (sclera) canvas. This way, other pixels in the image do not need to change; we only need to redraw within the eye-white area. By establishing a coordinate system within this eye-white area, we only need to know the target coordinates of the eyeball for each frame.

As you learn and try this project, you can use the assets we provide:

https://github.com/May-DFRobot/DFRobot/blob/master/DIY%20Smart%20Interactive%20Eyes.zip

Principle of Smooth Eye Movement: When the target coordinates change, the movement speed is proportional to the distance. The larger the difference |(x2-x1, y2-y1)| between the current eye position (x1, y1) and the target position (x2, y2), the faster the eyeball moves. Conversely, as it gets closer, it slows down. This achieves a smooth, natural motion.

Implementation Steps

1. Preparing Image Frame Assets

For this project, in addition to the hardware, you must prepare image assets. These assets need to be converted from video footage.

If you don't have suitable source material, you can use AI to generate an appropriate full-face image. With this image, you can use a tool like Kling AI to generate a dynamic video of the person closing their eyes: https://app.klingai.com/global/video-extend/new.

Once you have the video, you can use video editing tools (like Jianying/CapCut, Adobe Premiere Pro, etc.) to extract the continuous dynamic frames.

File Path: DIY Smart Interactive Eyes\Assets\Character Materials

2. Image Processing

Select a few keyframes from your video and crop/extract the key parts (e.g., eyeball, eyebrow, eye socket). I chose four keyframes for processing.

File Path: DIY Smart Interactive Eyes\Assets\Blink Animation

File Path: DIY Smart Interactive Eyes\Assets\Eyes

File Path: DIY Smart Interactive Eyes\Assets\figure.png

Note:

1. When cropping these image parts, it is crucial to ensure the pixel dimensions (width and height) of the (eyeball, eye socket, eyebrow) images are consistent. These parts change dynamically, and inconsistent pixel sizes will cause visual tearing or artifacts.

2. When cropping, you must record the top-left X-Y coordinates of each crop. You can use a cursor in an image editor to view these coordinates. These coordinates must be saved for use in the program.

Here is a list of the sizes and coordinates you need to save (the eyeball's center point is calculated, not recorded from the crop). (The table of sizes and coordinates is not reproduced in this text.)

3. Image Conversion

Next, you must convert the PNG images to the RGB565 format. This step requires a specific conversion tool.

You can use the tool and tutorial from Bilibili creator (爆辣小电匞) at this link: https://www.bilibili.com/video/BV1oe411176s/?vd_source=e791586b6b2a1e5dc8602bdf57db8b18

Convert all of the processed images. The converted results are the pixel files.

File Path: DIY Smart Interactive Eyes\Assets\Pixel File

In the provided code, I merged all these arrays into a single figure.h file. This figure.h file is generated from the figure.png (the base face) and the other cropped parts.

Note:

1. As mentioned in the "Implementation Principle," we use an "eye-white canvas." Because this is a canvas, its data must be variable (mutable). When creating the figure.h file, the 2D arrays for the left and right eye-whites must be defined as standard arrays. They cannot be declared with PROGMEM or const.

2. All other large image assets (like the base face, the eyeballs, etc.) should be declared with const and PROGMEM. This stores the large data in the program's flash memory, reducing the burden on RAM. I am using the ESP32-C5; if all data were placed in RAM, the compilation would fail due to insufficient memory.

Program Explanation

This section contains the key parameters for the eye cropping dimensions and coordinates. If you change the image assets, you must update the values here. (The parameter snippet is not reproduced in this text.)

Function Descriptions

This program implements an interactive eye display system based on a 3.5-inch screen and a laser matrix distance sensor. Its main function is to detect a user's position via the laser sensor and control a pair of virtual eyes on the screen to perform real-time tracking and expression changes.

- When the user is detected nearby, the eyes will gaze at the user.

- When the user moves far away, the eyes will blink.

- When the user is too close, the eyes will close. The program also achieves smooth eyeball movement and natural blinking animations, creating a vivid human-computer interaction experience.

CODE
void sensorModuleInit();                         // Initializes the laser ranging sensor

uint8_t coordinatop(uint8_t initMode);           // Processes the ranging data

void positionLimitation(uint8_t *x, uint8_t *y); // Limits the eyeball's movement range

void eyeSmoothMotion(uint8_t targetX, uint8_t targetY);  // Smooth eye motion function

void eyeWhiteRefresh(void);                      // Refreshes the white area of the eye socket (canvas content)

void onBaseDraweye(uint8_t initPlace);           // Draws the eye on the eye socket (canvas)

// Erases the eye from the eye socket
void erasePaint(int draweye_x, int draweye_y, uint8_t erasureFlag, uint16_t base[43][49], uint8_t oldEye[35][27]);

void eyeInteraction();                           // Main eye interaction logic function

void eyeballRefresh(void);                       // Eyeball tracking function

void eyeRefresh(uint8_t blink);                  // Eye refresh function (handles blinking/closing)

void setup()
{
  Serial.begin(115200);
  display.begin();
  display.setRotation(0);

  // Draw the full-face base image as the background canvas
  display.drawPIC(0, 0, 320, 480, (uint8_t *)figure);

  sensorModuleInit();  // Initialize the laser ranging sensor

  oldTime = millis();
}

void loop()
{
  interactFlag = coordinatop(0);  // Read and process the latest ranging data

  // Update the eye state only when the interaction flag changes
  if(interactFlag == 0 && oldInteractFlag != 0){
    eyeStates = 1;
  }else if(interactFlag == 1){
    eyeStates = 4;
  }else if(interactFlag == 2 && oldInteractFlag != 2){
    eyeStates = 2;
    blinkFlag = 1;
  }

  oldInteractFlag = interactFlag;

  // Trigger a blink every 3 seconds while in the blinking state
  newTime = millis();
  if(newTime - oldTime >= 3000){
    if(blinkFlag == 0 && eyeStates == 2){
      blinkFlag = 1;
    }
    oldTime = newTime;
  }

  eyeInteraction();  // Run the main eye interaction logic
}
Download: DIY Smart Interactive Eyes.zip (8.32 MB)
License: All Rights Reserved