Creepy Animatronic Head With an Offline Edge AI Face Detection Sensor

I was invited to give a talk about useless machines at the Buenos Aires Book Fair, and I wanted to bring something that would grab the attention of passersby. With only a few days to prepare, ordering parts online was out of the question—so I had to dig into my stash of old motors, boards, and miscellaneous leftovers. The result? An AI-powered animatronic head that uses computer vision to recognize people and responds with some creepy movements. Built entirely from spare parts, it's a weird mix of scrap engineering and artificial intelligence—and it definitely got some looks.

 

Supplies

 

Mannequin head

Doll eyes

Arduino Nano

Gravity: Offline Edge AI Gesture & Face Detection Sensor https://www.dfrobot.com/product-2914.html

1 relay module

2 servo motors

2 DVD drive motors (5 V)

1 3D printing filament spool

Synthetic hair


 

Circuit

 

The face detection module is connected to SDA, SCL, and the external power supply.

The head servo is connected to D10 and the external power supply.

The mouth servo is connected to D9 and the external power supply.

The relay is connected to D8 and the external power supply.

The eye motors are connected to the power supply GND and the relay NC terminal.

The power supply positive terminal is connected to the relay COM terminal.

The power supply positive terminal is also connected to the Nano VIN pin, and the negative terminal to the Nano GND pin.
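
To keep the wiring and the firmware in sync, here is a minimal setup sketch that declares the pin mapping above. The constant names are mine, not from the project's source, and the correct idle level for the relay depends on how you wired NC/COM, so treat it as a starting point:

#include <Servo.h>

const int MOUTH_SERVO_PIN = 9;   // mouth servo signal wire
const int HEAD_SERVO_PIN  = 10;  // head rotation servo signal wire
const int RELAY_PIN       = 8;   // relay input; the eye motors hang off COM/NC

Servo headServo;
Servo mouthServo;

void setup() {
  headServo.attach(HEAD_SERVO_PIN);
  mouthServo.attach(MOUTH_SERVO_PIN);
  pinMode(RELAY_PIN, OUTPUT);
  // With the motors on NC, energizing the relay actually opens the circuit;
  // pick the idle level that leaves the eyes still on your module.
  digitalWrite(RELAY_PIN, LOW);
}

void loop() {
  // movement logic goes here
}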
 

Face Detection Module


 

The animatronic head only reacts when someone is actually standing in front of it—and no, it's not using a basic PIR or ultrasonic distance sensor. Instead, it's powered by an AI-based person detection module. The best part? You don’t need to mess with machine learning training or any complicated setup. Just hook up two I2C wires, install a library, and you're good to go. The head reliably detects faces, and once a person is identified (with a light turning on for feedback), the same module can even recognize simple hand gestures like a thumbs-up or the "V" sign.
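
As an illustration of how little code the module needs, here is a minimal reading sketch. It assumes the DFRobot_GestureFaceDetection library's I2C class and its getFaceNumber()/getGestureType() calls as shown in the library's examples, and 0x72 as the module's default address; double-check both against the repo before flashing:

#include <Wire.h>
#include "DFRobot_GestureFaceDetection.h"

// 0x72 is the module's default I2C address per DFRobot's docs (verify for your unit)
DFRobot_GestureFaceDetection_I2C gfd(0x72);

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);  // SDA/SCL are the only data wires needed
}

void loop() {
  uint16_t faces = gfd.getFaceNumber();      // how many faces are in view
  if (faces > 0) {
    Serial.print("Faces: ");
    Serial.println(faces);
    uint16_t gesture = gfd.getGestureType(); // nonzero when a known gesture is shown
    if (gesture != 0) {
      Serial.print("Gesture code: ");
      Serial.println(gesture);
    }
  }
  delay(200);
}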


 

Software


 

Download the .ino source code: https://github.com/ronibandini/AnimatronicHead/tree/main/v2

Install the face detection module library: https://github.com/DFRobot/DFRobot_GestureFaceDetection

Upload the code to the Arduino Nano.
 

3D parts
 

I designed three parts: the servo mount for the filament spool, the board supports, and the cover for the AI face detection module. Print them in PLA.


 

How does it work?

 


 

The head stays perfectly still until it detects a person in front of it. At that moment, it comes to life with random movements—turning its head left or right at varying speeds, and occasionally triggering mouth motion. If you make a “V” gesture in front of the sensor, the eyes respond with a sudden, fast movement.
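
The reaction logic boils down to a simple loop: poll the sensor, and when a face appears, pick a random head angle and speed, sometimes trigger the mouth, and pulse the relay on a "V" gesture. The sketch below is my simplified reconstruction of that idea, not the exact v2 code; in particular, the gesture code for "V" is a placeholder you should look up in the library's documentation:

#include <Servo.h>
#include <Wire.h>
#include "DFRobot_GestureFaceDetection.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);
Servo headServo;
Servo mouthServo;

const int RELAY_PIN = 8;
const uint16_t GESTURE_V = 5;  // placeholder; check the library docs for the real code

void setup() {
  headServo.attach(10);
  mouthServo.attach(9);
  pinMode(RELAY_PIN, OUTPUT);
  gfd.begin(&Wire);
  randomSeed(analogRead(A0));  // floating pin as a rough entropy source
}

void loop() {
  if (gfd.getFaceNumber() > 0) {
    // Sweep the head to a random angle at a random speed
    int target = random(30, 151);
    int stepDelay = random(5, 25);  // larger delay = slower, creepier turn
    int current = headServo.read();
    while (current != target) {
      current += (target > current) ? 1 : -1;
      headServo.write(current);
      delay(stepDelay);
    }
    // Occasionally open and close the mouth
    if (random(0, 3) == 0) {
      mouthServo.write(60);
      delay(300);
      mouthServo.write(20);
    }
    // A "V" gesture jolts the eye motors through the relay
    if (gfd.getGestureType() == GESTURE_V) {
      digitalWrite(RELAY_PIN, HIGH);
      delay(400);
      digitalWrite(RELAY_PIN, LOW);
    }
  }
  delay(100);
}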

Since this was built specifically for a talk, I chose not to include voice output. That said, adding synced mouth movement with audio is totally doable using an MP3 prompt module or a DFPlayer Mini.
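
If you do want audio, a rough sketch of the DFPlayer Mini route looks like this. The DFRobotDFPlayerMini library's begin(), volume(), and play() calls are its standard API; the SoftwareSerial pins (11/12) and the mouth-flapping timing are my own choices, so adjust them to your build:

#include <Servo.h>
#include <SoftwareSerial.h>
#include <DFRobotDFPlayerMini.h>

Servo mouthServo;
SoftwareSerial mp3Serial(11, 12);  // RX, TX; any two free pins work
DFRobotDFPlayerMini player;

void setup() {
  mouthServo.attach(9);
  mp3Serial.begin(9600);
  if (player.begin(mp3Serial)) {
    player.volume(20);             // volume range is 0-30
  }
}

void speak() {
  player.play(1);                  // plays 0001.mp3 from the SD card
  // Flap the mouth while the clip plays; tune the count to the clip length
  for (int i = 0; i < 8; i++) {
    mouthServo.write(60);
    delay(150);
    mouthServo.write(20);
    delay(150);
  }
}

void loop() {
  speak();
  delay(10000);  // repeat every ten seconds for demo purposes
}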

Side note: I forgot to turn the head off at home, and its slow, unprompted rotation ended up creeping out a few people.

 

License

All Rights Reserved