AI Visual Recognition Guides and Projects

Welcome to our guide on AI visual recognition. This collection of project tutorials spans a wide range of applications, from whimsical builds that bring our favorite fictional universes to life to practical solutions for everyday challenges. You'll find tutorials on creating interactive games, building autonomous robots, and developing intelligent IoT devices, among others. Each project harnesses the power of AI to recognize, analyze, and respond to visual data in innovative ways. Let's explore the exciting world of AI visual recognition together!

Project 1. Create a Real-Life Sorting Hat from Harry Potter with the Power of HUSKYLENS

Introduction: Welcome to the magical world of Hogwarts, brought to life with HUSKYLENS and micro:bit technology. This project, showcased at BETT 2023 London, uses HUSKYLENS' face recognition algorithm to identify faces and assign them to one of the four legendary Hogwarts houses. The micro:bit illuminates an LED ring with a color corresponding to the assigned house. If an unfamiliar face is detected, the Sorting Hat randomly assigns it to a house, ensuring a unique and exciting experience each time. The project includes a detailed guide on setting up the hardware and programming the micro:bit.
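The original project is programmed on the micro:bit itself, but the sorting logic is easy to sketch. Below is a minimal, illustrative Python version, assuming HuskyLens reports a numeric ID for each learned face; the ID-to-house mapping and LED colors are placeholders, not values from the original build.

```python
import random

# Hypothetical mapping of HuskyLens face IDs to Hogwarts houses;
# the real project learns these IDs on the HuskyLens itself.
HOUSE_BY_FACE_ID = {1: "Gryffindor", 2: "Slytherin", 3: "Ravenclaw", 4: "Hufflepuff"}

# LED ring colors (R, G, B) shown for each house.
HOUSE_COLORS = {
    "Gryffindor": (255, 0, 0),
    "Slytherin": (0, 255, 0),
    "Ravenclaw": (0, 0, 255),
    "Hufflepuff": (255, 255, 0),
}

def sort_face(face_id):
    """Return (house, color) for a recognized face ID; unfamiliar faces get a random house."""
    house = HOUSE_BY_FACE_ID.get(face_id)
    if house is None:
        house = random.choice(list(HOUSE_COLORS))
    return house, HOUSE_COLORS[house]

print(sort_face(2))    # known face  -> Slytherin, green
print(sort_face(99))   # unknown face -> random house, new result each time
```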

→Learn More

 

 

 

Project 2. Stardew Valley Villager Recognition and Gift Preferences Bot w/ OpenCV

Introduction: Enhance your Stardew Valley gaming experience with this innovative project that uses computer vision to recognize villagers and display their gift preferences. Utilizing a LattePanda Alpha 864s, a high-performance single-board computer with an embedded Arduino Leonardo, the project employs OpenCV template matching for villager recognition. The detected villager's information, including birthdays and gift preferences, is displayed on a compact 3.5" TFT LCD touch screen.
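For readers curious about the recognition step, here is a minimal sketch of OpenCV template matching, assuming a captured game frame and per-villager portrait templates saved to disk; the file names and the 0.8 confidence threshold are placeholders rather than values from the original project.

```python
import cv2

# Assumed files: a captured game frame and small portrait templates, one per villager.
frame = cv2.imread("screenshot.png", cv2.IMREAD_GRAYSCALE)
templates = {"Abigail": "abigail.png", "Sebastian": "sebastian.png"}

best_name, best_score = None, 0.0
for name, path in templates.items():
    template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, _ = cv2.minMaxLoc(result)  # best match score for this template
    if score > best_score:
        best_name, best_score = name, score

if best_score > 0.8:  # placeholder confidence threshold
    print(f"Detected villager: {best_name} ({best_score:.2f})")
else:
    print("No villager recognized in this frame")
```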

→Learn More

 

 

 

Project 3. DIY a Pocket Monsters Discriminator

Introduction: Embark on a journey with the DIY Pocket Monsters Discriminator, a project that utilizes the object classification function of HuskyLens. This project uses a micro:bit for wireless transmission and the real-time mode of Mind+ for simulation. HuskyLens learns to identify different pocket monsters from images found online. The program logic is simple: when HuskyLens identifies different ID numbers, it sends corresponding numbers to the micro:bit via wireless communication, and the receiver displays different images based on the received data.
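The original is built with Mind+ blocks, but the receiver logic translates naturally to micro:bit MicroPython. Below is a minimal sketch, assuming the transmitting micro:bit sends the HuskyLens ID as a plain string over the radio; the ID-to-image mapping is illustrative.

```python
# MicroPython for the receiving micro:bit (the original project uses Mind+ blocks).
from microbit import display, Image
import radio

radio.on()
radio.config(group=1)  # both micro:bits must use the same radio group

# Hypothetical mapping of received HuskyLens IDs to images to show.
IMAGES = {"1": Image.HAPPY, "2": Image.GHOST, "3": Image.DUCK}

while True:
    message = radio.receive()          # returns None if nothing was received
    if message in IMAGES:
        display.show(IMAGES[message])  # show the image for this monster ID
```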

→Learn More

 

 

 

Project 4. Wireless UNIHIKER, Remote Control a Face Tracking Mecanum Wheeled Platform

Introduction: Experience the magic of IoT prototyping with the UNIHIKER Board and Mecanum Wheeled platform. This project allows remote control of a mobile platform through network-based facial recognition. The UNIHIKER Board, equipped with a touchscreen, Wi-Fi, Bluetooth, and various sensors, interfaces with external devices through a co-processor. The Mecanum Wheel, with its unique design, enables movement in any direction without rotating the chassis. The project also utilizes OpenCV for image processing and computer vision algorithms.
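As a rough illustration of the face-tracking loop, here is a minimal OpenCV sketch that turns the position of a detected face into drive commands for the platform. The send_drive_command() helper is hypothetical and simply prints; the original project's command transport and control details may differ.

```python
import cv2

def send_drive_command(direction):
    # Placeholder: the real project forwards commands to the Mecanum platform
    # (e.g. over the network or the UNIHIKER's co-processor); here we just print them.
    print("drive:", direction)

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        offset = (x + w / 2) - frame.shape[1] / 2  # face center vs. frame center
        if offset < -40:
            send_drive_command("strafe_left")   # Mecanum wheels allow sideways motion
        elif offset > 40:
            send_drive_command("strafe_right")
        else:
            send_drive_command("forward")
    else:
        send_drive_command("stop")
```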

→Learn More

 

 

 

Project 5. DIY Line Tracking Robot with HuskyLens and Romeo

Introduction: Discover the power of line tracking with HuskyLens and the Devastator tank mobile robot. This project demonstrates how HuskyLens' line-tracking function can control the robot's line-following behavior: HuskyLens learns and remembers the features of a line, eliminating the need for constant parameter adjustments. The project includes a detailed guide on assembling the Devastator tank, wiring, testing the robot, installing HuskyLens, and setting up I2C communication between the ROMEO board and HuskyLens.
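The original firmware runs in C/C++ on the ROMEO board and reads the line "arrow" from HuskyLens over I2C; the Python sketch below only illustrates the steering idea: the arrow's x position feeds a proportional correction into the two track speeds. The gain and speed constants are placeholders to be tuned.

```python
# Illustrative steering logic only; the real project runs on the ROMEO board in C/C++.

FRAME_CENTER_X = 160   # HuskyLens frames are 320 px wide
BASE_SPEED = 120       # nominal track speed (0-255)
KP = 0.8               # proportional gain, tuned by experiment

def track_speeds(arrow_x_target):
    """Map the arrow's target x position to left/right track speeds."""
    error = arrow_x_target - FRAME_CENTER_X        # > 0 means the line bends right
    correction = KP * error
    left = max(0, min(255, BASE_SPEED + correction))
    right = max(0, min(255, BASE_SPEED - correction))
    return left, right

print(track_speeds(160))  # line straight ahead -> equal speeds
print(track_speeds(220))  # line to the right   -> speed up the left track
```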

→Learn More

 

 

 

Project 6. Line Follower using Image Processing

Introduction: This tutorial guides you through the process of building a line-following robot using a camera. The hardware components include an Arduino Nano, an L293D motor driver, a robot chassis, two 100 RPM BO motors, a 12 V/9 V battery, and jumper cables. The required software is the Arduino IDE and the HuskyLens library.

→Learn More

 

 

 

Project 7. How to Make a Fruit Classification Project with UNIHIKER

Introduction: This tutorial demonstrates how to use the UNIHIKER device for a fruit classification project. The process involves collecting images of different fruits, training a model with these images, and using the model for fruit classification. The results are displayed using LED lights.
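The tutorial's own training flow runs on the UNIHIKER; as a self-contained illustration of the same idea, here is a minimal Python sketch that classifies fruits by nearest-neighbour matching of HSV color histograms. The dataset/ folder layout and the test image name are assumptions, not part of the original tutorial.

```python
import os, glob
import cv2
import numpy as np

def color_histogram(path):
    """Flattened HSV color histogram used as a simple feature vector."""
    img = cv2.imread(path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

# Assumed layout: dataset/<fruit_name>/*.jpg collected with the camera.
features, labels = [], []
for folder in glob.glob("dataset/*"):
    for image_path in glob.glob(os.path.join(folder, "*.jpg")):
        features.append(color_histogram(image_path))
        labels.append(os.path.basename(folder))

def classify(path):
    """Nearest-neighbour lookup against the collected histograms."""
    query = color_histogram(path)
    distances = [np.linalg.norm(query - f) for f in features]
    return labels[int(np.argmin(distances))]

print(classify("test_apple.jpg"))  # e.g. "apple"
```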

→Learn More

 

 

 

Project 8. Vegetables and Fruits Ripeness Detection by Color w/ TensorFlow

Introduction: This project involves developing a device that uses an artificial neural network to detect the ripening stages of fruits and vegetables by spectral color. The device uses an AS7341 visible light sensor and an Arduino Nano 33 IoT to collect spectral color data. The data is then used to train a TensorFlow model, which can predict ripening stages. The project includes a web application for data collection and model testing.
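To make the model side concrete, here is a minimal TensorFlow/Keras sketch of a small classifier over spectral readings. The input width, layer sizes, and random placeholder data are illustrative; the original project's architecture and collected dataset may differ.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: each sample is a vector of AS7341 spectral channel readings,
# labelled 0 = early, 1 = mature, 2 = late ripening stage.
X = np.random.rand(300, 10).astype("float32")
y = np.random.randint(0, 3, size=300)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # three ripening stages
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=16, verbose=0)

sample = np.random.rand(1, 10).astype("float32")
print("Predicted ripening stage:", int(np.argmax(model.predict(sample, verbose=0))))
```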

→Learn More

 

 

 

Project 9. Robot Arm controlled by HuskyLens

Introduction: This project involves controlling a robot arm using the face recognition capabilities of HuskyLens. The HuskyLens device can learn and identify multiple faces, with each face assigned a unique ID. The center coordinates of three recognized faces are transmitted to an Arduino, which calculates the angle from these points and moves the servo accordingly.
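The article does not spell out the exact formula, so the sketch below shows one plausible interpretation in Python: the servo angle is taken from the angle formed at the middle face by the three face centers, clamped to the servo's range. Treat the geometry and the example coordinates as assumptions, not the project's actual math.

```python
import math

def angle_from_faces(p1, p2, p3):
    """Angle (degrees) at p2 formed by the segments p2->p1 and p2->p3."""
    a1 = math.atan2(p1[1] - p2[1], p1[0] - p2[0])
    a2 = math.atan2(p3[1] - p2[1], p3[0] - p2[0])
    angle = math.degrees(abs(a1 - a2))
    return angle if angle <= 180 else 360 - angle

# Example face-center coordinates as HuskyLens might report them (x, y in pixels).
faces = [(60, 120), (160, 100), (250, 140)]
servo_angle = max(0, min(180, angle_from_faces(*faces)))  # clamp to the servo range
print(f"Servo angle: {servo_angle:.1f} degrees")
```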

→Learn More

 

 

 

Project 10. AI-driven IoT 3D Printer Motion & Status Tracker w/ Telegram

Introduction: This project involves creating a device to monitor the movements of a 3D printer and detect potential malfunctions. The device uses a HuskyLens AI camera to track the movements of the printer's X, Y, and Z axes. The detected movements are processed by a Raspberry Pi Pico and transferred via a WIZnet Ethernet HAT. A Telegram bot is used to monitor the printer movements and receive notifications of any malfunctions.
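The notification step can be sketched with the Telegram Bot API's sendMessage method. The example below uses plain Python and the requests library for clarity; the real project runs on the Raspberry Pi Pico through the WIZnet Ethernet HAT, and the bot token and chat ID are placeholders.

```python
import requests

BOT_TOKEN = "123456:ABC-your-bot-token"   # placeholder, issued by @BotFather
CHAT_ID = "123456789"                      # placeholder chat to notify

def notify(text):
    """Send a status or malfunction alert through the Telegram Bot API."""
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    response = requests.post(url, data={"chat_id": CHAT_ID, "text": text})
    response.raise_for_status()

# Example: called when the tracked axis motion deviates from the expected pattern.
notify("3D printer alert: unexpected X-axis movement detected")
```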

→Learn More

 

 

 

Project 11. Autonomous (LIDAR) Litter Detection Robot w/ Edge Impulse

Introduction: This project presents an autonomous robot that uses object detection to track and monitor litter, aiming to combat environmental pollution. The robot uses a neural network model developed with Edge Impulse to detect litter in three categories: bottles, cans, and packaging. The model runs on a Raspberry Pi, and the robot is equipped with a 360-degree laser scanner and a USB webcam. The robot also features a fall detection system built with an Arduino Nano and a 6-axis accelerometer.
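The litter-detection model itself is trained in Edge Impulse, so it is not reproduced here; the fall-detection idea, however, is simple enough to sketch. The Python example below flags a fall when the acceleration magnitude departs strongly from 1 g. The threshold is a placeholder, and the original runs on the Arduino Nano rather than in Python.

```python
import math

GRAVITY = 9.81          # m/s^2
FALL_THRESHOLD = 6.0    # deviation from 1 g treated as a tip-over (placeholder value)

def is_falling(ax, ay, az):
    """Flag a fall when the acceleration magnitude departs strongly from 1 g."""
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    return abs(magnitude - GRAVITY) > FALL_THRESHOLD

print(is_falling(0.1, 0.2, 9.7))   # robot upright    -> False
print(is_falling(0.3, 0.1, 0.4))   # near free fall   -> True
```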

→Learn More

 

 

 

Project 12. FlowGuard: Real-time Water Management with IoT and AI

Introduction: FlowGuard is a project aimed at revolutionizing water usage monitoring and preventing wastage. It uses advanced sensors and algorithms to provide real-time water usage data and identify potential wastage areas. The system sends instant notifications when anomalies or excessive water usage are detected. It uses a water flow sensor, a FireBeetle microcontroller, and a HuskyLens for human presence detection. When excessive water usage is detected, an alert is triggered, and administrators are notified via email.
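A rough sketch of the alerting logic: convert pulse counts from the flow sensor into litres per minute, then email administrators when the rate exceeds a limit. The calibration factor, threshold, addresses, and SMTP server below are placeholders, and the original project runs on the FireBeetle rather than desktop Python.

```python
import smtplib
from email.message import EmailMessage

PULSES_PER_LITRE = 450          # placeholder calibration for a typical hall-effect sensor
FLOW_LIMIT_L_PER_MIN = 10.0     # placeholder anomaly threshold

def flow_rate(pulse_count, seconds):
    """Convert pulses counted over a window into litres per minute."""
    litres = pulse_count / PULSES_PER_LITRE
    return litres * 60.0 / seconds

def email_alert(rate):
    msg = EmailMessage()
    msg["Subject"] = "FlowGuard: excessive water usage detected"
    msg["From"] = "flowguard@example.com"       # placeholder addresses
    msg["To"] = "admin@example.com"
    msg.set_content(f"Measured flow rate: {rate:.1f} L/min")
    with smtplib.SMTP("smtp.example.com", 587) as server:  # placeholder SMTP server
        server.starttls()
        server.send_message(msg)

rate = flow_rate(pulse_count=5200, seconds=60)
if rate > FLOW_LIMIT_L_PER_MIN:
    email_alert(rate)
```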

→Learn More

 

 

 

Project 13. Immersive Shooting Game w/ ML Facial Recognition

Introduction: This project transforms a controller into a game device for airsoft games. It uses facial recognition to identify targets, providing an immersive gaming experience. The controller features a camera module for target detection and offers two game modes: IPSC and Unlimited. The design is also immersive in that players need to reload the magazine like a real firearm.
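As an illustration of the hit-detection and reload mechanics, here is a minimal Python sketch that tests whether a fixed crosshair point falls inside a detected face bounding box and tracks the remaining rounds. The frame size, magazine capacity, and box coordinates are illustrative assumptions.

```python
MAG_CAPACITY = 10
rounds_left = MAG_CAPACITY
CROSSHAIR = (160, 120)   # assumed aim point at the center of a 320x240 frame

def hit(face_box, aim=CROSSHAIR):
    """True when the aim point lies inside a detected face bounding box (x, y, w, h)."""
    x, y, w, h = face_box
    return x <= aim[0] <= x + w and y <= aim[1] <= y + h

def pull_trigger(face_box):
    global rounds_left
    if rounds_left == 0:
        return "reload"              # player must swap the magazine, as with a real firearm
    rounds_left -= 1
    return "hit" if hit(face_box) else "miss"

print(pull_trigger((140, 100, 60, 60)))  # face under the crosshair -> "hit"
print(pull_trigger((10, 10, 40, 40)))    # face off to the side     -> "miss"
```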

→Learn More

 

 

 

Project 14. Ping Pong Balls Collector - HuskyLens AI Robotic Ball Picker

Introduction: This project presents an autonomous AI ball-picking robot that can recognize, track, and pick up ping pong balls. The robot uses a HuskyLens vision sensor for ball recognition and an ultrasonic sensor for obstacle avoidance. The robot is controlled by a micro:bit gamepad remote-control handle. When a ball is not detected, the robot will adjust its angle or move a certain distance until a ball is recognized.
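The search-and-approach behaviour described above can be sketched as a simple control loop. In the Python example below, read_ball() and drive() are hypothetical stand-ins for the HuskyLens reading and the motor commands; the thresholds are illustrative.

```python
import random

def read_ball():
    """Placeholder for the HuskyLens reading: returns the ball's x position or None."""
    return random.choice([None, 80, 160, 240])

def drive(command):
    print("drive:", command)   # placeholder for the real motor control

FRAME_CENTER_X = 160

for _ in range(10):            # one iteration per control cycle
    ball_x = read_ball()
    if ball_x is None:
        drive("rotate_search")             # no ball: turn in place until one appears
    elif abs(ball_x - FRAME_CENTER_X) > 30:
        drive("turn_left" if ball_x < FRAME_CENTER_X else "turn_right")
    else:
        drive("forward_and_pick")          # ball centered: approach and pick it up
```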

→Learn More

 

 

Interested in exploring the world of Large Language Models through hardware? The Black Friday sale at DFRobot is in full swing!

 

DFRobot Black Friday Sales

 

From November 21st to 27th, a wide range of AI-related products and kits are available at incredible discounts, including single-board computers, AI voice recognition sensors, and AI cameras. Act now and embark on an exciting journey into the world of AI!

License: All Rights Reserved