UIUC ME 461: Cat-and-Mouse


The balancing “SegBot” (mouse) receives commands wirelessly from LabVIEW as it tries to escape the deadly claws of the all-seeing robot cat.


Things used in this project

Hardware components

Texas Instruments LAUNCHXL-F28379D C2000 Delfino LaunchPad
DFRobot FireBeetle ESP32 IOT Microcontroller (Supports Wi-Fi & Bluetooth)
Ultrasonic Sensor - HC-SR04 (Generic)

Hand tools and fabrication machines

Soldering iron (generic)

Story

 

Introduction

 

Inspired by the Tom and Jerry cartoon, the goal of this project was to build a cat-and-mouse game in which a cat robot chases a mouse robot, controlled wirelessly through LabVIEW. Both the cat (robot car) and mouse (SegBot) build upon the kit and instructions provided by Dan Block, the ME 461 instructor at the University of Illinois Urbana-Champaign.

 

To facilitate the chasing, both robots send their current position (x and y relative to the origin) to LabVIEW. The current position is calculated from the wheel-position measurements of the quadrature encoders located next to each motor. In addition to the quadrature encoders, the cat can also update its current position with its front ultrasonic sensor by measuring the distance to two orthogonal walls. With the two robots' positions, LabVIEW plots the robots on a grid map representing the arena. LabVIEW also sends the mouse's coordinates to the cat and the user's commands to the mouse.
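The encoder-based position update described above amounts to differential-drive dead reckoning. The sketch below illustrates the idea; the wheel radius and wheelbase values are placeholders, not the actual ME 461 kit dimensions.

```cpp
#include <cmath>

// Placeholder geometry -- the real values come from the ME 461 robot kit.
const float WHEEL_RADIUS_FT = 0.19f;   // wheel radius in feet (assumption)
const float WHEEL_BASE_FT   = 0.6f;    // distance between wheels in feet (assumption)

struct Pose { float x, y, heading; };  // position in ft, heading in rad

// Dead-reckoning update from the change in left/right wheel angles (rad),
// as measured by the quadrature encoders next to each motor.
void updatePose(Pose& p, float dThetaLeft, float dThetaRight) {
    float dLeft   = WHEEL_RADIUS_FT * dThetaLeft;    // left wheel travel
    float dRight  = WHEEL_RADIUS_FT * dThetaRight;   // right wheel travel
    float dCenter = 0.5f * (dLeft + dRight);         // travel of the robot center
    p.heading += (dRight - dLeft) / WHEEL_BASE_FT;   // heading change from wheel difference
    p.x += dCenter * std::cos(p.heading);            // project travel onto x
    p.y += dCenter * std::sin(p.heading);            // project travel onto y
}
```

Because each step only accumulates increments, small encoder errors build up over time, which is why the cat also corrects its position with the ultrasonic sensor.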


Robot Communication

 

An essential part of this project was the communication between the F28379D board and LabVIEW. To facilitate it, each robot's F28379D board is connected to an ESP32 board, which provides Wi-Fi. The F28379D and ESP32 boards communicate with each other via the UART protocol. Shown below are the ESP32 pinout reference and the wiring diagram of the ESP32 board. The SCITXDD and SCIRXDD pins correspond to Serial D of the F28379D board.

[Image: ESP32 pinout reference]
[Image: ESP32 wiring diagram]

Next, the ESP32 runs Arduino code that connects to Wi-Fi and starts a TCP server for LabVIEW to connect to. The TCP server allows the ESP32 to echo data from the F28379D board to LabVIEW and vice versa. Shown below is a diagram illustrating the flow of data between the F28379D, ESP32, and LabVIEW.

[Image: data flow between the F28379D, ESP32, and LabVIEW]
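The echo behavior in the diagram is just a bidirectional byte pump. In the real firmware the two links are the ESP32's UART (`Serial`) and a `WiFiClient`; in this illustrative sketch they are modeled as plain byte queues so the forwarding logic can be shown on its own.

```cpp
#include <deque>
#include <cstdint>

// The real firmware reads the UART to the F28379D and a TCP socket to
// LabVIEW; here both links are modeled as byte queues for illustration.
using ByteQueue = std::deque<uint8_t>;

// Move every pending byte from src to dst (one direction of the echo).
void pump(ByteQueue& src, ByteQueue& dst) {
    while (!src.empty()) {
        dst.push_back(src.front());
        src.pop_front();
    }
}

// One pass of the ESP32 main loop: echo UART -> TCP and TCP -> UART.
void echoOnce(ByteQueue& uartRx, ByteQueue& tcpTx,
              ByteQueue& tcpRx, ByteQueue& uartTx) {
    pump(uartRx, tcpTx);  // data from the F28379D goes out to LabVIEW
    pump(tcpRx, uartTx);  // commands from LabVIEW go down to the F28379D
}
```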

LabVIEW

[Images: the three sections of the LabVIEW VI]

There are three sections in the LabVIEW VI, shown above. In the first section, two sets of grayed-out text boxes show the messages received from the ESP32. The green LED enables (On) or disables (Off) sending the mouse's position to the cat. The Update button tells the cat to update its current position with the ultrasonic sensor. The Chase button commands the cat to move to the mouse's position and try to catch it.

 

The second section is the location map, which tracks the positions of the mouse (green circle) and cat (red circle). The third section handles the user's commands for controlling the mouse. The ‘W’ and ‘S’ keys move the mouse forward and backward, while the ‘A’ and ‘D’ keys turn it left and right. Each key press increases the speed in that direction, and the ‘R’ key resets the speed back to zero.

Ultrasonic Sensor

[Image: HC-SR04 ultrasonic sensor]

For distance measurement, the HC-SR04 ultrasonic sensor was chosen, since its usable range is between 2 cm and 400 cm. The sensor has a simple 4-pin layout: power supply, trigger, echo, and ground. A pulse on the trigger pin emits the ultrasonic burst, and the echo pin reports the round-trip time to the connected device.

 

Mouse Chase

 

After communication was established, the cat needed a way to get the position of the mouse. The mouse is controlled via the LabVIEW interface and balances itself upright as it moves between locations. To ensure that the balancing motion does not push the mouse away from its commanded location, a velocity controller was added to regulate the tilt control.
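The idea of the added velocity loop can be sketched as a cascaded controller: an outer velocity loop nudges the tilt setpoint toward zero net motion. The gains and PI structure here are illustrative assumptions, not the course's actual controller.

```cpp
// Sketch of a cascaded control idea: the outer velocity loop commands a
// small tilt angle so the balancing motion does not make the robot wander.
// Gains and structure are illustrative assumptions.
float velocityToTiltSetpoint(float velDesired, float velMeasured,
                             float& velErrInt, float dt) {
    const float Kp = 0.05f, Ki = 0.01f;   // assumed PI gains
    float err = velDesired - velMeasured; // velocity error
    velErrInt += err * dt;                // integrate the error
    return Kp * err + Ki * velErrInt;     // small tilt-angle command (rad)
}
```

With zero desired velocity, any drift in measured velocity produces a small counter-tilt, which is what keeps the SegBot parked in place.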

 

The mouse's starting position is (1 ft, 1 ft) from the origin. From there, its current position is calculated from the angular position of the wheels, which drifts after the robot travels long distances.

The cat can start at any arbitrary position on the field. The only constraint is that it must be placed facing the x-axis direction, to ensure a fixed starting angle. Before the chase begins, the cat needs to determine its position relative to the origin. To do so, it turns -90°, measures the distance to the field wall, and then repeats this once more. The ultrasonic sensor provides reliable measurements, with a deviation of around 1 cm depending on the reflecting surface. Once its position is measured, the cat starts chasing as soon as the chase command is received from LabVIEW.


With the mouse's position and its own position, the cat calculates the angle it needs to rotate to face the mouse. After turning to the desired angle, it moves toward the mouse's position and stops shortly before reaching it. It then searches for the mouse with the ultrasonic sensor by turning left and right. If the mouse is found, the chase is over; otherwise, the cat waits for the next chase command.
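The required rotation is the bearing to the mouse minus the cat's current heading, wrapped so the cat always takes the shorter turn. A sketch of that calculation:

```cpp
#include <cmath>

const float PI_F = 3.14159265f;

// Wrap an angle to (-pi, pi] so the robot takes the shorter turn.
float wrapAngle(float a) {
    while (a >  PI_F) a -= 2.0f * PI_F;
    while (a <= -PI_F) a += 2.0f * PI_F;
    return a;
}

// Rotation the cat must perform to face the mouse, given both positions
// and the cat's current heading (all angles in radians).
float turnToMouse(float catX, float catY, float catHeading,
                  float mouseX, float mouseY) {
    float bearing = std::atan2(mouseY - catY, mouseX - catX);
    return wrapAngle(bearing - catHeading);
}
```

For a cat at the origin facing +x and a mouse at (1, 1), the required turn is +45° (π/4).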

 

The rotation angle of the cat is determined by Kalman filtering of the combined gyro and wheel angular-position measurements. If the cat is stationary, the yaw angle is determined only by the encoders, to avoid the gyroscope's drift error. During a turn, the yaw angle (φ) is measured mainly by the gyroscope. After a few turns, however, the yaw angle drifts away from the actual angle. Small deviations can be neglected, but if the difference between the true and the calculated heading gets too large, it becomes necessary to reset the game.

 

To compensate for the gyroscope's rotation error, the compass of the MPU-9250 could also be used, but due to difficulties calibrating the compass, this idea was discarded. Another way to develop the project further would be to add a camera and a visual search algorithm: while driving toward the mouse, the chaser could use it to correct small position errors.

 

Below are videos of our robot working.

Here is a video explaining the wiring of the ESP32 and the ultrasonic sensor.

Code

Final_Project_Code.zip (1.97 MB)

This article was first published on Hackster, December 17, 2021.

Source: https://www.hackster.io/440492/uiuc-me-461-cat-and-mouse-81c942

Authors: Jin Wu, Rinney, Alp Gundes

License
All Rights Reserved