AI Powered Banana Ripeness Monitoring

Bananas are typically harvested green and undergo a lengthy transportation process, often crossing oceans, before reaching grocery store shelves. Once "activated" (ripened), they are distributed to ensure optimal ripeness upon sale.

 

Consumers often assess fruit quality based on color. A desirable banana is golden yellow with minimal browning. Excessive browning or dark spots indicate overripeness or spoilage.

 

Artificial intelligence can accurately determine banana ripeness using a model trained on labeled images. However, deploying AI models on edge devices can pose practical and cost challenges, since many boards are over-specified for simple inference tasks. UNIHIKER offers a practical solution with sufficient computing power for efficient inference in a compact form factor. Its additional features, including flash memory, an onboard display, inputs, and sensors, provide added value compared to similar single-board computers.

The case problem addressed by this project is as follows:

(1) You work in a 24/7 convenience store that sells bananas.

(2) The store temperature is controlled by an air-conditioning system, so the ambient temperature and humidity inside the store are fairly stable throughout the day.

(3) Recently, banana sales have been declining, so overripe bananas must be removed from the display, and fresh ripe bananas need to be ordered in time so that the store always has bananas on display.

(4) As you work alone on a 12-hour shift, it is tough to monitor the ripeness level, especially during the busiest times of day.

(5) So you think of an AI assistant to "keep an eye" on these bananas: record when a banana enters the display and how many days it takes to become overripe, so that you can order a replacement in time.

Therefore, there is a need to build an AI gadget that helps detect banana ripeness.
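The monitoring logic above can be sketched as a small helper that records when a banana batch enters the display and estimates when to place a replacement order. The shelf-life and lead-time values are illustrative assumptions, not measurements:

```python
from datetime import date, timedelta

# Assumed days from "ripe on display" to "overripe" (illustrative).
SHELF_LIFE_DAYS = 4
# Assumed delivery lead time for a fresh order (illustrative).
LEAD_TIME_DAYS = 2

def reorder_date(entered: date,
                 shelf_life: int = SHELF_LIFE_DAYS,
                 lead_time: int = LEAD_TIME_DAYS) -> date:
    """Date to place an order so fresh bananas arrive as this batch overripens."""
    overripe_on = entered + timedelta(days=shelf_life)
    return overripe_on - timedelta(days=lead_time)

print(reorder_date(date(2024, 6, 1)))  # order two days before overripeness
```

In practice, the ripeness detector described below would refine these dates by observing the actual ripeness degree instead of relying on a fixed shelf life.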

 

Hardware:

(1) UNIHIKER; 

(2) USB Webcamera; 

(3) PC/Laptop for training 

 

Software:

(1) Python 3.11 on Miniconda running on the UNIHIKER; 

(2) Banana ripeness dataset from Kaggle; 

(3) Roboflow account to label and download the dataset to the PC in YOLO format; 

 

The project framework is illustrated above. Here are the steps taken to develop the solution:

STEP 1
Install Miniconda and Yolov8

Follow the steps written at https://unihiker.com/new-13932.html to get Miniconda installed. This is important because UNIHIKER currently ships with the older Python 3.7.3, which is not suitable for YOLOv8. Having experimented with the board before (as part of getting to know it), I ran into difficulties installing Ultralytics. However, after reflashing the UNIHIKER image (check here), the installation went smoothly. 
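Since Ultralytics YOLOv8 requires Python 3.8 or newer, a quick version check inside the new Miniconda environment can save a failed install. A minimal sketch:

```python
import sys

def check_python(min_version=(3, 8)) -> bool:
    """Return True if this interpreter is new enough for Ultralytics YOLOv8."""
    return sys.version_info >= min_version

# UNIHIKER's stock Python 3.7.3 fails this check; the Miniconda 3.11 env passes.
if not check_python():
    raise SystemExit("Python too old for YOLOv8 - activate the Miniconda env first")
print("Python version OK")
```

Run this before `pip install ultralytics` to confirm you are in the right environment.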

 

STEP 2
Develop the Dataset

The dataset development described in this step takes place entirely on the PC, not on the UNIHIKER. 

 

To streamline dataset creation, we can leverage a pre-existing banana ripeness dataset available on Kaggle; the link is provided in the reference section of this article. This pre-prepared dataset provides a solid foundation for training our banana ripeness detection model and eliminates the need for extensive manual data collection. 

 

To begin, download the dataset from Kaggle, create a Roboflow account, and set up a new project. Once the new Roboflow project is created, upload the dataset to it. 
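When Roboflow later exports the project in YOLOv8 format, the download should contain a data.yaml plus train/valid image and label folders. A small check like this (folder names assumed from a standard Roboflow export) can confirm the layout before training:

```python
from pathlib import Path

# Entries a standard Roboflow YOLOv8 export is expected to contain.
EXPECTED = ["data.yaml", "train/images", "train/labels",
            "valid/images", "valid/labels"]

def verify_yolo_dataset(root: str) -> list:
    """Return the expected dataset entries that are missing under `root`."""
    base = Path(root)
    return [rel for rel in EXPECTED if not (base / rel).exists()]

# The folder name below is hypothetical; use the path printed by your download.
missing = verify_yolo_dataset("banana_ripeness-1")
if missing:
    print("Missing:", ", ".join(missing))
```

An empty result means the dataset is ready for the training command in the next step.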

STEP 3
Model development and Training on PC

Now we are ready to develop a custom YOLO model on the PC. Create a Python file or Jupyter notebook and paste this code to download the dataset, structured correctly for YOLOv8 training. You may need to install Ultralytics on your PC using pip before running the script.

CODE
from roboflow import Roboflow
rf = Roboflow(api_key="kztwngveNCTsQVvy8cbM")
project = rf.workspace("unihiker").project("banana_ripeness-mlog2")
dataset = project.version(1).download("yolov8")

The dataset download time may vary. Once the download is complete, run the following command in the terminal/shell to begin training the model on your CPU (replace {dataset.location} with the folder path printed by the download script). This process may take some time to complete. 


CODE
yolo task=detect mode=train model=yolov8s.pt data={dataset.location}/data.yaml epochs=25 imgsz=320 plots=True

Once the training process is complete, locate the .pt file inside the runs/detect/train/weights folder. This file is your custom YOLO model.
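If you train more than once, Ultralytics writes weights into numbered folders (train, train2, ...). A small helper like this, assuming the default output layout, picks the most recently produced best.pt:

```python
from pathlib import Path

def latest_weights(runs_dir="runs/detect"):
    """Return the most recently modified best.pt under runs/detect, or None."""
    candidates = sorted(Path(runs_dir).glob("train*/weights/best.pt"),
                        key=lambda p: p.stat().st_mtime)
    return candidates[-1] if candidates else None

print(latest_weights())
```

This is the file to copy over to the UNIHIKER in the next step.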

STEP 4
Deploy custom model to UNIHIKER

Deploying the model on the UNIHIKER is a simple copy-paste of the custom YOLO model .pt file. Once it is copied to your UNIHIKER folder, create a Python file on the UNIHIKER for inference. This script loads the model only once and then starts or stops inference with a button press on the UNIHIKER, avoiding the time-consuming process of reloading the model for each inference run.

CODE
import cv2
import imutils
import time
from pinpong.board import Board, Pin
from ultralytics import YOLO

Board().begin()

# Button A triggers capture and inference; button B is read but unused here
btnA = Pin(Pin.P27, Pin.IN)
btnB = Pin(Pin.P28, Pin.IN)
# Load the custom model once at startup to avoid slow reloads per run
model = YOLO("yolov10/yolov8_banana_320.pt", task='detect')


def capture_frame():
    cv2.destroyAllWindows()
    video = cv2.VideoCapture('/dev/video0')
    video.set(cv2.CAP_PROP_BUFFERSIZE, 1)
    cv2.namedWindow('winname',cv2.WND_PROP_FULLSCREEN)
    cv2.setWindowProperty('winname', cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

    fps = 0
    frame_count = 0
    start_time = time.time()

    PATH = 'image'
    print('Begin capture...')

    while True:
        valA = btnA.read_digital()
        
        ret, frame = video.read()
        if not ret:
            break
        frame = imutils.resize(frame,320)


        frame_count += 1
        elapsed_time = time.time() - start_time
        if elapsed_time > 1.0:
            fps = frame_count / elapsed_time
            start_time = time.time()
            frame_count = 0

        fps_text = f"FPS: {fps:.2f}"
        cv2.putText(frame, fps_text, (10, 20), cv2.FONT_HERSHEY_SIMPLEX, 0.4, (255, 255, 255), 2)

        frame = imutils.rotate_bound(frame,270)    
        cv2.imshow("winname",frame)
        
        cv2.waitKey(1)  # keep the preview window refreshed
        if valA == 0:  # button A pressed again: save the frame and stop
            cv2.imwrite(f'{PATH}/output.png',frame)
            break
            
    video.release()
    
    print('Image captured, running inference...')
    results = model('image/output.png', conf=0.4)

    for result in results[0].boxes:
        print(result)
        x1, y1, x2, y2 = map(int, result.xyxy[0])
        confidence = result.conf[0]
        label = results[0].names[int(result.cls[0])]  # class index -> name
        
        # Draw bounding box and label on the frame
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        text = f'{label}: {confidence:.2f}'
        cv2.putText(frame, text, (x1, y1 - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)
    
    print('Prediction done, saving file')
    cv2.imwrite('image/output_result.png',frame)
    img=cv2.imread('image/output_result.png')
    cv2.imshow("winname",img)
    cv2.waitKey(0)

    print('File saved, completed')

while True:
    valA = btnA.read_digital()
    if valA == 0:
        capture_frame()
        
    time.sleep(0.001)

Here is the result: the detected banana is at ripeness degree 5 out of 8.
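To connect the detection back to the store scenario, the predicted ripeness degree could drive a simple stocking decision. The thresholds below are illustrative assumptions, not part of the trained model:

```python
def stocking_action(ripeness: int, max_degree: int = 8) -> str:
    """Map a ripeness degree (1..max_degree) to a display decision."""
    if not 1 <= ripeness <= max_degree:
        raise ValueError("ripeness out of range")
    if ripeness <= 3:   # still green: leave on the display
        return "keep"
    if ripeness <= 6:   # ripe and sellable: time to plan the reorder
        return "order replacement"
    return "remove from display"  # overripe

print(stocking_action(5))  # the detected degree 5 -> order replacement
```

Hooking this into the inference script would let the gadget suggest an action alongside each detection.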

While this project has certain limitations, I am confident that UNIHIKER will facilitate the implementation of necessary improvements, positioning it as a leading choice for AI-powered gadgets. Hope you find this project useful.


Reference:

Banana ripeness dataset (kaggle.com)

Banana ripeness Roboflow project. https://universe.roboflow.com/unihiker/banana_ripeness-mlog2/dataset/1

 

License
All Rights Reserved