AI-driven Web-based Ancillary Lab Assistant | UNO Q & Gemini

HARDWARE LIST
1 Arduino UNO Q 4GB
1 UGREEN 5-in-1 100W USB-C Hub with 4K@60Hz HDMI and 3× USB-A 3.0
1 A4 Tech PK-910H USB Webcam (1080p)
1 Raspberry Pi 15W 5.1V / 3.0A USB-C Power Supply
1 Custom PCB
1 DFRobot Capacitive Fingerprint Sensor (UART)
1 DFRobot Gravity: Electrochemical Alcohol Sensor
1 DFRobot Gravity: 1Kg Weight Sensor Kit - HX711
1 DFRobot Gravity: Geiger Counter Module
1 DFRobot Gravity: Electrochemical Nitrogen Dioxide Sensor
1 Seeed Studio Grove: Integrated Pressure Sensor Kit (MPX5700AP)
1 Seeed Studio Grove: Water Atomization Sensor (Ultrasonic)
1 DFRobot Gravity: GNSS Positioning Module
1 Waveshare 1.28" Round LCD Display Module (GC9A01)
4 Push Button (6 x 6 mm)
1 5 mm Common Anode RGB LED
3 220 Ω Resistor
1 60 mm Petri Dish
1 M2 Screws, Nuts, and Washers
1 USB Buck-Boost Converter Board
1 Jumper Wires
1 Bambu Lab A1 Combo

After hearing about the launch of the brand-new Arduino UNO Q, designed as the first SBC (single-board computer) built around Arduino's philosophy of bridging the gap between professional development tools and approachable workflows, whether you are a novice creating an introductory project or an expert prototyping a complex mechanism rapidly yet reliably, I thought it would be a great opportunity to redesign my previous AI-driven lab assistant project and enable more developers, beginner or expert, to replicate, experiment with, or improve this new AI-based ancillary lab assistant thanks to the built-in Arduino UNO Q features and its beginner-friendly development platform, Arduino App Lab.

As you may know, if you have read one of my previous project tutorials, I prefer building my AIoT projects on the target development boards and environments from scratch, and I enjoy developing unique methods, applications, and mechanisms to collect custom training data and achieve the intended device features, strictly following my methodology of developing proof-of-concept research projects. Nonetheless, in this project, I focused heavily on developing all lab assistant features around the provided UNO Q and Arduino App Lab characteristics, such as the built-in Bricks, the native microprocessor-microcontroller communication procedure, and the Linux-oriented SBC board architecture, to ensure that anyone with a UNO Q can effortlessly replicate and examine this lab assistant without needing a deep understanding of every aspect of this project: coding, web design, neural network training, LLM implementation, 3D modeling, etc. In this regard, I hope this project serves as an entry point for developing research projects, encouraging readers to reverse-engineer the features of this AI-driven lab assistant to gain a deeper understanding of AIoT development on the edge.

While taking inspiration from my previous lab assistant project, I heavily modified the device structure and added many features specific to this iteration, for instance, a unique PCB (UNO Q shield) that connects various lab sensors for conducting LLM-assisted basic lab experiments. After months of hard work, I managed to complete the reimagined AI-driven ancillary lab assistant structure and develop all the features I envisioned on the UNO Q by solely employing the Arduino App Lab development environment, which provides foundational building blocks (Bricks).

🤖 To build the ancillary lab assistant structure:

✍🏻 I designed a unique PCB as a UNO Q shield (hat) to connect the selected lab sensors and create the analog lab assistant interface, including the capacitive fingerprint sensor.

✍🏻 Then, I modeled 3D parts to design the ancillary lab assistant base, containing the USB camera and the analog interface.

✍🏻 Finally, I designed a modular lab sensor ladder, organizing all sensors and secondary experiment tools, to create a compact but easy-to-use instrument.

🤖 To implement all of the ancillary lab assistant features I envisioned, performed by a single Arduino App Lab application:

๐Ÿ› ๏ธ I trained an Edge Impulse object detection model to identify various lab equipment.

๐Ÿ› ๏ธ I programmed the MCU (STM32) to collect real-time sensor information and manage the analog lab assistant interface.

๐Ÿ› ๏ธ I developed a feature-rich web dashboard as the primary user interface and control panel of the lab assistant, hosted directly by the Arduino App Lab.

๐Ÿ› ๏ธ I incorporated Google Gemini to enable the lab assistant to generate LLM-based lessons about the detected lab equipment.

๐Ÿ› ๏ธ Thanks to the built-in background Linux MPU-MCU communication service (Arduino Router), I built the interconnected interface background in Python, handling the data transfer between the web dashboard, the analog interface (MCU), and the Qualcomm QRB (MPU) running the essential App Lab Bricks (Docker containers); database registration, inference running, web dashboard (UI) hosting, etc.

🤖 The finalized ancillary lab assistant allows users to:

🔬 create web dashboard accounts and sign in via fingerprint authentication,

🔬 monitor real-time lab sensor readings via the analog interface or the web dashboard,

🔬 inspect LLM-generated sensor guides and experiment tips for each lab sensor via the web dashboard,

🔬 listen to LLM-generated sensor guides and experiment tips via the built-in browser text-to-speech (TTS) module,

🔬 identify lab equipment via the provided Edge Impulse FOMO object detection model,
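A FOMO model emits small centroid boxes with per-label confidences, so raw results usually need a quick post-processing pass before the label is handed to the lesson generator. The sketch below filters detections by confidence and keeps the best hit per label; the result-dict shape loosely follows the Edge Impulse Linux SDK output, but treat it as an assumption.

```python
# FOMO post-processing sketch: filter raw centroid detections by confidence
# and keep the highest-confidence hit per label. The detection-dict shape
# here is an assumption modeled on Edge Impulse SDK output.

def best_detections(bounding_boxes, min_confidence=0.6):
    """Return {label: detection}, keeping only the best detection per
    label whose confidence ('value') clears the threshold."""
    best = {}
    for box in bounding_boxes:
        if box["value"] < min_confidence:
            continue  # discard low-confidence centroids
        label = box["label"]
        if label not in best or box["value"] > best[label]["value"]:
            best[label] = box
    return best

raw = [
    {"label": "beaker", "value": 0.91, "x": 24, "y": 40, "width": 8, "height": 8},
    {"label": "beaker", "value": 0.55, "x": 80, "y": 12, "width": 8, "height": 8},
    {"label": "pipette", "value": 0.74, "x": 56, "y": 64, "width": 8, "height": 8},
]
print(sorted(best_detections(raw)))  # → ['beaker', 'pipette']
```

Raising `min_confidence` trades missed detections for fewer spurious lesson prompts, a threshold worth tuning against your own validation images.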

🔬 use the predefined equipment questions or enter a specific one to generate AI lessons through Google Gemini,

🔬 access the list of LLM-generated lessons assigned to your account on the web dashboard anytime,

🔬 study LLM-generated lessons by reading them or listening to them via the TTS module.

🤖 To review the code and design files with thorough instructions, you can check the project's GitHub repository or the full project tutorial on Hackster.

License
All Rights Reserved