Building an Electronic Nose (Olfactory) with MEMS Gas Detection Sensor and Edge Impulse Platform

The electronic nose is an instrument that mimics the human olfactory system and is capable of detecting and identifying odors. Composed of multiple gas sensors and trained with tinyML, it can recognize the unique characteristics of different scents. In this project, I developed an electronic nose using DFRobot MEMS series gas sensors and the Edge Impulse platform that can identify the scents of beverages and fruits. During development, I also ran experiments on predicting fruit decay. After completing the beverage and fruit scent identification experiments, I carried out a series of exploratory tests on other odors to understand the limits of the device's capabilities. Through this project, we confirmed that an electronic nose built with DFRobot MEMS series gas sensors can distinguish between beverages and fruits and has a certain ability to predict fruit decay. Furthermore, we found that under certain conditions the electronic nose can recognize several other scents. We hope this project explores the potential of electronic nose technology and lays the foundation for more widely applicable odor recognition applications in the future.

 

HARDWARE LIST
1 FireBeetle 2 ESP32-E IoT Microcontroller with Header (Supports Wi-Fi & Bluetooth)
1 Gravity: MEMS Gas Sensor (CO, Alcohol, NO2 & NH3) - I2C - MiCS-4514
1 DFRduino Mega2560 (Arduino Mega 2560 R3 Compatible)
1 Gravity: Serial Data Logger for Arduino
1 Gravity: IO Shield for FireBeetle 2 (ESP32-E/M0)
1 Gravity: I2C ADS1115 16-Bit ADC Module (Arduino & Raspberry Pi Compatible)
1 Mini Bread Board Self Adhesive
1 Gravity: I2C SD2405 RTC Module
2 Aluminum Heatsink Cooling Fan for LattePanda V1

Design Process:

 

To achieve the functionality of the electronic nose, I considered both the hardware and software aspects during the design process. After some research, the design scheme is as follows:

 

1. Hardware

 

The sensor is an essential component of the electronic nose. The DFRobot Fermion MEMS series gas sensors use MEMS technology to produce a micro-hotplate on a silicon substrate. The gas-sensitive material is a metal oxide semiconductor with low conductivity in clean air. When the target gas is present in the environment, the conductivity of the sensor changes: the higher the gas concentration, the higher the conductivity. These sensors are structurally robust, and a simple circuit can convert the change in conductivity into an output signal corresponding to the gas concentration. Their small size and low power consumption also make them easy to integrate with other devices.

 

To realize the electronic nose, I needed to collect and analyze odor data, so I chose the DFRobot MEMS series gas sensors to measure the components of an odor and output analog signals. To sample all 10 MEMS sensors at the same time, I used a MEGA2560 as the main board and built a custom MEGA2560 shield to hold the combined sensor array. This array can record the response of each sensor to different odors and identify which sensors respond most strongly. I also used two fans to circulate the gas in the sealed space so that the odor is evenly distributed.

To store the sensor data, I added a serial data logger module, which lets the MEGA2560 record sensor data directly without a computer, ensuring the accuracy and reliability of the data. A minimal example of this logging setup is sketched below.
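The following sketch is only an illustrative outline, not the project's exact code. It assumes the ten MEMS sensors output analog voltages on pins A0–A9 and that the Gravity Serial Data Logger is wired to the MEGA2560's Serial1 port (TX1/RX1); it prints one tab-separated line of readings per second, which the logger records as-is.

CODE
// Minimal MEGA2560 logging sketch (illustrative only).
// Assumptions: ten MEMS sensors on A0-A9, Gravity Serial Data Logger on Serial1 (TX1/RX1).
const byte SENSOR_PINS[10] = {A0, A1, A2, A3, A4, A5, A6, A7, A8, A9};

void setup() {
  Serial.begin(115200);   // USB serial for debugging
  Serial1.begin(9600);    // serial data logger (baud rate depends on the logger's configuration)
}

void loop() {
  for (byte i = 0; i < 10; i++) {
    int raw = analogRead(SENSOR_PINS[i]);   // 0-1023 on the 10-bit ADC
    float voltage = raw * 5.0 / 1023.0;     // convert to volts (5 V reference)
    Serial1.print(voltage, 3);              // record to the data logger
    Serial1.print(i < 9 ? "\t" : "\n");
    Serial.print(voltage, 3);               // mirror to the USB serial monitor
    Serial.print(i < 9 ? "\t" : "\n");
  }
  delay(1000);                              // one row of readings per second
}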

 

 

Electronic nose system based on Mega2560

 

The electronic nose system based on the ESP32-E was designed after the main responding sensors had been identified through the MEGA2560 experiments. The ADS1115 module accurately collects and converts the analog signals, and a breadboard holds the CH4, CO, Odor, and CH2O sensors.

 

 

Electronic nose system based on ESP32

 

2. Software Platform

 

To simplify software development, I chose Edge Impulse as the edge AI platform. Edge Impulse is an edge AI platform for teams building innovative products: it optimizes models and makes them easy to deploy to any edge device, and it is designed to handle real-world sensor data, accelerating product development while minimizing risk. As the platform (edgeimpulse.com) is user-friendly and well documented, I won't go into further detail here.

 

3. Beverage Odor Analysis

 

Many teams online have already used electronic noses to detect beverages with good results, so we first tested beverage odors to verify the basic capability of the electronic nose.

 

Data Collection and Preprocessing

 

For data collection, I chose grape juice and cola, two common beverages, as the experimental objects. I first placed each beverage in a sealed container and let it sit for ten minutes, then inserted the gas sensors into the container and began measuring. I measured each beverage multiple times and saved the data from each measurement on the computer, naming each resulting CSV file after the odor and labeling every row of data.
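For reference, a labeled CSV file might look roughly like this (the column names and values below are purely illustrative, not the project's actual data):

CODE
CH4,CO,Odor,CH2O,H2S,VOC,label
1356.7,162.1,1648.9,381.2,201.4,95.3,cola
1355.7,161.8,1649.9,380.9,200.8,94.7,cola
1226.0,146.6,1492.7,344.8,188.2,90.1,cola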

 

 

 

 

Coke data | Grape juice data

 

From the comparison of the two charts, it can be seen that some gas sensors have obvious differences in output for the two smells.

 

 

 

Coke chart | Grape juice chart

 

 

Odor Data Distribution Map

 

 

Data collected | Train / test split

 

I used the Edge Impulse platform to train the model. The data was uploaded to the platform and each dataset was labeled as "grape juice" or "cola" using the tagging function. By sorting the feature correlations (how important features are for each class compared to all other classes), more important features for beverage odor analysis can be selected.

 

Model training

 

Then, I used the machine learning algorithms provided by the platform to train and optimize the model. After multiple rounds of training and testing, I finally obtained a model that can classify the odors of grape juice and cola. The F1 score is a metric for measuring the performance of a classification model. An F1 score close to 1 indicates that the model performs well, with high precision and recall; an F1 score near 0 indicates that the classification is very poor and the model cannot correctly classify samples.
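For reference, the F1 score is the harmonic mean of precision and recall:

F1 = 2 × (Precision × Recall) / (Precision + Recall)

For example (hypothetical numbers), a classifier with a precision of 0.90 and a recall of 0.93 would have an F1 score of about 0.915.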

 

 

 

Feature importance | Model training performance

 

Experimental results

 

After training and testing, I obtained a relatively accurate model that can automatically identify and classify the odor of grape juice and cola. The model's classification accuracy reached 91.67%, indicating that it can effectively distinguish between these two beverage odors.

 

Model results

 

Based on this experiment, I found that the electronic nose I designed can successfully distinguish between the beverages, which is consistent with the results obtained by most electronic nose teams.

 

4. Fruit Spoilage Odor Analysis

 

An electronic nose made with MEMS series sensors can run on embedded devices for a long time, collect the odor emitted by fruit, and use machine learning technology to analyze and identify the odor produced when fruit spoils. This can help us detect fruit spoilage in a timely manner, thereby reducing waste and loss.

 

Main application:

 

Assessing food quality

 

Refrigerator food odor detection

 

Data collection and preprocessing:

 

For data collection, I chose mangoes as the experimental object because they are a common fruit. The test items included slightly green mangoes and mangoes that had begun to rot. First, the sensors were preheated for five minutes, then the test item was placed in the container and the odor data was measured. After each measurement, the fans ventilated the container for five minutes until the gas inside returned to a clean state. I saved the data from each measurement as a CSV file named after the odor and labeled each row of data.

 

The sensor feature matrix was projected into two dimensions with t-SNE, and the data for the three states (air, fresh mango, spoiled mango) form distinct clusters.

 

 

Sensor data spatial distribution map

 

Model training

 

Using the EON Tuner, models can be trained automatically from the imported sensor data, and the model parameters and complexity can be adjusted to the target device selected in the upper right corner.

 

 

Model List

 

It can be seen that the accuracy of the model trained by EON Tuner is good, but there may be overfitting.

 

 

Model Results

 

Experimental results

 

Import the test set into the model to see the final score.

 

 

Model Test Results

 
Fruit spoilage monitoring and management (using the FireBeetle ESP32-E as an example)

 

Mangoes that have been stored for a long time rot quickly, and black spots appear on the surface. I used a combination of MEMS CH4, CO, Odor, and CH2O sensors for a two-day mango odor measurement. The ESP32-E controller is paired with the ADS1115 module and the RTC module, and the sensor data is read every half hour. Before monitoring, the sensors were preheated for half an hour, and a fresh mango without obvious black spots was placed in the box. Because the nighttime temperature is slightly lower, the overall data curve shows a slight decline at night. After about eight hours, the sensor readings increase significantly, and small black spots gradually appear on the surface of the mango.

 

 

Mango smell data trend chart

 

 

Fresh mango with no obvious black spots on the surface

 

Mango with small black spots on the surface

ESP32 code:

CODE
#include <DFRobot_ADS1115.h>
#include "GravityRtc.h"
#include "Wire.h"

DFRobot_ADS1115 ads(&Wire);
GravityRtc rtc;               // RTC initialization
unsigned long start_time;     // time when the program started

void setup(void)
{
  Serial.begin(115200);
  while (!Serial);
  Serial.println("Edge Impulse Inferencing Demo");
  ads.setAddr_ADS1115(ADS1115_IIC_ADDRESS0);   // 0x48
  ads.setGain(eGAIN_TWOTHIRDS);    // 2/3x gain
  ads.setMode(eMODE_SINGLE);       // single-shot mode
  ads.setRate(eRATE_128);          // 128SPS (default)
  ads.setOSMode(eOSMODE_SINGLE);   // set to start a single conversion
  ads.init();
  delay(20);
  rtc.setup();
  start_time = millis();           // record the start time
  Serial.print("CH4,");
  Serial.print("\t");
  Serial.print("CO,");
  Serial.print("\t");
  Serial.print("Odor");
  Serial.print("\t");
  Serial.println("CH2O");
}

void loop(void)
{
  rtc.read();
  // take a burst of four readings every half hour
  if ((rtc.minute % 30) == 0 && rtc.second == 1) {
    for (byte i = 1; i < 5; i = i + 1) {
      if (ads.checkADS1115()) {
        int16_t adc0, adc1, adc2, adc3;
        adc0 = ads.readVoltage(0);   // CH4
        adc1 = ads.readVoltage(1);   // CO
        adc2 = ads.readVoltage(2);   // Odor
        adc3 = ads.readVoltage(3);   // CH2O

        float sensorValue0 = adc0;
        float sensorValue1 = adc1;
        float sensorValue2 = adc2;
        float sensorValue3 = adc3;

        Serial.print(String(sensorValue0));
        Serial.print("\t");
        Serial.print(String(sensorValue1));
        Serial.print("\t");
        Serial.print(String(sensorValue2));
        Serial.print("\t");
        Serial.println(String(sensorValue3));

        delay(1000);
      }
    }
  }
}

5. Odor Exploration

 

To further understand the capability of the electronic nose, we tested some other items. The main purpose was to test specific odors and understand the sensors' discrimination ability and limitations.

 
Sour: Lemon, Vinegar

 

The sensors respond differently to different types of sour smells.

 

 

Lemon Smell Data Chart

 

 

Vinegar Smell Data Chart

 

Fishy: Prawns

 

After preheating for 5 minutes, put fresh prawns in the box and collect data several times.

 

 

experiment procedure

 

The H2S sensor responds clearly, with similar response curves across measurements.

 

 

Shrimp Smell Data Chart

 

Smelly: Garlic

 

After preheating for 5 minutes, put fresh garlic in the box and collect data several times (10 minutes).

 

 

experiment procedure

 

The Odor (smell) sensor and the VOC (volatile organic compound) sensor both respond clearly to garlic.

 

 

Garlic Smell Data Chart

 

Natural: Mint

 

Mint is a plant whose scent is readily perceived by humans. Testing showed that the sensors are much less sensitive to mint that has not been chopped.

 

 

Mint Smell Data Chart

 

6. Model Deployment and Verification

 

After reviewing and analyzing the historical data, we have a better understanding of the sensors' capabilities and limitations. To enable the device to make judgments locally, we need to optimize the electronic nose model and deploy it on the device for local inference. Below we record the process of deploying the model to the ESP32-E and verifying it.

 

Configuring the environment (Windows)

 

1. Install Python 3 on your host computer.

 

2. Install Node.js v14 or higher on your host computer.

 

3. For Windows users, install the Additional Node.js tools (called Tools for Native Modules on newer versions) when prompted.

 

4. Install the CLI tools via: npm install -g edge-impulse-cli --force

 

You should now have the tools available in your PATH.

 

Data collection

 

The FireBeetle ESP32-E main controller, together with the DFRobot I2C ADS1115 16-bit ADC module, can accurately collect and convert the analog signals.

 

code:

CODE
#include <DFRobot_ADS1115.h>
#include "Wire.h"

DFRobot_ADS1115 ads(&Wire);
unsigned long start_time;  // time when the program started

void setup(void)
{
  Serial.begin(115200);
  while (!Serial);
  Serial.println("Edge Impulse Inferencing Demo");
  ads.setAddr_ADS1115(ADS1115_IIC_ADDRESS0);   // 0x48
  ads.setGain(eGAIN_TWOTHIRDS);    // 2/3x gain
  ads.setMode(eMODE_SINGLE);       // single-shot mode
  ads.setRate(eRATE_128);          // 128SPS (default)
  ads.setOSMode(eOSMODE_SINGLE);   // set to start a single conversion
  ads.init();
  start_time = millis();           // record the start time
  //Serial.print("Elapsedtime(ms):,");
  //Serial.print("\t");
  Serial.print("CH4,");
  Serial.print("\t");
  Serial.print("CO,");
  Serial.print("\t");
  Serial.print("Odor");
  Serial.print("\t");
  Serial.println("CH2O");
}

void loop(void)
{
  if (ads.checkADS1115())
  {
    int16_t adc0, adc1, adc2, adc3;
    adc0 = ads.readVoltage(0);   // CH4
    adc1 = ads.readVoltage(1);   // CO
    adc2 = ads.readVoltage(2);   // Odor
    adc3 = ads.readVoltage(3);   // CH2O
    unsigned long current_time = millis();                    // current time
    unsigned long elapsed_time = current_time - start_time;   // milliseconds elapsed since start

    float sensorValue0 = adc0;
    float sensorValue1 = adc1;
    float sensorValue2 = adc2;
    float sensorValue3 = adc3;

    Serial.print(String(sensorValue0));
    Serial.print("\t");
    Serial.print(String(sensorValue1));
    Serial.print("\t");
    Serial.print(String(sensorValue2));
    Serial.print("\t");
    Serial.println(String(sensorValue3));

    delay(1000);
  }
}

Open Windows PowerShell and enter: edge-impulse-data-forwarder --frequency 1

 

Fill in the sensor name to connect to the platform.

 

 

In the Devices list on the platform, you can see that the ESP32-E device is online.

 

 

On the platform, select Data acquisition, fill in the label, and click Start sampling to begin collecting data. The ratio of the training set to the test set can be chosen according to the situation; in general, the dataset is divided into a training set, used to train the model, and a test set, used to evaluate the model's performance on unseen data.

 

 

Create impulse

 

 

Selecting ESP32 as the target board makes the generated model better matched to the board's computing power.

 

 

 

 

Choosing to represent the output of a machine learning model with 8-bit integers (int8) can significantly reduce the memory and computing resources required to run the model on low-power edge devices, which typically have limited resources. The int8 format uses only one byte per value, while float32 uses four bytes, so int8 can reduce the size of the model weights by roughly a factor of four. In addition, some hardware accelerators are optimized for int8 data, which can provide faster and more efficient model execution. However, compared to float32, int8 may reduce the precision of the model output, so a balance needs to be struck when choosing the data type.
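As a rough illustration of what int8 quantization involves (a minimal sketch, not Edge Impulse's actual implementation), affine quantization maps each float value to one byte using a scale and a zero point:

CODE
#include <stdint.h>
#include <math.h>

// Illustrative affine int8 quantization: q = round(x / scale) + zero_point
// (scale and zero_point are chosen per tensor or per channel during model conversion).
int8_t quantize(float x, float scale, int zero_point) {
  int q = (int)roundf(x / scale) + zero_point;
  if (q > 127) q = 127;     // clamp to the int8 range
  if (q < -128) q = -128;
  return (int8_t)q;
}

// The approximate float value is recovered as x ≈ (q - zero_point) * scale.
float dequantize(int8_t q, float scale, int zero_point) {
  return (q - zero_point) * scale;
}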

 

 

Read the data from the ESP32 and use live classification to predict the category.

 

 

 

 

After the above processing, the model suitable for the ESP32 is ready; we verify it below.

 

Model verification (using the FireBeetle ESP32-E and the Arduino IDE as an example)

 

Load the exported ZIP library in the Arduino IDE (Sketch > Include Library > Add .ZIP Library).

 

Fill the primary sensor data into features[] and upload the code.

 

 

CODE
/* Edge Impulse ingestion SDK
 * Copyright (c) 2022 EdgeImpulse Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software 
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *
 */

/* Includes ---------------------------------------------------------------- */
#include <esp32_fruit_only_inferencing.h>

static const float features[] = {
  1356.7017, 162.0838, 1648.8528, 381.1971, 1560.0804, 186.4678, 1897.0874, 438.4866, 1355.7012, 162.0838, 1649.8533, 381.1971, 1226.0417, 146.5643, 1492.6686, 344.7962
    // copy raw features here (for example from the 'Live classification' page)
    // see https://docs.edgeimpulse.com/docs/running-your-impulse-arduino
};

/**
 * @brief      Copy raw feature data in out_ptr
 *             Function called by inference library
 *
 * @param[in]  offset   The offset
 * @param[in]  length   The length
 * @param      out_ptr  The out pointer
 *
 * @return     0
 */
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

void print_inference_result(ei_impulse_result_t result);

/**
 * @brief      Arduino setup function
 */
void setup()
{
    // put your setup code here, to run once:
    Serial.begin(115200);
    // comment out the below line to cancel the wait for USB connection (needed for native USB)
    while (!Serial);
    Serial.println("Edge Impulse Inferencing Demo");
}

/**
 * @brief      Arduino main function
 */
void loop()
{
    ei_printf("Edge Impulse standalone inferencing (Arduino)\n");

    if (sizeof(features) / sizeof(float) != EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE) {
        ei_printf("The size of your 'features' array is not correct. Expected %lu items, but had %lu\n",
            EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, sizeof(features) / sizeof(float));
        delay(1000);
        return;
    }

    ei_impulse_result_t result = { 0 };

    // the features are stored into flash, and we don't want to load everything into RAM
    signal_t features_signal;
    features_signal.total_length = sizeof(features) / sizeof(features[0]);
    features_signal.get_data = &raw_feature_get_data;

    // invoke the impulse
    EI_IMPULSE_ERROR res = run_classifier(&features_signal, &result, false /* debug */);
    if (res != EI_IMPULSE_OK) {
        ei_printf("ERR: Failed to run classifier (%d)\n", res);
        return;
    }

    // print inference return code
    ei_printf("run_classifier returned: %d\r\n", res);
    print_inference_result(result);

    delay(1000);
}

void print_inference_result(ei_impulse_result_t result) {

    // Print how long it took to perform inference
    ei_printf("Timing: DSP %d ms, inference %d ms, anomaly %d ms\r\n",
            result.timing.dsp,
            result.timing.classification,
            result.timing.anomaly);

    // Print the prediction results (object detection)
#if EI_CLASSIFIER_OBJECT_DETECTION == 1
    ei_printf("Object detection bounding boxes:\r\n");
    for (uint32_t i = 0; i < result.bounding_boxes_count; i++) {
        ei_impulse_result_bounding_box_t bb = result.bounding_boxes[i];
        if (bb.value == 0) {
            continue;
        }
        ei_printf("  %s (%f) [ x: %u, y: %u, width: %u, height: %u ]\r\n",
                bb.label,
                bb.value,
                bb.x,
                bb.y,
                bb.width,
                bb.height);
    }

    // Print the prediction results (classification)
#else
    ei_printf("Predictions:\r\n");
    for (uint16_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        ei_printf("  %s: ", ei_classifier_inferencing_categories[i]);
        ei_printf("%.5f\r\n", result.classification[i].value);
    }
#endif

    // Print anomaly result (if it exists)
#if EI_CLASSIFIER_HAS_ANOMALY == 1
    ei_printf("Anomaly prediction: %.3f\r\n", result.anomaly);
#endif

} 

Compiled memory usage on the ESP32:

 

Sketch uses 332537 bytes (25%) of program storage space. The maximum is 1310720 bytes.

Global variables use 24160 bytes (7%) of dynamic memory, leaving 303520 bytes for local variables. The maximum is 327680 bytes.

 

 

Output

 

Then use the platform's live classification to collect air, fresh mango, and spoiled mango data multiple times.

 

Compare these with the results from Edge Impulse:

 

 

It can be seen that the converted model runs normally on the ESP32, with a recognition rate of 88.16%. Although this is lower than on the computer, basic recognition can still be accomplished.

 

The electronic nose thus identifies fruit spoilage on a microcontroller, providing a practical tool and approach for managing fresh-food storage and reducing cost and waste in the future.

 

Summary

 

This electronic nose project uses DFRobot MEMS gas sensors and the Edge Impulse platform to detect the scents of beverages and fruits. Experiments were conducted to explore whether the electronic nose can distinguish the scents of different items and to study the shelf life of mangoes. The results show that an electronic nose built with DFRobot MEMS gas sensors can distinguish between beverages and fruits. In addition, this project lays the foundation for developing broader scent recognition applications. Some areas could still be expanded and improved, such as testing scent recognition with more types of beverages or other items and then deploying the model firmware onto the main controller. The experiments also show that this series of MEMS sensors is not well suited to smelling weakly volatile herbs, and that changes in temperature and humidity may cause some data drift.

 

Library: https://github.com/polamaxu/smartnose

 

Edge Impulse projects:

 

https://studio.edgeimpulse.com/studio/195877

https://studio.edgeimpulse.com/studio/227099/validation

 

FAQ

Q: Encountering errors while configuring the environment on Windows?

A: https://docs.edgeimpulse.com/docs/~/revisions/WOgRGOTQBrFnmbtextkF/edge-impulse-cli/cli-installation

 

Q: Missing C:\Users\DFRobot\AppData\Roaming\npm\node_modules\edge-impulse-cli\build\cli\daemon.js

A: Locate and change the daemon.js file location.

 

Q: Error: +CategoryInfo : SecurityError: (:) [], ParentContainsErrorRecordException +FullyQualifiedErrorId : UnauthorizedAccess?

A: Open PowerShell as administrator and enter: Set-ExecutionPolicy RemoteSigned

License: All Rights Reserved