DATA AUGMENTATION OF ELECTRONIC NOSE USING GAUSSIAN NOISE FOR BLACK TEA CLASSIFICATION WITH CONVOLUTIONAL NEURAL NETWORK
Saved in:
Main Author:
Format: Theses
Language: Indonesian
Online Access: https://digilib.itb.ac.id/gdl/view/87575
Institution: Institut Teknologi Bandung
Summary: The quality of tea is determined by attributes such as appearance, aroma, color, and taste, each influenced by chemical compounds including polyphenols and flavonoids. Traditional evaluations by tea testers are subjective and often inconsistent, whereas analytical methods such as HPLC and GC offer high accuracy but at a higher cost. Electronic nose (e-nose) technology provides a cost-effective alternative by mimicking the olfactory system to analyze volatile compounds. Deep learning, particularly convolutional neural networks (CNNs), facilitates end-to-end learning and automated feature extraction, while data augmentation techniques such as Gaussian noise can enhance model performance.
This study investigates whether adding Gaussian noise as a data augmentation technique can improve the learning process of e-nose models or simply introduce irrelevant noise. We evaluated the impact of data augmentation on learning outcomes under various noise levels and compared classification performance with and without noise augmentation to assess effects on model generalization and accuracy. Gaussian noise was applied at six controlled variance levels (0.01, 0.03, 0.05, 0.1, 0.2, and 0.3), and its influence on classification was examined using 1D-CNN and 2D-CNN architectures. Statistical analysis confirms that noise augmentation is effective at lower noise levels, preserving the original data structure while introducing realistic variations. These variations help reduce model bias by expanding the training data distribution.
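A minimal sketch of how variance-controlled Gaussian-noise augmentation of this kind might be implemented is shown below. The function name, data shapes, and the assumption that the e-nose sensor readings are already scaled are illustrative and not taken from the thesis itself.

```python
import numpy as np

def augment_with_gaussian_noise(X, y, variances=(0.01, 0.03, 0.05, 0.1, 0.2, 0.3), seed=0):
    """Create noisy copies of e-nose sensor responses (illustrative sketch).

    X : array of shape (n_samples, n_features) -- sensor readings, assumed pre-scaled.
    y : array of shape (n_samples,)            -- class labels.
    Each variance level contributes one additional noisy copy of the training set.
    """
    rng = np.random.default_rng(seed)
    X_aug, y_aug = [X], [y]
    for var in variances:
        # Zero-mean Gaussian noise; the standard deviation is the square root of the variance level.
        noise = rng.normal(loc=0.0, scale=np.sqrt(var), size=X.shape)
        X_aug.append(X + noise)
        y_aug.append(y)
    return np.concatenate(X_aug, axis=0), np.concatenate(y_aug, axis=0)
```

In practice one would typically augment only the training split, so that the validation and test data remain free of synthetic noise.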
Experimental results reveal that without noise augmentation, the 1D-CNN architecture achieves 87% accuracy. When Gaussian noise is introduced, Zhou's 1D-CNN model maintains consistent accuracy at low to moderate noise levels, reaching up to 96%, indicating robustness against added interference. In contrast, the 2D-CNN model, especially ResNet34 with transfer learning, exhibits its highest performance (98.66% accuracy) at a low noise level (0.01), followed by a gradual decline as noise increases. This finding suggests that elevated noise levels distort data and obscure critical patterns, hindering classification. Overall, Gaussian noise augmentation effectively enhances model learning, although its benefits are diminished at higher noise levels. Comparative analysis of models with and without noise augmentation underscores the importance of carefully calibrating noise levels to maintain a balance between data variability and the integrity of underlying patterns.
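For orientation, a 1D-CNN for e-nose classification can be kept quite small; the sketch below illustrates the general shape of such a model. The layer sizes, sensor count, and class count are placeholders and do not reproduce the architectures evaluated in the thesis.

```python
import torch
import torch.nn as nn

class ENose1DCNN(nn.Module):
    """Illustrative 1D-CNN for multichannel e-nose signals (not the thesis architecture).

    Expected input shape: (batch, n_sensors, n_timesteps).
    """
    def __init__(self, n_sensors=8, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_sensors, 32, kernel_size=5, padding=2),  # temporal convolution per sensor channel
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        z = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(z)         # class logits
```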