Insect detection and monitoring in stored grains using MFCCs and artificial neural network


Bibliographic Details
Main Authors: Santiago, Robert Martin C., Rabano, Stephenn L., Billones, Robert Kerwin D., Calilung, Edwin J., Sybingco, Edwin, Dadios, Elmer P.
Format: text
Published: Animo Repository 2017
Subjects:
Online Access:https://animorepository.dlsu.edu.ph/faculty_research/1340
https://animorepository.dlsu.edu.ph/context/faculty_research/article/2339/type/native/viewcontent
Institution: De La Salle University
Description
Summary: The variability in grain production makes it necessary to have strategic grain storage plans in order to ensure adequate supplies at all times. However, insects in stored grain products cause infestation and contamination, which reduce grain quality and quantity. To prevent these problems, early detection and constant monitoring need to be implemented. Acoustic methods have been established in numerous studies as a viable approach to insect detection and monitoring, using various sound parameterization and classification techniques. The aim of this study is to further demonstrate the efficacy of acoustic methods in pest management, mainly through feature extraction using Mel-frequency cepstral coefficients (MFCCs) and classification using an artificial neural network. The study used sounds from larvae of Sitophilus oryzae (L.), commonly known as the rice weevil, recorded with five different acoustic sensors, in order to show that an artificial neural network can recognize insect sounds regardless of the acoustic sensor used. Network models with varying numbers of hidden-layer nodes were tested in search of the highest attainable accuracy. Results show that the network with 25 hidden-layer nodes provides the best overall performance, with 94.70% accuracy; training, validation, and testing accuracies are 95.10%, 94.00%, and 93.60%, respectively, and the difference in accuracy across all simulations never exceeded 1%. These results show that the proposed method is capable of recognizing insect sounds regardless of the acoustic sensor used, provided that proper acoustic signal preprocessing, feature extraction, and network implementation are performed. © 2017 IEEE.
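
The pipeline described in the abstract (MFCC feature extraction followed by a single-hidden-layer artificial neural network) can be illustrated with a minimal Python sketch. This is not the authors' implementation: it assumes librosa for MFCC computation and scikit-learn's MLPClassifier for the network, and the file paths, number of coefficients, and label encoding are hypothetical placeholders rather than values taken from the paper. Only the 25-node hidden layer mirrors the configuration reported above.

import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split


def mfcc_vector(wav_path: str, n_mfcc: int = 13) -> np.ndarray:
    """Return a fixed-length MFCC feature vector (frame-wise mean) for one clip."""
    signal, sr = librosa.load(wav_path, sr=None)               # keep the native sample rate
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)                                   # one vector per recording


def train_insect_classifier(paths, labels, hidden_nodes: int = 25):
    """Train a one-hidden-layer ANN; 25 nodes mirrors the best result in the abstract."""
    X = np.array([mfcc_vector(p) for p in paths])
    y = np.array(labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(hidden_nodes,), max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)
    return clf, clf.score(X_te, y_te)


# Usage with hypothetical recordings (label 1 = insect sound, 0 = background noise):
# clf, acc = train_insect_classifier(
#     ["clips/weevil_001.wav", "clips/noise_001.wav", ...],
#     [1, 0, ...])

Averaging MFCC frames into one vector per clip is only one simple way to obtain fixed-length inputs for the network; the paper itself does not specify this step, so the preprocessing shown here should be read as an assumption.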