Multilayer Perceptron Neural Network In Classifying Gender Using Fingerprint Global Level Features
Main Authors:
Format: Article
Language: English
Published: Indian Society Of Education And Environment & Informatics Publishing Limited, 2016
Subjects:
Online Access:
http://eprints.utem.edu.my/id/eprint/17251/1/Multilayer%20Perceptron%20Neural%20Network%20In%20Classifying%20Gender%20Using%20Fingerprint%20Global%20Level%20Features.pdf
http://eprints.utem.edu.my/id/eprint/17251/
http://www.indjst.org/index.php/indjst/article/view/84889
Institution: Universiti Teknikal Malaysia Melaka
Summary: Background/Objective: A new algorithm for gender classification from fingerprints is proposed, based on Acree's 25 mm² square area. Classification is achieved by extracting global features from fingerprint images, namely Ridge Density, Ridge Thickness to Valley Thickness Ratio (RTVTR), and White Lines Count. The objective of this study is to test the effectiveness of the new algorithm by examining its classification rate. A Multilayer Perceptron Neural Network (MLPNN) is used as the classifier. Methods: The new algorithm is tested on a database of 3000 fingerprints, of which 1430 were male and 1570 were female. The classification stage is evaluated with different test options. Findings: The study found that females tend to have a higher Ridge Density, a higher White Lines Count, and a higher RTVTR than males, consistent with previous studies. We can therefore conclude that the new algorithm is efficient and effective in classifying gender. Conclusion: An overall classification rate of 97.25% was achieved.
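Below is a minimal sketch of the pipeline the summary describes: a Multilayer Perceptron trained on the three global features (Ridge Density, RTVTR, White Lines Count). The feature distributions, the network architecture (one hidden layer of 10 units), and the single hold-out split are illustrative assumptions, not the authors' configuration; the data is synthetic, generated only so the example runs end to end.

```python
# Sketch of gender classification from three global fingerprint features
# using an MLP. All numbers below are assumptions for illustration, not
# the paper's data or architecture.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic feature vectors: [ridge_density, rtvtr, white_lines_count].
# Following the paper's finding, female samples are drawn with higher
# means on all three features than male samples.
n_male, n_female = 1430, 1570
male = np.column_stack([
    rng.normal(12.0, 1.5, n_male),   # ridges per 25 mm^2 (assumed scale)
    rng.normal(0.8, 0.15, n_male),   # ridge/valley thickness ratio
    rng.poisson(2.0, n_male),        # white lines count
])
female = np.column_stack([
    rng.normal(14.0, 1.5, n_female),
    rng.normal(1.1, 0.15, n_female),
    rng.poisson(4.0, n_female),
])
X = np.vstack([male, female])
y = np.array([0] * n_male + [1] * n_female)  # 0 = male, 1 = female

# One "test option": a single stratified hold-out split; the paper
# evaluates several test options.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Standardize features, then train a small MLP classifier.
scaler = StandardScaler().fit(X_tr)
mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
mlp.fit(scaler.transform(X_tr), y_tr)

print(f"classification rate: {mlp.score(scaler.transform(X_te), y_te):.2%}")
```

Because the features are synthetic, the printed classification rate will not match the paper's reported 97.25%; the sketch only shows the shape of the feature-extraction-then-MLPNN workflow.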