Feature Selection with Mutual Information for Regression Problems
Main Authors:
Format: Conference or Workshop Item
Language: English
Published: 2015
Online Access: http://ir.unimas.my/id/eprint/13447/1/Feature%20Selection%20with%20Mutual%20Information%20%28abstract%29.pdf
http://ir.unimas.my/id/eprint/13447/
Institution: Universiti Malaysia Sarawak
Summary: Selecting relevant features improves the performance of machine learning models. Mutual information (MI) is a widely used criterion for selecting feature subsets from an input dataset whose relationship to the predicted attribute is nonlinear. However, MI estimators suffer from the following limitations: they depend on smoothing parameters; greedy feature-selection methods built on them lack theoretically justified stopping criteria; and although MI can in theory be used for both classification and regression problems, in practice its formulation is more often limited to classification. This paper investigates a proposed improvement addressing these three limitations, through the use of resampling techniques and a formulation of mutual information based on differential entropy for regression problems.
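The kind of MI-based feature ranking the summary describes can be sketched as follows. This is a minimal illustration, not the paper's method: it uses a plain histogram (plug-in) MI estimator, whose bin count is exactly the kind of smoothing parameter the abstract says the estimate depends on. The synthetic data and all function names here are hypothetical.

```python
import math
import random

def mutual_information(xs, ys, bins=8):
    """Histogram-based MI estimate (in nats) between two samples.

    `bins` is a smoothing parameter: the estimate changes with the
    bin count, which is one limitation the abstract points out.
    """
    n = len(xs)

    def bin_index(v, lo, hi):
        if hi == lo:
            return 0
        i = int((v - lo) / (hi - lo) * bins)
        return min(i, bins - 1)  # clamp the maximum value into the last bin

    xlo, xhi = min(xs), max(xs)
    ylo, yhi = min(ys), max(ys)
    joint, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        bx, by = bin_index(x, xlo, xhi), bin_index(y, ylo, yhi)
        joint[(bx, by)] = joint.get((bx, by), 0) + 1
        px[bx] = px.get(bx, 0) + 1
        py[by] = py.get(by, 0) + 1

    # Plug-in estimate: sum over occupied cells of p(x,y) log[p(x,y)/(p(x)p(y))].
    mi = 0.0
    for (bx, by), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log(p_xy * n * n / (px[bx] * py[by]))
    return mi

# Synthetic regression data: y depends on feature 0 only, and
# nonlinearly (y = x0^2 + noise), so a linear correlation measure
# could miss it while MI does not.
random.seed(0)
X = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2000)]
y = [row[0] ** 2 + 0.05 * random.gauss(0, 1) for row in X]

scores = [mutual_information([row[j] for row in X], y) for j in range(3)]
best = max(range(3), key=lambda j: scores[j])
print(best)  # feature 0 carries the most information about y
```

In a greedy forward-selection loop one would repeatedly pick the highest-scoring remaining feature; the abstract's point is that such loops have no theoretically justified stopping criterion with this kind of estimator, which is part of what the paper's resampling-based improvement targets.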