NAME
Algorithm::FeatureSelection - feature-selection measures (pairwise mutual information, information gain) for text mining
SYNOPSIS
use Algorithm::FeatureSelection;
my $fs = Algorithm::FeatureSelection->new();
# feature-class data structure ...
my $features = {
feature_1 => {
class_a => 10,
class_b => 2,
},
feature_2 => {
class_b => 11,
class_d => 32
},
.
.
.
};
# get pairwise mutual information
my $pmi = $fs->pairwise_mutual_information($features);
my $pmi = $fs->pmi($features); # same as above
# get information gain
my $ig = $fs->information_gain($features);
my $ig = $fs->ig($features); # same as above
DESCRIPTION
This library is a Perl implementation of 'Pairwise Mutual Information' and 'Information Gain', two well-known feature-selection methods used in the text-mining field.
METHOD
new()
information_gain( $features )
my $features = {
feature_1 => {
class_a => 10,
class_b => 2,
},
feature_2 => {
class_b => 11,
class_d => 32
},
.
.
.
};
my $fs = Algorithm::FeatureSelection->new();
my $ig = $fs->information_gain($features);
ig( $features )
Short alias for information_gain().
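As a rough guide to what this measure expresses, the standard text-mining definition of information gain for a feature f is IG(f) = H(C) - P(f) H(C|f) - P(not f) H(C|not f). The following is a minimal standalone sketch of that formula over the feature-class count structure shown above; it is illustrative only and is not the module's implementation, whose exact return format may differ.

```perl
use strict;
use warnings;

# Entropy of a hash of label counts: H = -sum p_i * log2(p_i).
sub entropy_of_counts {
    my ($counts) = @_;
    my $total = 0;
    $total += $_ for values %$counts;
    my $h = 0;
    for my $n ( values %$counts ) {
        next unless $n;    # skip zero counts
        my $p = $n / $total;
        $h -= $p * log($p) / log(2);
    }
    return $h;
}

my $features = {
    feature_1 => { class_a => 10, class_b => 2 },
    feature_2 => { class_b => 11, class_d => 32 },
};

# Overall class counts and grand total.
my %class_count;
my $total = 0;
for my $f ( keys %$features ) {
    while ( my ( $c, $n ) = each %{ $features->{$f} } ) {
        $class_count{$c} += $n;
        $total           += $n;
    }
}
my $h_class = entropy_of_counts( \%class_count );

# IG(f) = H(C) - P(f) H(C|f) - P(not f) H(C|not f).
my %ig;
for my $f ( keys %$features ) {
    my %with    = %{ $features->{$f} };
    my %without = %class_count;
    $without{$_} -= $with{$_} for keys %with;
    my $n_with = 0;
    $n_with += $_ for values %with;
    my $p_with = $n_with / $total;
    $ig{$f} = $h_class
        - $p_with * entropy_of_counts( \%with )
        - ( 1 - $p_with ) * entropy_of_counts( \%without );
}
```

With the example counts above, both features split the data into the same two partitions, so they receive the same information gain.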
information_gain_ratio( $features )
my $features = {
feature_1 => {
class_a => 10,
class_b => 2,
},
feature_2 => {
class_b => 11,
class_d => 32
},
.
.
.
};
my $fs = Algorithm::FeatureSelection->new();
my $igr = $fs->information_gain_ratio($features);
igr( $features )
Short alias for information_gain_ratio().
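For orientation, the gain ratio as defined by Quinlan (C4.5) normalizes information gain by the "split information", i.e. the entropy of the feature distribution itself. The sketch below shows that normalization over the same count structure; the $information_gain value is a hypothetical placeholder, and none of this is the module's actual implementation.

```perl
use strict;
use warnings;

my $features = {
    feature_1 => { class_a => 10, class_b => 2 },
    feature_2 => { class_b => 11, class_d => 32 },
};

# Total count per feature, and grand total.
my %feature_count;
my $total = 0;
for my $f ( keys %$features ) {
    for my $n ( values %{ $features->{$f} } ) {
        $feature_count{$f} += $n;
        $total             += $n;
    }
}

# Split information: entropy over P(feature).
my $split_info = 0;
for my $n ( values %feature_count ) {
    my $p = $n / $total;
    $split_info -= $p * log($p) / log(2);
}

my $information_gain = 0.61;    # placeholder, e.g. from information_gain()
my $gain_ratio = $information_gain / $split_info;
```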
pairwise_mutual_information( $features )
my $features = {
feature_1 => {
class_a => 10,
class_b => 2,
},
feature_2 => {
class_b => 11,
class_d => 32
},
.
.
.
};
my $fs = Algorithm::FeatureSelection->new();
my $pmi = $fs->pairwise_mutual_information($features);
pmi( $features )
Short alias for pairwise_mutual_information().
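To illustrate the measure itself: pointwise mutual information between a feature f and a class c is PMI(f, c) = log2( P(f, c) / ( P(f) P(c) ) ), with probabilities estimated from the co-occurrence counts. A minimal standalone sketch, not the module's implementation:

```perl
use strict;
use warnings;

my $features = {
    feature_1 => { class_a => 10, class_b => 2 },
    feature_2 => { class_b => 11, class_d => 32 },
};

# Marginal counts for features and classes, plus the grand total.
my ( %feature_count, %class_count );
my $total = 0;
for my $f ( keys %$features ) {
    while ( my ( $c, $n ) = each %{ $features->{$f} } ) {
        $feature_count{$f} += $n;
        $class_count{$c}   += $n;
        $total             += $n;
    }
}

# PMI(f, c) = log2( P(f, c) / ( P(f) * P(c) ) ).
my %pmi;
for my $f ( keys %$features ) {
    for my $c ( keys %{ $features->{$f} } ) {
        my $p_fc = $features->{$f}{$c} / $total;
        my $p_f  = $feature_count{$f} / $total;
        my $p_c  = $class_count{$c} / $total;
        $pmi{$f}{$c} = log( $p_fc / ( $p_f * $p_c ) ) / log(2);
    }
}
```

A positive PMI means the feature and class co-occur more often than independence would predict; a negative PMI means less often.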
entropy(HASH|ARRAY)
Calculates the entropy of the given HASH or ARRAY data.
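For reference, this is the quantity entropy computes: H = -sum p_i * log2(p_i), with p_i estimated from the counts. A minimal standalone sketch for the hash-of-counts case, assuming that input shape; it is illustrative and not the module's code.

```perl
use strict;
use warnings;

# Entropy of a hash of label counts, in bits.
sub entropy_of_counts {
    my ($counts) = @_;    # hashref: label => count
    my $total = 0;
    $total += $_ for values %$counts;
    my $h = 0;
    for my $n ( values %$counts ) {
        next unless $n;    # 0 * log(0) is taken as 0
        my $p = $n / $total;
        $h -= $p * log($p) / log(2);
    }
    return $h;
}

# A uniform two-class distribution has entropy of exactly 1 bit.
my $h = entropy_of_counts( { class_a => 5, class_b => 5 } );
```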
AUTHOR
Takeshi Miki <miki@cpan.org>
SEE ALSO
LICENSE
This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself.