NAME
AI::ANN::Evolver - an evolver for an artificial neural network simulator
VERSION
version 0.008
METHODS
new
AI::ANN::Evolver->new( { mutation_chance => $mutationchance, mutation_amount => $mutationamount, add_link_chance => $addlinkchance, kill_link_chance => $killlinkchance, sub_crossover_chance => $subcrossoverchance, min_value => $minvalue, max_value => $maxvalue } )
All values have a sane default.
mutation_chance is the chance that calling mutate() will add a random value on a per-link basis. It only affects existing (nonzero) links.
mutation_amount is the maximum change that any single mutation can introduce. It affects the result of successful mutation_chance rolls, the maximum weight of a link added by an add_link_chance roll, and the maximum strength of a link that can be deleted by a kill_link_chance roll. A mutation can either add or subtract.
add_link_chance is the chance that, during a mutate() call, each pair of unconnected neurons, or each unconnected neuron => input pair, will spontaneously develop a connection. This should be extremely small, as it is not an overall chance but a chance for each connection that does not yet exist. If you wish to ensure that your neural net does not become recursive, this must be zero.
kill_link_chance is the chance that, during a mutate() call, each pair of connected neurons with a weight less than mutation_amount, or each neuron => input pair with a weight less than mutation_amount, will be disconnected. If add_link_chance is zero, this should also be zero, or your network will just fizzle out.
sub_crossover_chance is the chance that, during a crossover() call, each neuron will, rather than being inherited whole from one parent, have each element within it inherited individually.
min_value is the smallest acceptable weight. It must be less than or equal to zero. If a value would be decremented below min_value, it will instead become an epsilon above min_value, so that we don't accidentally set a weight to zero and thereby kill the link.
max_value is the largest acceptable weight. It must be greater than zero.
gaussian_tau and gaussian_tau_prime are the terms of the Gaussian mutation method (see mutate_gaussian below). They are coderefs which accept one parameter, n, the number of non-zero-weight inputs to the given neuron.
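As a quick illustration, here is a minimal construction sketch. The parameter values below are purely illustrative, not recommended settings; any omitted parameters fall back to their defaults.

    use AI::ANN::Evolver;

    my $evolver = AI::ANN::Evolver->new( {
        mutation_chance      => 0.05,    # per existing (nonzero) link
        mutation_amount      => 0.2,     # maximum +/- change per mutation
        add_link_chance      => 0.0001,  # per currently unconnected pair
        kill_link_chance     => 0.0001,  # per weak (< mutation_amount) link
        sub_crossover_chance => 0.1,     # per neuron, during crossover()
        min_value            => -1,      # must be <= 0
        max_value            => 1,       # must be > 0
    } );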
crossover
$evolver->crossover( $network1, $network2 )
Returns a $network3 consisting of a shuffling of $network1 and $network2. As long as the same neurons in $network1 and $network2 are outputs, $network3 will always have those same outputs. This method, at least when sub_crossover_chance is nonzero, expects neurons to be labeled from zero to n. You probably don't want to use this: it is the least effective way to evolve neural networks, because, due to the hidden intermediate steps, it is possible for two networks to output exactly the same while having completely different internal representations.
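A minimal usage sketch, assuming $network1 and $network2 are compatible AI::ANN networks whose neurons are labeled from zero to n:

    my $network3 = $evolver->crossover( $network1, $network2 );
    # $network3 inherits each neuron (or, with sub_crossover_chance,
    # each element within each neuron) from one parent or the other.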
mutate
$evolver->mutate($network)
Returns a version of $network mutated according to the parameters set for $evolver, followed by a series of counters. The original is not modified. The counters are, in order, the number of times we compared against the following thresholds: mutation_chance, kill_link_chance, add_link_chance. This is useful if you want to normalize your probabilities. For example, if you want links to be killed about as often as they are added, keep a running total of the counters and let: $kill_link_chance = $add_link_chance * $add_link_counter / $kill_link_counter. This will probably make kill_link_chance much larger than add_link_chance, but in doing so it will cause links to be added at roughly the same overall rate as they are killed; see the sketch below. Since new links tend to be killed particularly quickly, it may be wise to add an additional optional multiplier to mutation_amount just for new links.
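Here is a sketch of that normalization loop. It assumes the return list is ($new_network, $mutation_count, $kill_count, $add_count), matching the counter order given above, and that $network is an existing AI::ANN object; since the evolver's parameters are set at construction time, the sketch simply builds a fresh evolver each generation.

    my $add_link_chance  = 0.0001;             # chosen add_link_chance
    my $kill_link_chance = $add_link_chance;   # starting guess
    my ( $add_link_total, $kill_link_total ) = ( 0, 0 );

    for my $generation ( 1 .. 100 ) {
        my $evolver = AI::ANN::Evolver->new( {
            add_link_chance  => $add_link_chance,
            kill_link_chance => $kill_link_chance,
        } );
        my ( $mutant, $mutation_count, $kill_count, $add_count ) =
            $evolver->mutate($network);
        $add_link_total  += $add_count;
        $kill_link_total += $kill_count;

        # Rebalance so links are killed about as often as they are added.
        $kill_link_chance =
            $add_link_chance * $add_link_total / $kill_link_total
            if $kill_link_total;

        $network = $mutant;
    }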
mutate_gaussian
$evolver->mutate_gaussian($network)
Returns a version of $network modified according to the Gaussian mutation rules discussed in X. Yao, "Evolving Artificial Neural Networks", and X. Yao and Y. Liu, "Fast Evolution Strategies". Uses the gaussian_tau and gaussian_tau_prime values from the initializer if they are present, or sane defaults proposed in the papers above. Both are functions of n, the number of inputs to each neuron with nonzero weight.
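A sketch of supplying custom tau functions, assuming gaussian_tau and gaussian_tau_prime are passed to new() like the other parameters. The formulas below are the conventional evolution-strategies choices (tau = 1/sqrt(2*sqrt(n)), tau' = 1/sqrt(2*n)); they are shown for illustration and are not necessarily identical to this module's built-in defaults.

    my $evolver = AI::ANN::Evolver->new( {
        gaussian_tau       => sub { my ($n) = @_; 1 / sqrt( 2 * sqrt($n) ) },
        gaussian_tau_prime => sub { my ($n) = @_; 1 / sqrt( 2 * $n ) },
    } );

    my $mutant = $evolver->mutate_gaussian($network);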
AUTHOR
Dan Collins <DCOLLINS@cpan.org>
COPYRIGHT AND LICENSE
This software is Copyright (c) 2011 by Dan Collins.
This is free software, licensed under:
The GNU General Public License, Version 3, June 2007