NAME
AI::MXNet::Metric - Online evaluation metric module.
DESCRIPTION
Base class of all evaluation metrics.
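All metrics share the same interface: update() accumulates statistics over
batches of labels and predictions, get() returns the metric's name and its
current value, and reset() clears the accumulated state.
Examples
--------
A minimal sketch of that interface, using the built-in accuracy metric
obtained through the create() helper documented below. Two of the three
predictions are correct, so the reported accuracy should be about 0.667.
>>> $predicts = [mx->nd->array([[0.3, 0.7], [0, 1.], [0.4, 0.6]])]
>>> $labels = [mx->nd->array([0, 1, 1])]
>>> $acc = mx->metric->create('acc')
>>> $acc->update($labels, $predicts)
>>> print $acc->get()
>>> $acc->reset()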
NAME
AI::MXNet::Perplexity
DESCRIPTION
Calculate perplexity.
Parameters
----------
ignore_label : int or undef
Index of the invalid label to ignore when
counting. Usually should be -1. Include
all entries if undef.
axis : int (default -1)
The axis of the prediction along which the
softmax was computed. Defaults to the last
axis.
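Examples
--------
A minimal sketch, constructed in the same style as the PearsonCorrelation
example below. Labels hold class indices and predictions hold per-class
probabilities; with ignore_label => undef all entries are counted, and the
expected value is the exponential of the mean negative log-probability of
the true labels, roughly 1.771 for the data shown.
>>> $predicts = [mx->nd->array([[0.3, 0.7], [0, 1.], [0.4, 0.6]])]
>>> $labels = [mx->nd->array([0, 1, 1])]
>>> $perp = mx->metric->Perplexity(ignore_label => undef)
>>> $perp->update($labels, $predicts)
>>> print $perp->get()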
NAME
AI::MXNet::PearsonCorrelation
DESCRIPTION
Computes Pearson correlation.
Parameters
----------
name : str
Name of this metric instance for display.
Examples
--------
>>> $predicts = [mx->nd->array([[0.3, 0.7], [0, 1.], [0.4, 0.6]])]
>>> $labels = [mx->nd->array([[1, 0], [0, 1], [0, 1]])]
>>> $pr = mx->metric->PearsonCorrelation()
>>> $pr->update($labels, $predicts)
>>> print $pr->get()
('pearson-correlation', '0.421637061887229')
NAME
AI::MXNet::CustomMetric
DESCRIPTION
Custom evaluation metric that takes a sub ref.
Parameters
----------
eval_function : sub ref
Customized evaluation function.
name : str, optional
The name of the metric.
allow_extra_outputs : bool
If true, the prediction outputs can have extra outputs.
This is useful in RNN, where the states are also produced
in outputs for forwarding.
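Examples
--------
A minimal sketch of a custom mean-absolute-error metric, constructed in the
same style as the PearsonCorrelation example above. It assumes the sub is
called with the label and prediction for each output (converted to PDL
arrays, as the built-in metrics do) and returns a single number; the names
$eval_mae and 'custom-mae' are illustrative.
>>> $eval_mae = sub { my ($label, $pred) = @_; my $err = abs($label - $pred); $err->avg }
>>> $mae = mx->metric->CustomMetric(eval_function => $eval_mae, name => 'custom-mae')
>>> $mae->update($labels, $predicts)
>>> print $mae->get()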
create
Create an evaluation metric.
Parameters
----------
metric : str or sub ref
The name of the metric, or a sub ref that computes
the statistic from the label and prediction NDArrays.
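A short sketch of both forms: 'acc' names the built-in accuracy metric, and
a sub ref (here reusing the illustrative $eval_mae sub from the CustomMetric
example above) is used as a custom evaluation function.
>>> $acc = mx->metric->create('acc')
>>> $mae = mx->metric->create($eval_mae)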