NAME

AI::MXNet::AutoGrad - Autograd for NDArray.

DESCRIPTION

Automatic gradient computation (autograd) for dynamic graphs, primarily used with Gluon.

SYNOPSIS

use AI::MXNet qw(mx);
my $x = mx->nd->ones([1]);
$x->attach_grad;
my $z;
mx->autograd->record(sub {
    $z = mx->nd->elemwise_add($x->exp, $x);
});
my $dx = mx->autograd->grad($z, $x, create_graph=>1);
ok(abs($dx->asscalar - 3.71828175) < 1e-7);
$dx->backward;
ok(abs($x->grad->asscalar - 2.71828175) < 1e-7);

set_is_training

Set status to training/not training. When training, a graph will be constructed
for gradient computation. Operators will also run with $is_train=1. For example,
Dropout will randomly drop inputs when $is_train=1, but will simply pass them
through when $is_train=0.

Parameters
----------
$is_train: Bool

Returns
-------
previous state before this set.
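
A minimal sketch of toggling the training state imperatively. It assumes the
Dropout operator is reachable as mx->nd->Dropout (an assumption; only used to
illustrate the behavioral difference), and restores the previous state afterwards.

```perl
use AI::MXNet qw(mx);

# Enable training mode; the previous state is returned so it can be restored.
my $prev = mx->autograd->set_is_training(1);

my $x = mx->nd->ones([10]);
# With $is_train=1, Dropout randomly zeroes elements of $x.
my $y = mx->nd->Dropout($x, p=>0.5);

# Restore the earlier state (not training by default).
mx->autograd->set_is_training($prev);

# With $is_train=0, Dropout passes its input through unchanged.
my $z = mx->nd->Dropout($x, p=>0.5);
print $z->aspdl, "\n";
```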

set_is_recording

Set status to recording/not recording. When recording, graph will be constructed
for gradient computation.

Parameters
----------
$is_recording: Bool

Returns
-------
previous state before this set.

is_recording

Get status on recording/not recording.

Returns
-------
Current state of recording.

is_training

Get status on training/predicting.

Returns
-------
Current state of training/predicting.

mark_variables

Mark AI::MXNet::NDArrays as variables to compute gradient for autograd.

Parameters
----------
ArrayRef[AI::MXNet::NDArray] $variables
ArrayRef[AI::MXNet::NDArray] $gradients
GradReq|ArrayRef[GradReq]   :$grad_reqs='write'
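
A short sketch of the lower-level flow; attach_grad appears to wrap this
internally, so marking variables by hand is mainly useful when you want to
supply your own gradient buffer. The buffer and shapes below are illustrative.

```perl
use AI::MXNet qw(mx);

my $x    = mx->nd->ones([2]);
my $grad = mx->nd->zeros([2]);    # buffer that will receive d(y)/d(x)

# Mark $x as a variable; its gradient will be written into $grad.
mx->autograd->mark_variables([$x], [$grad]);

my $y;
mx->autograd->record(sub {
    $y = $x * 2;                  # y = 2x, so dy/dx = 2
});
$y->backward;

print $grad->aspdl, "\n";         # [2 2]
```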

backward

Compute the gradients of heads w.r.t. previously marked variables.

Parameters
----------
$heads: ArrayRef[AI::MXNet::NDArray]
    Output NDArray(s)
:$head_grads=: Maybe[AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray|Undef]]
    Gradients with respect to heads.
:$retain_graph=0: Bool, optional
    Whether to retain graph.
:$train_mode=1: Bool, optional
    Whether to do backward for training or predicting.
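
A hedged sketch of calling backward directly with an explicit head gradient;
the weights in $w are arbitrary illustrative values.

```perl
use AI::MXNet qw(mx);

my $x = mx->nd->array([1, 2, 3]);
$x->attach_grad;

my $y;
mx->autograd->record(sub {
    $y = $x * $x;                 # y = x^2, so dy/dx = 2x
});

# Backpropagate with a head gradient $w: the result is $w * 2x.
my $w = mx->nd->array([1, 1, 2]);
mx->autograd->backward([$y], head_grads => [$w]);

print $x->grad->aspdl, "\n";      # [2 4 12]
```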

compute_gradient

Compute the gradients of outputs w.r.t. variables.

Parameters
----------
outputs: ArrayRef[AI::MXNet::NDArray]

Returns
-------
gradients: ArrayRef[AI::MXNet::NDArray]

grad_and_loss

Return a function that computes both the gradient of the arguments and the loss value.

Parameters
----------
$func: CodeRef
    The forward (loss) function.
$argnum: Maybe[Int|ArrayRef[Int]]
    The index (or indices) of the arguments to calculate gradients for.

Returns
-------
grad_and_loss_func: CodeRef
    A function that would compute both the gradient of arguments and loss value.
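
A sketch under the assumption that, as in the Python autograd API, the wrapped
function returns the gradients (as an array ref, one per argument) followed by
the loss value.

```perl
use AI::MXNet qw(mx);

# Wrap a forward (loss) function.
my $grad_and_loss = mx->autograd->grad_and_loss(sub {
    my ($x) = @_;
    return $x * $x;               # loss = x^2
});

my $x = mx->nd->array([1, 2, 3]);
# Assumed ordering: gradients first, then the loss value.
my ($grads, $loss) = $grad_and_loss->($x);

print $grads->[0]->aspdl, "\n";   # d(x^2)/dx = 2x => [2 4 6]
print $loss->aspdl, "\n";         # [1 4 9]
```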

grad

Compute the gradients of heads w.r.t. variables. Gradients will be
returned as new NDArrays instead of being stored in `$variable->grad`.
Supports recording the gradient graph for computing higher order gradients.

Note: Currently only a very limited set of operators support higher order
gradients.

Parameters
----------
$heads: AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray]
    Output NDArray(s)
$variables: AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray]
    Input variables to compute gradients for.
:$head_grads=: Maybe[AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray|Undef]]
    Gradients with respect to heads.
:$retain_graph=: Bool
    Whether to keep computation graph to differentiate again, instead
    of clearing history and release memory. Defaults to the same value
    as create_graph.
:$create_graph=0: Bool
    Whether to record gradient graph for computing of higher order gradients.
:$train_mode=1: Bool, optional
    Whether to do backward for training or prediction.

Returns
-------
AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray]:
    Gradients with respect to variables.

Examples
--------
>>> $x = mx->nd->ones([1]);
>>> $x->attach_grad();
>>> mx->autograd->record(sub {
        $z = mx->nd->elemwise_add(mx->nd->exp($x), $x);
    });
>>> $dx = mx->autograd->grad($z, $x, create_graph=>1);
>>> print($dx->aspdl);
[3.71828175]
>>> $dx->backward();
>>> print($x->grad->aspdl);
[2.71828175]

train_mode

Executes $sub within an autograd training scope context.

Parameters
----------
$sub: CodeRef

predict_mode

Executes $sub within an autograd predicting scope context.

Parameters
----------
$sub: CodeRef

record

Executes $sub within an autograd recording scope context
and captures code that needs gradients to be calculated.

Parameters
----------
$sub: CodeRef
:$train_mode=1 : Maybe[Bool]

pause

Executes $sub within a scope in which autograd recording is paused:
code inside runs without being recorded, so no gradients are
calculated for it.

Parameters
----------
$sub: CodeRef
:$train_mode=0 : Maybe[Bool]
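
A small sketch showing pause inside a recording scope; the paused computation
(a stand-in for something like a metric update) is excluded from the gradient
graph and does not affect the computed gradients.

```perl
use AI::MXNet qw(mx);

my $x = mx->nd->ones([2]);
$x->attach_grad;

my $y;
mx->autograd->record(sub {
    $y = $x * 2;                      # recorded: dy/dx = 2
    mx->autograd->pause(sub {
        # Not recorded; is_recording is 0 in here.
        my $untracked = $y + 1;
    });
});
$y->backward;

print $x->grad->aspdl, "\n";          # [2 2], unaffected by the paused code
```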

get_symbol

Retrieve recorded computation history as `Symbol`.

Parameters
----------
$x : AI::MXNet::NDArray
    AI::MXNet::NDArray representing the head of the computation graph.

Returns
-------
AI::MXNet::Symbol
    The retrieved Symbol.
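
A minimal sketch of recovering the recorded graph; it assumes the head NDArray
was produced inside a record scope, so there is history to retrieve.

```perl
use AI::MXNet qw(mx);

my $x = mx->nd->ones([2]);
$x->attach_grad;

my $y;
mx->autograd->record(sub {
    $y = $x * 2 + 1;                  # imperative computation to capture
});

# Convert the recorded imperative history into a symbolic graph.
my $sym = mx->autograd->get_symbol($y);
print ref($sym), "\n";
```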