
NAME

AI::MXNet::AutoGrad - Autograd for NDArray.

set_is_training

Sets the training status. When training, a computation graph is constructed
so that gradients can be computed. Operators also run with ctx.is_train=True; for example,
Dropout drops inputs randomly when is_train=True but simply passes them through
when is_train=False.
Parameters
----------
is_train: bool
Returns
-------
previous_state: bool
The training state in effect before this call.
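A minimal sketch of toggling the training state, assuming the `mx` alias exported by `use AI::MXNet qw(mx)` and that `mx->autograd` resolves to this class:

```perl
use AI::MXNet qw(mx);

# Enable training mode; the previous state is returned so it can be restored.
my $prev = mx->autograd->set_is_training(1);

# While training, Dropout drops inputs randomly.
my $x = mx->nd->ones([2, 2]);
my $y = mx->nd->Dropout($x, p => 0.5);

# Restore whatever state was active before.
mx->autograd->set_is_training($prev);
```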

mark_variables

Mark AI::MXNet::NDArrays as variables for which autograd should compute gradients.
Parameters
----------
variables: array ref of AI::MXNet::NDArrays
gradients: array ref of AI::MXNet::NDArrays
grad_reqs: array ref of strings
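For illustration (a sketch assuming the `mx` alias from `use AI::MXNet qw(mx)`), each variable is paired with a gradient buffer of the same shape:

```perl
use AI::MXNet qw(mx);

# $dx will receive d(output)/d($x) after a later call to backward().
my $x  = mx->nd->array([[1, 2], [3, 4]]);
my $dx = mx->nd->zeros($x->shape);

# 'write' overwrites the buffer on each backward pass ('add' would accumulate).
mx->autograd->mark_variables([$x], [$dx], ['write']);
```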

backward

Compute the gradients of outputs with respect to the marked variables.
Parameters
----------
outputs: array ref of NDArray
out_grads: array ref of NDArray or undef
retain_graph: bool, defaults to false
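Putting the pieces together (a sketch, assuming the `mx` alias and that operators are recorded while training mode is on):

```perl
use AI::MXNet qw(mx);

mx->autograd->set_is_training(1);

my $x  = mx->nd->array([1, 2, 3]);
my $dx = mx->nd->zeros($x->shape);
mx->autograd->mark_variables([$x], [$dx]);

my $y = $x * $x;                 # y = x^2, recorded while training
mx->autograd->backward([$y]);    # writes dy/dx = 2x into $dx

mx->autograd->set_is_training(0);
```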

compute_gradient

Compute the gradients of outputs with respect to the marked variables.
Parameters
----------
outputs: array ref of NDArray
Returns
-------
gradients: array ref of NDArray
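compute_gradient follows the same pattern as backward but hands the gradients back directly (again a sketch assuming the `mx` alias):

```perl
use AI::MXNet qw(mx);

mx->autograd->set_is_training(1);

my $x  = mx->nd->array([1, 2, 3]);
my $dx = mx->nd->zeros($x->shape);
mx->autograd->mark_variables([$x], [$dx]);

my $y     = $x * 2;
my $grads = mx->autograd->compute_gradient([$y]);  # array ref of NDArray

mx->autograd->set_is_training(0);
```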

grad_and_loss

Return a function that computes both the gradients of the arguments and the loss value.
Parameters
----------
func: a perl sub
The forward (loss) function.
argnum: an int or an array ref of ints
The index (or indices) of the arguments to compute gradients for.
Returns
-------
grad_and_loss_func: a perl sub
A function that computes both the gradients of the arguments and the loss value.
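A hedged sketch of wrapping a loss sub; the return shape of the wrapper (gradients first, then the loss) is assumed here:

```perl
use AI::MXNet qw(mx);

# A simple loss: f(x) = x^2 (elementwise).
my $f = sub { my ($x) = @_; $x * $x };

my $grad_and_loss_func = mx->autograd->grad_and_loss($f);
my ($grads, $loss) = $grad_and_loss_func->(mx->nd->array([1, 2, 3]));
# $grads->[0] holds df/dx = 2x; $loss holds f(x).
```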

grad

Return a function that computes the gradients of the arguments.
Parameters
----------
func: a perl sub
The forward (loss) function.
argnum: an int or an array ref of ints
The index (or indices) of the arguments to compute gradients for.
Returns
-------
grad_func: a perl sub
A function that computes the gradients of the arguments.
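grad behaves like grad_and_loss minus the loss value (a sketch under the same assumptions, with the returned sub assumed to yield an array ref of gradients):

```perl
use AI::MXNet qw(mx);

my $f = sub { my ($x) = @_; $x * $x };

my $grad_func = mx->autograd->grad($f);
my $grads = $grad_func->(mx->nd->array([1, 2, 3]));
# $grads->[0] holds df/dx = 2x.
```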