NAME
Lugh::Autograd - Automatic differentiation for Lugh tensors
SYNOPSIS
use Lugh;
use Lugh::Autograd;
my $ctx = Lugh::Context->new(mem_size => 16 * 1024 * 1024);
# Create tensors with gradient tracking
my $a = Lugh::Autograd::Tensor->new($ctx, 'f32', 4, { requires_grad => 1 });
my $b = Lugh::Autograd::Tensor->new($ctx, 'f32', 4, { requires_grad => 1 });
$a->set_data(1.0, 2.0, 3.0, 4.0);
$b->set_data(0.5, 1.0, 1.5, 2.0);
# Forward pass
my $c = Lugh::Autograd::Ops->add($ctx, $a, $b);
my $loss = Lugh::Autograd::Ops->sum($ctx, $c);
# Backward pass
$loss->backward();
# Access gradients
my $grad_a = $a->grad; # [1, 1, 1, 1]
my $grad_b = $b->grad; # [1, 1, 1, 1]
# Disable gradient tracking temporarily
Lugh::Autograd::no_grad {
    my $inference_result = Lugh::Autograd::Ops->add($ctx, $a, $b);
    # No gradient tracking, more memory efficient
};
DESCRIPTION
Lugh::Autograd provides automatic differentiation for Lugh tensors, primarily for training neural networks. It records operations in a dynamic computation graph as they run and computes gradients by backpropagating through that graph.
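For example, chaining the operations documented below builds a graph whose backward() call applies the chain rule through every recorded step. A minimal sketch using only calls shown in this document (the gradient values follow from the usual sum and product rules):
use Lugh;
use Lugh::Autograd;
my $ctx = Lugh::Context->new(mem_size => 16 * 1024 * 1024);
my $a = Lugh::Autograd::Tensor->new($ctx, 'f32', 3, { requires_grad => 1 });
my $b = Lugh::Autograd::Tensor->new($ctx, 'f32', 3, { requires_grad => 1 });
$a->set_data(1.0, 2.0, 3.0);
$b->set_data(4.0, 5.0, 6.0);
# loss = sum((a + b) * a) = sum(a^2 + a*b)
my $d    = Lugh::Autograd::Ops->add($ctx, $a, $b);
my $e    = Lugh::Autograd::Ops->mul($ctx, $d, $a);
my $loss = Lugh::Autograd::Ops->sum($ctx, $e);
$loss->backward();
# d(loss)/da = 2*a + b, d(loss)/db = a
print "@{ $a->grad }\n";   # 6 9 12
print "@{ $b->grad }\n";   # 1 2 3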
FUNCTIONS
is_grad_enabled
my $enabled = Lugh::Autograd::is_grad_enabled();
Returns true if gradient tracking is currently enabled.
set_grad_enabled
my $prev = Lugh::Autograd::set_grad_enabled($enabled);
Enables or disables gradient tracking. Returns the previous state so it can be restored later.
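Because the previous state is returned, tracking can be switched off for a stretch of code and then restored, which is what no_grad below wraps up for you. A short sketch, reusing the tensors from the SYNOPSIS:
my $prev = Lugh::Autograd::set_grad_enabled(0);
my $out  = Lugh::Autograd::Ops->add($ctx, $a, $b);   # not recorded in the graph
Lugh::Autograd::set_grad_enabled($prev);             # restore the earlier state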
no_grad
Lugh::Autograd::no_grad {
    # Gradient tracking disabled in this block
    my $result = Lugh::Autograd::Ops->add($ctx, $a, $b);
};
Executes a code block with gradient tracking disabled. The previous gradient tracking state is restored after the block completes.
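For instance, is_grad_enabled reflects the change only inside the block:
print Lugh::Autograd::is_grad_enabled() ? "on\n" : "off\n";       # on
Lugh::Autograd::no_grad {
    print Lugh::Autograd::is_grad_enabled() ? "on\n" : "off\n";   # off
};
print Lugh::Autograd::is_grad_enabled() ? "on\n" : "off\n";       # on again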
CLASSES
Lugh::Autograd::Tensor
Tensor with gradient tracking support.
new($ctx, $type, @dims, \%options)
Creates a new autograd tensor.
my $tensor = Lugh::Autograd::Tensor->new($ctx, 'f32', 10, 20, {
    requires_grad => 1,
});
requires_grad([$value])
Gets or sets whether this tensor requires gradient tracking.
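A short sketch of both forms, assuming a tensor created as in new() above:
$tensor->requires_grad(0);                    # stop tracking this tensor
print "tracked\n" if $tensor->requires_grad;  # prints nothing now
$tensor->requires_grad(1);                    # track it again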
grad()
Returns the gradient as an array reference, or undef if no gradient has been computed.
zero_grad()
Zeros out the accumulated gradient.
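Gradients are accumulated across backward() calls, so a training loop normally clears them between steps. A sketch, reusing the SYNOPSIS tensors and assuming the context has enough memory for each step's graph:
for my $step (1 .. 3) {
    my $c    = Lugh::Autograd::Ops->mul($ctx, $a, $b);
    my $loss = Lugh::Autograd::Ops->sum($ctx, $c);
    $loss->backward();
    # ... update parameters from $a->grad and $b->grad here ...
    $a->zero_grad();
    $b->zero_grad();
}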
backward([@grad_output])
Performs backpropagation from this tensor. For scalar losses, no arguments are needed. For non-scalar outputs, provide the gradient values.
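For a non-scalar output the upstream gradient is given as one value per element; passing all ones is equivalent to backpropagating from a sum of that output. A sketch with the 4-element tensors from the SYNOPSIS, clearing any earlier gradients first:
$a->zero_grad();
$b->zero_grad();
my $c = Lugh::Autograd::Ops->mul($ctx, $a, $b);
$c->backward(1.0, 1.0, 1.0, 1.0);   # same gradients as sum-then-backward
print "@{ $a->grad }\n";            # 0.5 1 1.5 2 (the values of $b)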
is_leaf()
Returns true if this tensor is a leaf, i.e. it was created directly rather than produced as the output of an operation.
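For example, a tensor you construct directly is a leaf, while the result of an operation is not:
my $leaf = Lugh::Autograd::Tensor->new($ctx, 'f32', 2, { requires_grad => 1 });
$leaf->set_data(1.0, 2.0);
my $out = Lugh::Autograd::Ops->add($ctx, $leaf, $leaf);
print $leaf->is_leaf ? "leaf\n" : "not a leaf\n";   # leaf
print $out->is_leaf  ? "leaf\n" : "not a leaf\n";   # not a leaf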
set_data(@values)
Sets the tensor's data values.
get_data()
Gets the tensor's data values as a list.
shape()
Returns the tensor dimensions as a list.
nelements()
Returns the total number of elements.
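The inspection methods above fit together as in this short sketch (it assumes shape() returns the dimensions in the order they were passed to new()):
my $t = Lugh::Autograd::Tensor->new($ctx, 'f32', 2, 3, { requires_grad => 1 });
$t->set_data(1 .. 6);
my @dims = $t->shape;       # (2, 3)
my $n    = $t->nelements;   # 6
my @vals = $t->get_data;    # (1, 2, 3, 4, 5, 6)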
Lugh::Autograd::Ops
Tensor operations that are recorded in the computation graph (when gradient tracking is enabled) so gradients can flow back through them.
add($ctx, $a, $b)
Element-wise addition.
mul($ctx, $a, $b)
Element-wise multiplication.
sum($ctx, $a)
Sums all elements to a scalar.
AUTHOR
LNATION <email@lnation.org>
LICENSE
This is free software; you can redistribute it and/or modify it under the same terms as Perl itself.