The AI::MXNet::Gluon package is a high-level interface for MXNet designed to be easy to use, while keeping most of the flexibility of a low-level API. AI::MXNet::Gluon supports both imperative and symbolic programming, making it easy to train complex models imperatively in Perl.

Based on the Gluon API specification, the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
Advantages

Simple, Easy-to-Understand Code: Gluon offers a full set of plug-and-play neural network building blocks, including predefined layers, optimizers, and initializers.

Flexible, Imperative Structure: Gluon does not require the neural network model to be rigidly defined, but rather brings the training algorithm and model closer together to provide flexibility in the development process.

Dynamic Graphs: Gluon enables developers to define neural network models that are dynamic, meaning they can be built on the fly, with any structure, and using any of Perl's native control flow.

High Performance: Gluon provides all of the above benefits without impacting the training speed that the underlying engine provides.
Simple, Easy-to-Understand Code
Use plug-and-play neural network building blocks, including predefined layers, optimizers, and initializers:
    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon qw(gluon);

    # Sequential chains layers together: the output of one layer is
    # fed as the input to the next.
    my $net = gluon->nn->Sequential;
    $net->name_scope(sub {
        $net->add(gluon->nn->Dense(256, activation => "relu"));  # first hidden layer
        $net->add(gluon->nn->Dense(256, activation => "relu"));  # second hidden layer
        $net->add(gluon->nn->Dense($num_outputs));  # output layer; $num_outputs is defined elsewhere
    });
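Before training, the network's parameters must be initialized and an optimizer attached via a Gluon trainer. A minimal sketch, assuming the Xavier initializer and plain SGD (the learning rate here is an arbitrary illustrative value):

    # Initialize parameters and attach an SGD optimizer to them.
    $net->initialize(mx->init->Xavier());
    my $trainer = gluon->Trainer(
        $net->collect_params(),
        'sgd',
        { learning_rate => 0.1 }
    );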
Flexible, Imperative Structure

Prototype, build, and train neural networks in a fully imperative manner using the AI::MXNet::AutoGrad package and the Gluon trainer method:
    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon qw(gluon);
    use AI::MXNet::AutoGrad qw(autograd);
    use AI::MXNet::Base;  # provides the zip() helper used below

    # $net and $trainer are defined above; $train_data is a data
    # iterator yielding [data, label] pairs (see the sketch below).
    my $epochs = 10;
    for (1..$epochs)
    {
        for (zip($train_data))
        {
            my ($data, $label) = @$_;
            # Record the forward pass so gradients can be computed.
            autograd->record(sub {
                my $output = $net->($data);
                my $loss = gluon->loss->softmax_cross_entropy($output, $label);
                $loss->backward;
            });
            # One optimizer update, normalized by the batch size.
            $trainer->step($data->shape->[0]);
        }
    }
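Here $loss->backward computes gradients for everything executed inside the recorded scope, and $trainer->step applies a single parameter update scaled by the batch size. The $train_data iterator is assumed to exist; one plausible way to build it is with Gluon's DataLoader over a built-in vision dataset (the MNIST dataset and batch size below are illustrative assumptions, not part of the example above):

    my $train_data = gluon->data->DataLoader(
        gluon->data->vision->MNIST('./data', train => 1),
        batch_size => 32,
        shuffle    => 1
    );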
Dynamic Graphs

Build neural networks on the fly for use cases where neural networks must change in size and shape during model training:
    method forward(GluonClass $F, GluonInput $inputs, GluonInput :$tree)
    {
        # Recurse into each child of the current tree node using
        # plain Perl control flow (map).
        my $children_outputs = [
            map { $self->forward($F, $inputs, $_) } @{ $tree->children }
        ];
        ...
    }
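Because the forward pass is ordinary Perl, the recursion above simply follows the shape of the input: each node calls forward on its children, so the computation graph is rebuilt to match every training example. Tree-structured models of this kind (a Tree LSTM, for instance) are awkward to express in a purely symbolic API but fall out naturally in imperative code.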
High Performance

Easily cache the neural network to achieve high performance by defining your neural network with HybridSequential and calling the hybridize method:
    use AI::MXNet::Gluon::NN qw(nn);

    my $net = nn->HybridSequential;
    $net->name_scope(sub {
        $net->add(nn->Dense(256, activation => "relu"));
        $net->add(nn->Dense(128, activation => "relu"));
        $net->add(nn->Dense(2));
    });
    # Compile and cache the network's computational graph.
    $net->hybridize();
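After hybridize(), the first forward pass traces the network and caches a symbolic graph; subsequent calls reuse that cached graph instead of re-executing Perl code layer by layer. A minimal sketch of exercising the hybridized network (the input shape is an arbitrary illustrative choice):

    use AI::MXNet qw(mx);

    $net->initialize(mx->init->Xavier());
    my $x = mx->nd->ones([16, 512]);  # dummy batch: 16 samples, 512 features
    my $y = $net->($x);               # first call builds and caches the graph
    $y = $net->($x);                  # later calls run the cached graph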