
NAME

AI::NNFlex - A customisable neural network simulator

SYNOPSIS

use AI::NNFlex;

my $network = AI::NNFlex->new([array of hashes],{hash of global config});

use AI::NNFlex::Dataset;

my $dataset = AI::NNFlex::Dataset->new([ [INPUTARRAY],[TARGETOUTPUT], [INPUTARRAY],[TARGETOUTPUT]]);

my $sqrError = 10;

while ($sqrError > 0.01)

{

        $sqrError = $dataset->learn($network);

}

$network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY});

my $outputsRef = $dataset->run($network);

DESCRIPTION

AI::NNFlex is intended to be a highly flexible, modular NN framework. It is written entirely in native Perl, so there are essentially no prerequisites. The following modular divisions are made:

        * NNFlex.pm
                the core module. Contains methods to construct and
                lesion the network

        * feedforward.pm
                the network type module. Feedforward is the only type
                currently defined, but others may be created and
                imported at runtime

        * backprop.pm
                the learning algorithm. Backprop is the only algorithm
                currently defined, but others may be created and 
                imported at runtime

        * <activation>.pm
                node activation function. Currently the options are
                tanh, linear & sigmoid.

        * Dataset.pm
                methods for constructing a set of input/output data
                and applying to a network.

The code should be simple enough to use for teaching purposes, but a simpler implementation of a basic backprop network is included in the example file bp.pl. This is derived from Phil Brierley's freely available Java code at www.philbrierley.com.

AI::NNFlex leans towards teaching NN and cognitive modelling applications. Future modules are likely to include more biologically plausible nets, like DeVries & Principe's Gamma model.

CONSTRUCTOR

AI::NNFlex

new ( [ ARRAY OF LAYER HASHES ], { HASH OF CONFIG OPTIONS });

network structure

This should contain sub-hashes for each layer in the network, containing the following options.

        nodes=>NUMBER OF NODES

        decay=>AMOUNT OF ACTIVATION TO DECAY PER TICK

        persistent activation=>TRUE TO RETAIN ACTIVATION BETWEEN
                                TICKS

        random activation=>MAXIMUM VALUE FOR INITIAL ACTIVATION
        

config options

This should contain the global config options for the net. The following are defined:

        random weights=>MAXIMUM VALUE FOR INITIAL WEIGHT

        learning algorithm=>The AI::NNFlex module to import for
                training the net

        networktype=>The AI::NNFlex module to import for flowing
                activation

        debug=>[LIST OF CODES FOR MODULES TO DEBUG]
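Putting the layer and config options together, a network construction call might look like the following sketch. The parameter names are taken from the lists above; the specific values (layer sizes, learning rate) are illustrative only, not recommendations:

```perl
use AI::NNFlex;

# Two layers of 2 and 1 nodes respectively, plus global config
# options for weight initialisation, learning and network type.
my $network = AI::NNFlex->new(
    [
        { nodes => 2, decay => 0.0, 'random activation' => 1 },
        { nodes => 1 },
    ],
    {
        'random weights'     => 1,
        'learning algorithm' => 'backprop',
        networktype          => 'feedforward',
        'learning rate'      => 0.3,
        debug                => [],
    }
);
```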

AI::NNFlex::Dataset

new ( [[INPUT VALUES],[OUTPUT VALUES],[INPUT VALUES],[OUTPUT VALUES],..])

INPUT VALUES

These should be comma-separated values. They can be applied to the network with ::run or ::learn.

OUTPUT VALUES

These are the intended or target output values, also comma-separated. They will be used by ::learn.
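For example, an XOR training set, with each input array followed by its target output array, might be built like this:

```perl
use AI::NNFlex::Dataset;

# Input arrays alternate with their target output arrays.
my $dataset = AI::NNFlex::Dataset->new([
    [0,0], [0],
    [0,1], [1],
    [1,0], [1],
    [1,1], [0],
]);
```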

METHODS

This is a short list of the main methods. For details on all available methods, please see individual pod pages below, and in individual imported modules.

AI::NNFlex

lesion

$network->lesion ({'nodes'=>PROBABILITY,'connections'=>PROBABILITY})

Damages the network.

PROBABILITY

A value between 0 and 1, denoting the probability of a given node or connection being damaged.

Note: this method may be called on a per network, per node or per layer basis using the appropriate object.

AI::NNFlex::Dataset

learn

$dataset->learn($network)

'Teaches' the network the dataset using the network's defined learning algorithm. Returns the squared error.

run

$dataset->run($network)

Runs the dataset through the network and returns a reference to an array of output patterns.
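A sketch of running the dataset and printing the results, assuming $network and $dataset have been constructed as above:

```perl
my $outputsRef = $dataset->run($network);

# Each element of @$outputsRef is itself a reference to an
# array of node outputs for one input pattern.
foreach my $pattern (@$outputsRef) {
    print join(',', @$pattern), "\n";
}
```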

EXAMPLES

See the code in ./examples. For the simplest example, see xor_with_datasets.pl

ACKNOWLEDGEMENTS

Phil Brierley, for his excellent free Java code, which solved my backprop problem

Dr Martin Le Voi, for help with concepts of NN in the early stages

Dr David Plaut, for help with the project that this code was originally intended for.

Graciliano M.P. for suggestions & improved code (see SEE ALSO).

SEE ALSO

AI::NNEasy - Developed by Graciliano M.P. Shares some common code with NNFlex. Much faster, and more suitable for backprop projects with large datasets.

TODO

Lots of things:

clean up the perldocs some more

write gamma modules

write BPTT modules

write a perceptron learning module

speed it up

write a tk gui

CHANGES

v0.11 introduces the lesion method, png support in the draw module and datasets.

COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

CONTACT

charlesc@nnflex.g0n.net

Below are PODs for individual methods

AI::NNFlex::new

constructor - creates new neural network

Takes as parameters an array of hash references; each hash will be created as a layer.

Valid parameters are currently:

        * nodes                 -       number of nodes in the layer

        * decay                 -       float, amount that each node in the
                                        layer will decay with each tick.

        * persistent activation -       whether activation is summed between
                                        ticks.

        * adjust error          -       <NYI>

        * random activation     -       parameter to pass to RAND for random
                                        activation seeding

        Additional parameters may be specified as individual param=>value pairs
        AFTER the layer hash. These may include:

        * random weights        -       parameter to pass to RAND for random
                                        weights seeding
        
        * random connections    -       The /probability/ factor of a connection
                                        being made between two nodes in the
                                        network.

                                        (Note that no support exists at present
                                        for restricting the connectivity at a
                                        layer by layer level, although this may
                                        be done by combining two or more network
                                        objects and handling activation from one
                                        to the next programmatically)

        * learning algorithm    -       the learning algorithm (this must be a
                                        valid compatible perl module).

        * networktype           -       E.g. feedforward. Must be a compatible
                                        perl module.

        * debug                 -       level of debug information
                                        a different debug level is assigned to
                                        each module type, and the debug property is
                                        an ARRAY of which debugs you require.
                                        0 - error
                                        1 - TBD
                                        2 - NNFlex.pm core debug
                                        3 - networktype debug
                                        4 - learning algorithm debug
                                        5 - activation function debug
                                        6 - GUI/Graphic

Plus other custom settings used in networktype & learning algorithm modules, such as:

        * learning rate         -       A constant for use in e.g. backprop learning

Returns a network object that contains $network->{'layers'}, which is an array of 'layer' objects.

The layer object contains a property 'nodes', which is an array of the nodes in that layer. So, programmatically, if you want to access a particular node (or interact with the mesh when writing networktypes and learning algorithms), you can access any node directly using the syntax

$network->{'layers'}->[layer number]->{'nodes'}->[node number]->{property}

(HINT: or use foreach loops over the arrays)
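For instance, a sketch of walking every node in the mesh with nested foreach loops (the 'activation' property name is taken from the node POD below):

```perl
# Iterate over every node in every layer, printing its activation.
foreach my $layer (@{$network->{'layers'}}) {
    foreach my $node (@{$layer->{'nodes'}}) {
        print $node->{'activation'}, "\n";
    }
}
```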


AI::NNFlex::output

$network->output({"output"=>"1"}) returns the activation of layer 1;

otherwise, it returns the activation of the last layer as a reference to an array.

AI::NNFlex::init

Called from AI::NNFlex::new. No external use required, but not defined as local, in case it is needed for debugging.

Init runs through each layer of node objects, creating properties in each node:

        * connectedNodesEast    -       Nodes connected to this node in the layer
                                        to the 'east', i.e. next layer in feedforward
                                        terms.

        * connectedNodesWest    -       Nodes connected to this node in the layer to
                                        the west, i.e. previous layer in feedforward
                                        networks.

These properties are hashes, with the node object acting as a key. Each value is a weight for this connection. This means that you /can/ have connection weights for connections in both directions, since the weight is associated with an incoming connection.

Access with the following syntax:

$node->{'connectedNodesWest'}->{'weights'}->{$connectedNode} = 0.12345

$node->{'connectedNodesWest'}->{'nodes'}->[number] = $nodeObject

Note also that the probability of a connection being created is equal to the numeric value of the global property 'random connections', expressed as a decimal between 0 and 1. If 'random connections' is not specified, all connections will be made.

The connections are /only/ created from west to east. Connections that already exist from west to east are just copied for the 'connectedNodesWest' property.

No return value: the connections are created in the $network object.

The connectedNodesEast & connectedNodesWest properties are handy: because they contain arrays of nodes, you can foreach over them to iterate through the connected nodes in the layer to the east or west.
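As a sketch, iterating over a node's eastward connections and their weights, using the 'nodes' array and 'weights' hash described above:

```perl
# For each node connected to the east of $node, look up the
# weight associated with that connection.
foreach my $connectedNode (@{$node->{'connectedNodesEast'}->{'nodes'}}) {
    my $weight = $node->{'connectedNodesEast'}->{'weights'}->{$connectedNode};
    print "weight: $weight\n";
}
```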

AI::NNFlex::$network->dbug

Internal use. Writes parameter 1 to STDOUT if parameter 2 equals the global variable $DEBUG, or if parameter 2 == 1.

AI::NNFlex::dump_state

$network->dump_state({"filename"=>"test.wts"[, "activations"=>1]});

Dumps the current contents of the node weights to a file.

load_state

Usage: $network->load_state(<filename>);

Initialises the network with the state information (weights and, optionally, activations) from the specified filename.

Note that if you have a file containing activations, but you are not using persistent activation, the activation states of nodes will be reset during $network->run.
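A typical save/restore round trip might look like this (the filename is illustrative):

```perl
# Save weights and activations after training...
$network->dump_state({ 'filename' => 'xor.wts', 'activations' => 1 });

# ...and later restore them into a freshly constructed network
# with the same architecture.
$network->load_state('xor.wts');
```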

AI::NNFlex::lesion

  • Calls node::lesion for each node in each layer

    Lesions a node to emulate damage. Syntax is as follows

    $network->lesion({'nodes'=>.2,'connections'=>.4});

    assigns a .2 probability of a given node being lesioned, and a .4 probability of a given connection being lesioned. Either option can be omitted, but at least one must be given. If you programmatically need to call it with no lesioning to be done, call with a 0 probability of lesioning for one of the options.

    Returns true if successful.

AI::NNFlex::layer

The layer object

AI::NNFlex::layer::new

Create new layer

Takes the parameters from AI::NNFlex::layer and passes them through to AI::NNFlex::node::new

(Uses nodes=>X to decide how many nodes to create)

Returns a layer object containing an array of node objects

AI::NNFlex::layer::layer_output

Receives a reference to a hash of parameters. Valid inputs are

        * layer         -       the layer number you want output from

Returns a reference to an array of outputs

Used by AI::NNFlex::output

AI::NNFlex::layer::lesion

  • Calls node::lesion for each node in the layer

    Lesions a node to emulate damage. Syntax is as follows

    $layer->lesion({'nodes'=>.2,'connections'=>.4});

    assigns a .2 probability of a given node being lesioned, and a .4 probability of a given connection being lesioned. Either option can be omitted, but at least one must be given. If you programmatically need to call it with no lesioning to be done, call with a 0 probability of lesioning for one of the options.

    Returns true if successful.

AI::NNFlex::node

the node object

AI::NNFlex::node::new

Takes parameters passed from NNFlex via AI::NNFlex::layer

Returns a node object containing:

        * activation            -       the node's current activation

        * decay                 -       decay rate

        * adjust error          -       NYI

        * persistent activation -       if true, activation will be summed on
                                        each run, rather than zeroed
                                        before calculating inputs.

        * ID                    -       node identifier (unique across the
                                        NNFlex object)

        * threshold             -       the level above which the node
                                        becomes active

        * activation function   -       the Perl code used as an activation
                                        function. Must perform the calculation
                                        on a variable called $value.

        * active                -       whether the node is active or
                                        not. For lesioning. Set to 1
                                        on creation

 NOTE: it is recommended that nothing programmatic be done with ID. This
 is intended to be used for human reference only.

AI::NNFlex::node::lesion

  • Lesions a node to emulate damage. Syntax is as follows

    $node->lesion({'nodes'=>.2,'connections'=>.4});

    assigns a .2 probability of a given node being lesioned, and a .4 probability of a given connection being lesioned. Either option can be omitted, but at least one must be given. If you programmatically need to call it with no lesioning to be done, call with a 0 probability of lesioning for one of the options.

    Returns true if successful.

    Implemented as a method of node to permit node-by-node or layer-by-layer lesioning.
