0.15
20050206

I've rushed another release out because I found a big fat bug
in feedforward. Instead of applying the transfer function to
the summed node inputs, it was applying it to each input in
turn, then again to the sum. Wups! The network is now more
stable and a lot faster.
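For clarity, the correct behaviour is to sum the weighted inputs
first and apply the transfer function once, to that sum. A minimal
sketch (values and variable names are illustrative, not the module's
internals):

    use strict;
    use warnings;
    use POSIX qw(tanh);

    my @inputs  = (0.5, -0.3, 0.8);    # illustrative values
    my @weights = (0.1,  0.4, -0.2);

    # Sum the weighted inputs first...
    my $sum = 0;
    $sum += $inputs[$_] * $weights[$_] for 0 .. $#inputs;

    # ...then apply the transfer function exactly once, to the sum.
    my $activation = tanh($sum);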

I also altered the debug calls, so in normal use without debug
the overhead is a lot smaller.
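The usual way to get that is to guard each debug call behind the
debug setting, along these lines (the debug attribute name here is
an assumption, not necessarily the module's real one):

    use strict;
    use warnings;

    my $network = { debug => 0 };    # assumed attribute name, for illustration

    # Only build and emit the message when debugging is switched on,
    # so normal runs pay almost nothing for the debug calls.
    if ($network->{debug}) {
        print STDERR "feedforward: applying transfer function\n";
    }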

This version, particularly in use with momentum, is a LOT faster.

###############################################################

0.14
20050201

The backprop calculation was nailed to 1-y*y, which is the
derivative of tanh, so only tanh hidden layers would work.
That's now been removed, and a function called
<activation function>_slope has been implemented in linear & tanh,
which returns the slope. I haven't implemented a slope function in
sigmoid, for the simple reason that I need to learn how to
differentiate the function to find out what the slope is!
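As a sketch of the naming scheme (the bodies are just the standard
derivatives; whether the argument is the node's raw input or its
output is my assumption):

    use strict;
    use warnings;

    # d/dx tanh(x) = 1 - tanh(x)^2, so the slope can be computed
    # straight from the node's output y.
    sub tanh_slope {
        my $y = shift;
        return 1 - $y * $y;
    }

    # A linear unit f(x) = x has a constant slope of 1.
    sub linear_slope {
        return 1;
    }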

This version introduces a new, friendlier constructor. Instead
of the pair of anonymous hashes previously required, the
constructor now takes fairly normal-looking parameters. Layers
are added to a network using the add_layer method. This combines
a more intuitive approach with more flexibility.

The original constructor is not deprecated. Parameter names with
spaces in them ('activation function'), on the other hand, are.
They should be replaced by the same parameter name without any
spaces. The new constructor will convert old-format parameter
names, but at the user's risk.
See xor.pl or the perldoc & readme for examples.
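A sketch of the new style (parameter names here follow the
no-spaces convention and xor.pl as I recall it; the perldoc is
authoritative):

    use strict;
    use warnings;
    use AI::NNFlex;

    # Plain key/value parameters instead of the old pair of
    # anonymous hashes...
    my $network = AI::NNFlex->new(
        learningrate => 0.3,
        momentum     => 0.6,
    );

    # ...then layers are added one at a time.
    $network->add_layer(nodes => 2, activationfunction => 'tanh');
    $network->add_layer(nodes => 2, activationfunction => 'tanh');
    $network->add_layer(nodes => 1, activationfunction => 'linear');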

Cleaned up the perldoc some more. Commented out the method
perldocs, so there is just the single block defining the
distribution's documentation, as advocated on PerlMonks. Method
perldocs in importable modules have not been commented out.

Removed the weight bounding in backprop & momentum. If the network
is going into an unstable state, the weight bounding won't help,
and it causes errors under perl -w.

Implemented tests (apologies to the CPAN testers for not having
done so before!).


#################################################################

0.13
20050121

New plugin training algorithm - momentum.pm
Improvement in speed using momentum on xor, in epochs to
converge (5 runs each):

momentum 0.6:     334     288     124     502    7433    (avg 1736.2)
no momentum:      641     718   20596   19964     660    (avg 8515.8)

Significant, I think you'll agree!
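For reference, the momentum idea is to add a fraction of the
previous weight change to each new one, so updates accumulate along
consistent gradients and roll through shallow regions of the error
surface. A minimal sketch of one weight update (names and values
are illustrative, not the plugin's internals):

    use strict;
    use warnings;

    my $learningrate = 0.3;
    my $momentum     = 0.6;

    my $weight      = 0.10;    # current weight
    my $gradient    = 0.05;    # dE/dw for this weight, from backprop
    my $last_change = 0.02;    # change applied on the previous epoch

    # This step's change = the usual gradient step
    #                      plus a fraction of the previous change.
    my $change = -$learningrate * $gradient + $momentum * $last_change;
    $weight      += $change;
    $last_change  = $change;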

Fixed a bug that allowed training to proceed even if the activation
function(s) couldn't be loaded.

################################################################

0.12
20050116

Fixed a bug in reinforce.pm reported by G M Passos.

Inserted a catch in feedforward->run so that the syntax

$network->run($dataset)

calls the dataset code. Strictly speaking this doesn't fit with
the design, but it's likely to be used quite a bit, so it's worth
catching.
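Presumably something along these lines (the ref check and the
dispatch are my assumptions about the implementation, not quoted
from it):

    use strict;
    use warnings;

    sub run {
        my ($network, $input) = @_;

        # If we've been handed a dataset object rather than an arrayref
        # of input values, let the dataset code run each pattern through
        # the network instead.
        if (ref($input) eq 'AI::NNFlex::Dataset') {
            return $input->run($network);
        }

        # ... normal single-pattern feedforward continues here ...
    }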


###############################################################

0.11
20050115

Added PNG support to AI::NNFlex::draw

Added AI::NNFlex::Dataset.
This creates a dataset object that can be run against a
network.
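For example, an xor-style dataset (the alternating input/target
layout follows the distribution's examples as I understand them,
so treat it as a sketch):

    use strict;
    use warnings;
    use AI::NNFlex::Dataset;

    # Alternating input/target pairs.
    my $dataset = AI::NNFlex::Dataset->new([
        [0,0], [0],
        [0,1], [1],
        [1,0], [1],
        [1,1], [0],
    ]);

    # Run the whole dataset against an existing network object:
    # my $results = $dataset->run($network);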

Added AI::NNFlex::lesion.
Damages a network with a probability of losing a node
or a connection. See the perldoc.
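Roughly like this (the parameter names are an assumption; the
perldoc is authoritative):

    # Assumed parameter names - each node / connection is lost with
    # the given probability; $network is an existing network object.
    $network->lesion(nodes => 0.1, connections => 0.2);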

Cleaned up the POD docs a bit, although there's a lot still
to do.

################################################################