NAME
AI::Perceptron - An implementation of a Perceptron
SYNOPSIS
use AI::Perceptron;
DESCRIPTION
This module is intended to show beginners how a single node of a neural network works.
At present, the only supported method of training the weights is the Stochastic Approximation of the Gradient-Descent model.
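As a rough, illustrative sketch of that training rule (not this module's actual code), a single stochastic weight update for one example might look like the following; the hard-threshold activation and the subroutine name are assumptions made for the example.

    # Illustrative only -- not the module's source.  One stochastic
    # gradient-descent style update of the weights for a single
    # training example.  $w->[0] is the threshold, as described below.
    sub update_weights {
        my ( $n, $w, $inputs, $target ) = @_;    # $w, $inputs: array refs
        my $sum = $w->[0];                       # start from the threshold
        $sum += $w->[ $_ + 1 ] * $inputs->[$_] for 0 .. $#$inputs;
        my $output = $sum >= 0 ? 1 : -1;         # hard-threshold activation (assumed)
        my $error  = $target - $output;
        $w->[0] += $n * $error;                  # nudge the threshold
        $w->[ $_ + 1 ] += $n * $error * $inputs->[$_] for 0 .. $#$inputs;
        return $output;
    }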
CONSTRUCTOR
- new( [%args] )
Creates a new perceptron with the following properties:
    Inputs => number of inputs (scalar)
    N      => learning rate (scalar)
    W      => array ref of weights (applied to the inputs)
The number of elements in W must equal the number of inputs plus one, because W[0] is the Perceptron's threshold (so W[1] corresponds to the first input's weight).
The defaults are Inputs => 1, N => 0.05, and randomly initialized weights.
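For example, a two-input perceptron could be constructed as follows; the weight and learning-rate values are made up for illustration.

    use AI::Perceptron;

    # Two inputs, learning rate 0.01, explicit starting weights:
    # W[0] is the threshold, W[1] and W[2] are the input weights.
    my $p = AI::Perceptron->new(
        Inputs => 2,
        N      => 0.01,
        W      => [ 0.0, 0.5, -0.5 ],
    );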
METHODS
- weights( [@W] )
Sets/gets the perceptron's weights. This is useful between training sessions to see if the weights are actually changing. Note again that W[0] is the Perceptron's threshold.
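For instance, you might compare the weights before and after a training run; whether weights() returns a list or an array reference is not spelled out here, so this sketch assumes a list.

    # Snapshot the weights around a training run (assumes a list is returned).
    my @before = $p->weights;
    # ... call train() here ...
    my @after  = $p->weights;
    print "threshold: $before[0] -> $after[0]\n";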
- train( $n, $training_examples )
This uses the Stochastic Approximation of the Gradient-Descent model to adjust the perceptron's weights so that they produce the desired outputs given in the training examples.
Note that this training method may undo the results of previous training!
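A hedged usage sketch follows: the exact structure of the training examples is not described in this document, so the format below (each example as an array ref of inputs followed by the desired output) and the use of $n as a learning rate are assumptions for illustration only.

    # Hypothetical training set for a logical AND gate -- the example
    # format expected by train() is an assumption, not documented here.
    my $examples = [
        [ 0, 0, -1 ],
        [ 0, 1, -1 ],
        [ 1, 0, -1 ],
        [ 1, 1,  1 ],
    ];

    $p->train( 0.01, $examples );          # assuming $n is the learning rate
    print join( ', ', $p->weights ), "\n"; # inspect the adjusted weights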
SEE ALSO
Statistics::LTU, and the ASCII model contained in Perceptron.pm.
REFERENCES
Machine Learning, by Tom M. Mitchell
AUTHOR
Steve Purkis <spurkis@epn.nu>
COPYRIGHT
Copyright (c) 1999, 2000 Steve Purkis. All rights reserved. This package is free software; you can redistribute it and/or modify it under the same terms as Perl itself.