3. The eNNpi Application
eNNpi is an application that exposes all of the functions of the nn class via command-line switches. It can be used on its own or as part of a larger shell script.
In the git repository there are two versions. The version in the Debug directory has the _DEBUG_ preprocessor switch defined. This means that the application will generate a line-by-line commentary of all of the input files as they are read. This is useful to track down the exact line where a format error has occurred.
The Release version does not have _DEBUG_ defined and generates no such output.
The nn class keeps track of a number of version counters, which are reflected in the default file name generated by the -s switch. This provides a unique name that documents when in the training process the file was saved. Since training is iterative and the end point is not known in advance, calling -s during the process progressively saves the network for a subsequent test phase, where the best-performing network can be selected.
The versions are:
Major Version: The major version is incremented each time the topology or a layer is changed. Currently the major version is incremented each time one of the alter switches (-at or -am) is used.
Minor Version: The minor version is incremented each time a network is randomised.
Revision: The revision gets incremented each time a training set is used to train the network.
The default file name used by -s is:
<network Name>_<majorVersion>_<minorVersion>_<revision>.enn
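For illustration, assuming the version counters start at zero for a newly created network, a network created with the name shuttle and saved with -s after each step might produce a sequence of file names such as:
shuttle_0_0_0.enn    # saved immediately after creation
shuttle_0_0_1.enn    # saved after training with the first training set
shuttle_0_0_2.enn    # saved after training with a second training set
Randomising the network (-rand) would then bump the minor version, and altering it (-at or -am) would bump the major version.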
Loads an existing network (-n). The .enn extension is not checked. The content must be as described in the .enn file definition. Normal bash file-naming rules apply (i.e. if there is no path then the current directory, ., is assumed).
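For example, a previously saved network can be loaded and immediately re-saved into another directory (the file name shuttle_0_0_4.enn and the ./checkpoints directory are hypothetical, and ./checkpoints must already exist):
$EXE/eNNpi -n ./shuttle_0_0_4.enn -s ./checkpoints
Here $EXE points at the Release or Debug build directory, as in the scripts further down this page.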
Creates a new, randomised network (-c). The default learning rate is 0.1 and there is no input bias node.
Create takes four arguments (see the example after this list):
%in : the number of input nodes
%hidden : the number of hidden nodes
%out : the number of output nodes
%name : the network name which will form the first part of the default file name (see -s)
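A minimal sketch, using the same node counts and network name as the Shuttle example below, creates a 7-input, 7-hidden, 5-output network and saves it in the current directory:
$EXE/eNNpi -c 7 7 5 shuttle -s .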
Train the network (-t) using the data in the named file. The file must have the format described in the training file format document.
To give the user an idea of how good the training is, a line is sent to stdout on completion of the training sequence containing the error of each output node at the last training pass. For example, for a five-node output:
Last Training Error Vector - 0: 0.0115796 1: -8.34465e-07 2: -0.000260205 3: -0.000303328 4: -0.00119881
A network must be loaded or created before -t is called.
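For example, a saved network can be loaded, trained with one of the Shuttle training sets and then re-saved under its new default name (the loaded file name is hypothetical):
$EXE/eNNpi -n ./shuttle_0_0_0.enn -t ./Set1.tr -s .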
Randomise the link weights and node biases within the network (-rand), thereby starting a new training sequence. A network must be loaded or created before -rand is called.
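A typical pattern is to randomise a loaded network, save the fresh starting point and then train again, so that each attempt is preserved for later comparison (the loaded file name is hypothetical):
$EXE/eNNpi -n ./shuttle_0_0_4.enn -rand -s . -t ./Set1.tr -s .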
Run the network (-r or -run) using the data in the named file. The file must be in the format described in the data file format document.
The output (for a two output node network) will have the following form:
Index: 0 results - 0: 0.654545 1: 0.55241
Index: 1 results - 0: 0.654545 1: 0.55241
Index: 2 results - 0: 0.654545 1: 0.55241
There will be as many rows as there are in the input file and as many columns as there are output nodes.
A network must be loaded or created before -r is called.
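For example, a loaded network can be run against a new input data file and the result vectors captured in a file; quiet mode (-q+, described below) keeps the per-switch progress messages out of the captured output (newData.dat and the loaded file name are hypothetical):
$EXE/eNNpi -q+ -n ./shuttle_0_0_4.enn -r ./newData.dat > ./newData.results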
Test the network (-test) using the training set in the named file. The file must have the format described in the training file format document. The network weights and node biases are not updated by this switch. A network must be loaded or created before -test is called.
Test (for a two-input, two-output network) produces a tab-separated file of the following form:
Index | Input:0 | Input:1 | Desired:0 | Desired:1 | Actual:0 | Actual:1 | Error:0 | Error:1 |
0 | -0.820886 | -0.829642 | 1 | -1 | 0.999741 | 0.999975 | -0.000259101 | 1.99997 |
1 | -0.0327848 | -0.379471 | 1 | -1 | 0.999946 | 0.999994 | -5.37038e-05 | 1.99999 |
Index is the row index from the data file.
Input 0 and Input 1 are the first and second input values for that row.
Desired 0 and 1 are the first and second desired outputs.
Actual 0 and 1 are the first and second values that the network actually produced.
Error 0 and 1 are the differences between Desired 0 and 1 and Actual 0 and 1.
There will be as many rows as there are in the input set, as many Input columns as there are input nodes, and as many Desired, Actual and Error columns as there are output nodes. This file is designed to be loaded into a spreadsheet or database where further analysis can take place.
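For example, a loaded network can be tested against the unseen Shuttle set used in the script at the end of this page and the table captured for a spreadsheet (the loaded file name and the .tsv output name are hypothetical):
$EXE/eNNpi -q+ -n ./shuttle_0_0_4.enn -test ./shuttle7_5_Sets1-4.tr > ./shuttle_0_0_4.tsv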
Save the network using the default file name in the directory named by %path. %path must exist. See Auto Naming of Networks above for details of the default file name. A network must be loaded or created before -s is called.
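Because %path must already exist, a script will typically create the target directory before saving into it (the ./saves directory and the loaded file name are hypothetical):
mkdir -p ./saves
$EXE/eNNpi -n ./shuttle_0_0_4.enn -t ./Set1.tr -s ./saves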
Change the network topology (-at); see the example after this list. On completion the new network will have
%in : input nodes
%hidden : hidden nodes
%out : output nodes
and will be randomised. The major version will be incremented. A network must be loaded or created before -at is called.
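For example, the Shuttle network from the example below is widened from 7 to 8 hidden nodes, and the altered (and re-randomised) network is saved as a new starting point (the loaded file name is hypothetical):
$EXE/eNNpi -n ./shuttle_0_0_4.enn -at 7 8 5 -s .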
Add or remove a bias node from layer 0 (the input layer) with -am. Currently this is the only option of this type. A network must be loaded or created before -am is called.
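Based on the usage text at the end of this page, -am takes the layer number and either biasNode:true or biasNode:false. A sketch of adding an input bias node and saving the result (the loaded file name is hypothetical) would be:
$EXE/eNNpi -n ./shuttle_0_0_4.enn -am 1 biasNode:true -s .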
By default eNNpi prints a message “Done with <switch>” after each switch is executed. Quiet mode suppresses these messages. The -q+ switch turns quiet mode on (i.e. the messages off) and -q- turns the messages back on. Quiet mode can be changed at any time.
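For example, quiet mode can be switched on just for the switches whose output is being captured and then switched off again for the rest of the run (the loaded file name is hypothetical):
$EXE/eNNpi -n ./shuttle_0_0_4.enn -q+ -test ./shuttle7_5_Sets1-4.tr -q- -t ./Set1.tr -s .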
The following example from the Shuttle test data example creates a new network with 7 input, 7 hidden and 5 output nodes. The training data is broken up into 4 sets. The network is saved after creation and after each training set has been used. The network is randomised four times and then the topology is changed to 7 input, 8 hidden and 5 output nodes and all four training sets are run again, and again after two further randomisations.
# the EXE shell variable chooses which executable to use
# the release version
EXE="/home/pi/git/piGit/eNNpi/Release"
# OR
# the debug version
#EXE="/home/pi/git/piGit/eNNpi/Debug"
# uncomment the version that you want and comment the version that you don't want
$EXE/eNNpi -c 7 7 5 shuttle -s . \
    -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -at 7 8 5 -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s . \
    -rand -s . -t ./Set1.tr -s . -t ./Set2.tr -s . -t ./Set3.tr -s . -t ./Set4.tr -s .
This script, also from the Shuttle example, tests each generated network using an unseen training set in quiet mode and sends the output to a file with a .res extension.
EXE="/home/pi/git/piGit/eNNpi/Release" #EXE="$HOME/git/piGit/eNNpi/Debug" for i in `ls *.enn`; do $EXE/eNNpi -q+ -n $i -test ./shuttle7_5_Sets1-4.tr > $i.res done
If the application does not recognise a switch (e.g. -w) it produces the following output:
Don't know that one:-w
-n %file load a network from %file
-t %file training set %file and write the final error vector to standard output
(-r | -run) %file run from input set from inPath and write each result vector to standard output
-rand randomise the network
-test %file run the input patter from training file %file and write the final difference vector to standard output
-s %path save on %path (with no trailing //
-c %i %h %o %n create a new randomised network with %i input nodes %h hidden nodes %o output nodes, called %n (with a learning rate of 0.1)
-at %i %h %o alter the topology to be %i input nodes %h hidden nodes %o output nodes
-am 1 (biasNode:true OR biasNode:false) add or remove an input bias node to layer one