demos/fnn/demo.lisp
@@ -33,28 +33,50 @@ MakeNormalArray ← {
∘○ To Verify ○∘
Normally Distributed Array Elements

* (verify-normal-array)

Evaluate this to see an array of normally-distributed floating point numbers. You can optionally pass a set of dimensions to the function to receive an array of the corresponding shape:

(verify-normal-array 2 3)
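
For reference, one common way to produce such normally distributed values is the Box-Muller transform. A minimal Common Lisp sketch of that technique follows; it is an illustration only, not the demo's own generator, which is implemented in APL:

;; Box-Muller sketch: turn two uniform samples into one normal sample.
;; (- 1d0 (random 1d0)) keeps the log argument in (0,1], avoiding log(0).
(defun random-normal ()
  (* (sqrt (* -2d0 (log (- 1d0 (random 1d0)))))
     (cos (* 2 pi (random 1d0)))))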

* (verify-normal-distrib)

Evaluate this to see a vector representing the distribution of values generated by MakeNormalArray. This function optionally takes a length argument to determine the length of the normal vector created:

(verify-normal-distrib 100)

* (verify-plot-normal-distrib)

Evaluate this to see a plot of the value distribution generated by MakeNormalArray. To use fewer elements and see a less normal distribution, try:

(verify-plot-normal-distrib :count 100)

Or try larger counts like 10,000 to see a more normal distribution. You may pass a width argument to control the width of the plot:
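
(verify-plot-normal-distrib :count 1000 :width 40)

This will produce a plot whose rows are no longer than 40 characters.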
(format nil "Distribution of ~a random numbers plotted with width ~a." count width))
#|
-- Functions --
@@ -90,14 +112,19 @@ InitWeightMatrices ← {
∘○ To Verify ○∘
Neural Network Structure

* (verify-network-structure)

Evaluate this to see the printed structure of a random neural network. Optionally, you can set a shape for the network by passing a set of dimensions as arguments, like this:
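
(verify-network-structure 2 5 3)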
"{⎕←display InitNetwork ⍵ ⋄ 'Printed network strucure with base shape ',(⍕⍵),'.'}"
(coerce shape 'vector))))
#|
-- Function --
@@ -161,17 +188,32 @@ DF ← {
∘○ To Verify ○∘
Activation and Neural Network Output

* (verify-activation-output)

Evaluate this to see the output of the LeakyReLU function, which uses its standard leaky parameter of 0.1. You can pass it a series of input values:
(verify-activation-output 10.0d0 -20.0d0 30.0d0)
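
For reference, a minimal Common Lisp sketch of what LeakyReLU computes, assuming the standard definition with the 0.1 leak parameter noted above; the demo's own activation is implemented in APL:

;; LeakyReLU sketch: pass positive inputs through, scale negative ones by LEAK.
(defun leaky-relu (x &optional (leak 0.1d0))
  (if (plusp x) x (* leak x)))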

* (verify-network-output)

Evaluate this to see the printed output of a random neural network. As with (verify-network-structure), you can optionally set a shape:
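
(verify-network-output 2 5 3)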
'Printed output of neural network with shape ',(⍕⍵),'.'}"
(coerce shape 'vector))))
#|
-- Function --
@@ -209,9 +251,23 @@ DF ← {
∘○ To Verify ○∘
Loss Function and Its Derivative

* (verify-loss-convergence)

Evaluate this to see a vector containing the output of the derivative loss function for a given input alongside the output of the algorithmically derived loss function for the same input and a given dx value. For example:
(verify-loss-convergence :input 5 :dx 0.1)
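
The comparison follows the usual finite-difference idea: as dx shrinks, the algorithmically derived value should approach the symbolic derivative. A minimal Common Lisp sketch of that approximation, using a hypothetical helper that is not part of the demo:

;; Approximate the derivative of F at X with a forward difference of step DX.
(defun finite-difference (f x dx)
  (/ (- (funcall f (+ x dx)) (funcall f x)) dx))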

* (verify-loss-convergence-series)

Evaluate this to see the same comparison across a series of dx values, optionally specifying the input:
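
(verify-loss-convergence-series :input 5 :series #(0.1d0 0.01d0 0.001d0))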

* (verify-loss-applied)

Evaluate this to see the loss value for a given input and output to a network. You can set a shape for the network as well as specifying the input and target values:
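
For example, with illustrative argument values:

(verify-loss-applied #(2 5 3) 1 3)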
* (verify-training-output)
Evaluate this to either (1) initialize a neural network if none is stored, or (2) perform an iteration of training upon the stored neural network.
* (verify-training-output-restart)
Evaluate this to start again with a fresh neural network.
As with (verify-loss-applied), (verify-training-output) can take 3 arguments specifying the shape of the network, its input and its target:
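
(verify-training-output #(2 5 3) 1 3)

(The shape, input and target values shown are placeholders.)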
The input and target values will be reshaped into vectors matching the first and last dimensions of the network, respectively. Changing the shape, input or target values when a network exists will cause the network to be rebuilt using those values.
|#
@@ -291,9 +359,9 @@ The input and target values will be reshaped into vectors matching the first and
(defun verify-training-output-restart ()
"Clear the network state of the training output test function."
@@ -381,11 +449,27 @@ The following functions implement tools for importing the MNIST data into arrays
#|
○∘ To Demonstrate ∘○

* (load-digit-training-data)

Evaluate this, followed by...
* (build-digit-network)
...to load the MNIST training data and build a neural network. Then:
* (train-digit-network)
Evaluate this to train the network. Optionally, a count argument may be passed if you wish to train the network on a limited subset of the MNIST training data rather than training on all 60,000 images:
(train-digit-network 2000)
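
Putting the three steps together:

(load-digit-training-data)
(build-digit-network)
(train-digit-network)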
The functions (get-net-state), (get-data-segment) and (set-data-segment) are utility functions which may be useful in analyzing the data passing through the network.
The (build-digit-network) function can optionally be passed a set of intermediate dimensions. For example:
(build-digit-network 12 18)
This will yield a training network with shape 784 12 18 10. The default shape is 784 16 16 10, which has been found to produce good results with the MNIST digit set, but you can use dimensional inputs to experiment with other structures.