Refactor forward and backward methods to allow passing a batch of data instead of one sample at a time #156
Labels: enhancement (New feature or request)
In support of #155.
This will impact the `forward` and `backward` methods in:

- `network` type
- `layer` type
- `dense_layer` type
- `conv2d_layer` type

Effectively, rather than looping over each sample in a batch inside of `network % train`, we will pass batches of data all the way down to the lowest level, that is, the `forward` and `backward` methods of the `dense_layer` and `conv2d_layer` types. Pushing the per-sample loop down the stack will also allow the implementation of a `batchnorm_layer` (see the sketch below).
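
For illustration, here is a minimal sketch of what a batched `forward` could look like on the `dense_layer` type, assuming the batch is passed as the trailing dimension of a rank-2 array and that the stored `z` and `output` arrays become rank 2 as well. The component names and the placeholder activation are assumptions for the sketch, not the actual neural-fortran internals:

```fortran
module dense_batch_sketch
  implicit none

  ! Minimal stand-in for the dense_layer type; the real type has
  ! more components (activation, gradients, etc.).
  type :: dense_layer
    real, allocatable :: weights(:,:)  ! (inputs, neurons)
    real, allocatable :: biases(:)     ! (neurons)
    real, allocatable :: z(:,:)        ! (neurons, batch_size), now rank 2
    real, allocatable :: output(:,:)   ! (neurons, batch_size), now rank 2
  contains
    procedure :: forward => forward_batch
  end type dense_layer

contains

  pure subroutine forward_batch(self, input)
    ! Batched forward pass: input carries the whole batch, with the
    ! batch as the trailing dimension so each sample is a contiguous
    ! column in memory.
    class(dense_layer), intent(in out) :: self
    real, intent(in) :: input(:,:)  ! (inputs, batch_size)
    integer :: i
    ! One matrix-matrix product for the whole batch, replacing the
    ! per-sample matrix-vector products currently looped over in
    ! network % train.
    self % z = matmul(transpose(self % weights), input)
    ! Add the biases to each column (each sample in the batch).
    do concurrent (i = 1:size(input, dim=2))
      self % z(:,i) = self % z(:,i) + self % biases
    end do
    self % output = max(self % z, 0.)  ! placeholder activation (ReLU)
  end subroutine forward_batch

end module dense_batch_sketch
```

With this layout the per-sample loop collapses into a single `matmul`, which is also what opens the door to the BLAS replacement below.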
It will also potentially allow more efficient `matmul`s in dense and conv layers if we replace the stock `matmul` with a more specialized and efficient `sgemm` or similar from some flavor of BLAS or MKL.
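
To make the BLAS point concrete, here is a hedged sketch of the same batched product expressed as a single `sgemm` call; the wrapper name and the assumption of an external single-precision BLAS (linked with, e.g., `-lblas` or MKL) are illustrative only:

```fortran
module dense_sgemm_sketch
  implicit none
contains

  subroutine dense_forward_sgemm(weights, input, z)
    ! Computes z = weights^T * input for the whole batch in one
    ! sgemm call, standing in for the stock matmul.
    real, intent(in) :: weights(:,:)  ! (inputs, neurons)
    real, intent(in) :: input(:,:)    ! (inputs, batch_size)
    real, intent(out) :: z(:,:)       ! (neurons, batch_size)
    integer :: m, n, k
    external :: sgemm
    m = size(weights, 2)  ! neurons
    n = size(input, 2)    ! batch size
    k = size(weights, 1)  ! inputs
    ! z := 1.0 * weights^T * input + 0.0 * z
    call sgemm('T', 'N', m, n, k, 1.0, weights, k, input, k, 0.0, z, m)
  end subroutine dense_forward_sgemm

end module dense_sgemm_sketch
```

Because the whole batch goes through one GEMM instead of `batch_size` matrix-vector products, an optimized BLAS can exploit cache blocking and vectorization that the per-sample loop cannot.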