Refactor forward and backward methods to allow passing a batch of data instead of one sample at a time #156

Open
milancurcic opened this issue Aug 8, 2023 · 0 comments
Labels: enhancement (New feature or request)

@milancurcic (Member):

In support of #155.

This will impact the forward and backward methods in:

  • network type
  • layer type
  • dense_layer type
  • conv2d_layer type

Effectively, rather than looping over the samples in a batch inside network % train, we will pass batches of data all the way down to the lowest level, that is, to the forward and backward methods of the dense_layer and conv2d_layer types. Lowering the per-sample loop in this way will also make it possible to implement a batchnorm_layer, which needs to see the whole batch at once to compute its statistics.
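
For illustration, here is a minimal sketch of what a batched forward pass for a dense layer could look like. The function name dense_forward_batch and the array shapes are assumptions for this example, not the library's actual layout, and the activation function is omitted for brevity:

```fortran
! Hypothetical sketch of a batched dense-layer forward pass.
! Assumed shapes (illustrative only):
!   weights(output_size, input_size)
!   biases(output_size)
!   input(input_size, batch_size)
pure function dense_forward_batch(weights, biases, input) result(output)
  real, intent(in) :: weights(:,:), biases(:), input(:,:)
  real :: output(size(weights, 1), size(input, 2))
  integer :: i
  ! One matmul covers the whole batch; no loop over samples.
  output = matmul(weights, input)
  do concurrent (i = 1:size(input, 2))
    output(:,i) = output(:,i) + biases  ! broadcast biases over the batch
  end do
end function dense_forward_batch
```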

It will also potentially allow more efficient matrix multiplications in the dense and conv layers if we replace the intrinsic matmul with a more specialized and efficient routine such as sgemm from some flavor of BLAS or MKL.
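
As a rough illustration of that option, the same batched product could go through sgemm (C := alpha*A*B + beta*C). The subroutine name and shapes below are assumptions for the example, and calling sgemm requires linking against a BLAS library at build time:

```fortran
! Hypothetical sketch: batched dense forward via BLAS sgemm instead of matmul.
subroutine dense_forward_sgemm(weights, input, output)
  real, intent(in)  :: weights(:,:)  ! (output_size, input_size) = (m, k)
  real, intent(in)  :: input(:,:)    ! (input_size, batch_size)  = (k, n)
  real, intent(out) :: output(:,:)   ! (output_size, batch_size) = (m, n)
  integer :: m, n, k
  external :: sgemm
  m = size(weights, 1)
  k = size(weights, 2)
  n = size(input, 2)
  ! output = 1.0 * weights * input + 0.0 * output
  call sgemm('N', 'N', m, n, k, 1.0, weights, m, input, k, 0.0, output, m)
end subroutine dense_forward_sgemm
```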
