Parallel backward implementation #13

Open
fryegg opened this issue Mar 16, 2023 · 0 comments

fryegg commented Mar 16, 2023

Hi @mohammadpz, thanks for your contribution implementing the forward-forward algorithm so simply.
I have two questions about modifying the code. After reviewing the paper, I understand that the forward-forward algorithm updates each layer's weights during its own forward pass, so a layer-local backward pass emerges (forward first layer -> backward first layer -> forward second layer -> backward second layer).
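For concreteness, here is a minimal sketch of that sequential, layer-local scheme, loosely in the spirit of this repo; the class and method names (`FFLayer`, `forward_pass`, `local_loss`) are my own assumptions, not the repo's exact API:

```python
import torch
import torch.nn as nn

class FFLayer(nn.Linear):
    """One forward-forward layer with its own local optimizer (a sketch;
    names and hyperparameters here are assumptions)."""

    def __init__(self, in_features, out_features, threshold=2.0, lr=0.03):
        super().__init__(in_features, out_features)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward_pass(self, x):
        # Length-normalize the input, then linear + ReLU.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-4)
        return torch.relu(super().forward(x))

    def local_loss(self, x_pos, x_neg):
        # Goodness = sum of squared activations; push positive samples
        # above the threshold and negative samples below it.
        g_pos = self.forward_pass(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward_pass(x_neg).pow(2).sum(dim=1)
        return torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold]))).mean()


def train_sequential(layers, x_pos, x_neg):
    # Forward + backward of layer i completes before layer i+1's forward.
    h_pos, h_neg = x_pos, x_neg
    for layer in layers:
        loss = layer.local_loss(h_pos, h_neg)
        layer.opt.zero_grad()
        loss.backward()
        layer.opt.step()
        # Detach so no gradients flow between layers.
        h_pos = layer.forward_pass(h_pos).detach()
        h_neg = layer.forward_pass(h_neg).detach()
```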
I suppose the backward computations in this code could run in parallel, because each layer's local loss (and hence its gradients) is available as soon as the forward passes are done. But,

  1. I don't know whether parallelized backpropagation would be helpful, or any better than the layer-local backpropagation.
  2. If I change this code to compute the backward passes in parallel, do I have to change "loss.backward()"? (See the sketch after this list.)
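To make question 2 concrete, here is a hedged sketch of the parallel variant, reusing the hypothetical `FFLayer` above; this is my reading of the idea, not the repo's implementation:

```python
import torch

def train_parallel(layers, x_pos, x_neg):
    # Compute every layer's local loss first, detaching between layers so
    # each layer's autograd graph stays independent of the others.
    losses = []
    h_pos, h_neg = x_pos, x_neg
    for layer in layers:
        losses.append(layer.local_loss(h_pos, h_neg))
        h_pos = layer.forward_pass(h_pos).detach()
        h_neg = layer.forward_pass(h_neg).detach()

    for layer in layers:
        layer.opt.zero_grad()
    # Because the per-layer graphs are disjoint, summing the local losses
    # and calling backward once is equivalent to calling loss.backward()
    # separately for each layer.
    torch.stack(losses).sum().backward()
    for layer in layers:
        layer.opt.step()
```

If I understand autograd correctly, "loss.backward()" itself would not need to change: you could either keep one backward call per layer, or sum the disjoint local losses and call backward once, as above. One caveat relevant to question 1: in this variant, layer i+1's forward sees activations from layer i's pre-update weights, whereas the sequential scheme forwards through already-updated weights, so the two schemes are not numerically identical.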