Hi @mohammadpz, thanks for your contribution — a simple implementation of the forward-forward algorithm.

I have two questions about modifying the code. From my reading of the paper, the forward-forward algorithm updates each layer's weights during the forward pass, so backpropagation happens locally per layer (forward first layer -> backward first layer -> forward second layer -> backward second layer).

I suppose the backward computations could be parallelized in this code, since every layer's gradients are available once its forward pass has finished. However, I don't know whether parallelized backpropagation would actually be faster or better than the layer-local backpropagation.

If I change the code to compute the backward passes in parallel, do I also have to change `loss.backward()`?
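To make the question concrete, here is a minimal sketch of the layer-wise training scheme being described — each layer computes its own "goodness" loss and calls its own local `loss.backward()` before the next layer runs. This is an illustrative example, not the repository's actual code; the class name `FFLayer`, the goodness threshold, and the hyperparameters are assumptions for the sketch. Because each layer receives a detached input, its backward pass can never reach earlier layers, which is what makes deferring or batching the backward calls possible in principle:

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """Hypothetical single forward-forward layer with a local optimizer."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.relu = nn.ReLU()
        self.threshold = threshold  # assumed goodness threshold
        self.opt = torch.optim.SGD(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the input carries information.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-4)
        return self.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation per sample.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        # Push positive goodness above the threshold, negative below it.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()   # local backward: gradients stop at this layer
        self.opt.step()
        # Detach outputs so the next layer's backward never reaches this one.
        return (self.forward(x_pos).detach(),
                self.forward(x_neg).detach(),
                loss.item())

# Greedy layer-by-layer loop: forward + backward per layer, in order.
torch.manual_seed(0)
layers = [FFLayer(784, 64), FFLayer(64, 64)]
x_pos = torch.randn(8, 784)  # dummy positive batch
x_neg = torch.randn(8, 784)  # dummy negative batch
for layer in layers:
    x_pos, x_neg, loss = layer.train_step(x_pos, x_neg)
```

Since every `train_step` consumes detached inputs, one could instead run all forward passes first, collect the per-layer losses, and only then trigger the backward passes (e.g. sum the losses and call `backward()` once); the gradients would be identical to the sequential scheme within a single step, which is, as I understand it, the parallelization you are asking about.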