Thanks for your excellent work and for sharing your code!
I have a question about this code:
```python
log_probs = torch.cat(
    [
        log_add_exp(log_x_start[:, :-1, :] + log_cumprod_at, log_cumprod_bt),
        log_add_exp(log_x_start[:, -1:, :] + log_1_min_cumprod_ct, log_cumprod_ct)
    ],
    dim=1
)
```
Why is log_add_exp used here? Looking forward to your reply!
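For context, a helper named log_add_exp is typically a numerically stable way to compute log(exp(a) + exp(b)), i.e. adding two probabilities while staying in log space. A minimal sketch of such a helper, assuming the usual definition (not copied from this repository):

```python
import torch

def log_add_exp(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Stable log(exp(a) + exp(b)): factor out the larger exponent so
    # neither exp() call overflows for large-magnitude inputs.
    maximum = torch.max(a, b)
    return maximum + torch.log(torch.exp(a - maximum) + torch.exp(b - maximum))

# Adding two probabilities of 0.5 in log space should give log(1.0) = 0.
p = torch.log(torch.tensor(0.5))
print(log_add_exp(p, p).item())  # ≈ 0.0
```

This matches the behavior of the built-in torch.logaddexp; a naive `torch.log(torch.exp(a) + torch.exp(b))` would overflow or underflow for extreme log-probabilities.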
I also notice that in line 212 of diffusion_transformer.py,
`log_qt = self.q_pred(log_x_t, t)` is called inside `q_posterior(self, log_x_start, log_x_t, t)`.
I'm a little confused: why is log_x_t used there rather than log_x_start?
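For what it's worth, in discrete diffusion models the posterior is usually derived with Bayes' rule, which is one reason terms evaluated at x_t appear inside q_posterior. This is the standard identity, not a claim about this specific implementation:

```latex
% Posterior of the discrete diffusion forward process (standard identity):
q(x_{t-1} \mid x_t, x_0)
  = \frac{q(x_t \mid x_{t-1}) \, q(x_{t-1} \mid x_0)}{q(x_t \mid x_0)}
```

The denominator q(x_t | x_0) depends on the observed x_t, so at least one term in the posterior computation naturally takes log_x_t as an input; whether that is the intent of the line 212 call is for the maintainers to confirm.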