Losses for conditional diffusion models #15
Hi @vinayak-sharan,
I am training on CelebHQ with both masks and texts as conditions.
I don't have any logs, but I think for this case (CelebHQ conditioned on masks and texts) you should get decent generation output by 50 epochs.
Hey Tushar, I trained the LDM for 200 epochs and plotted the loss. The VQ-VAE samples are quite good, but the LDM samples are not what I expected :D Looking at the loss over the epochs, I noticed that it starts increasing again after about 100 epochs. That surprises me: since this is the training loss, I would expect overfitting (loss still decreasing), not an increase.
Here are the checkpoints in case you are interested: https://drive.google.com/drive/folders/1N2lRCFKz-fshPs3hzIV7ym_gs9kkYmTT?usp=sharing
I was never able to train for more than 100 epochs (because of compute limitations), but I think the increase in loss should be reduced by adding learning-rate decay, so maybe try that.
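A minimal sketch of what that learning-rate decay could look like, as exponential decay per epoch. The base LR and decay factor below are illustrative assumptions, not values from this repo; in PyTorch the same schedule is available as `torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma)` with `scheduler.step()` called once per epoch.

```python
def decayed_lr(base_lr: float, gamma: float, epoch: int) -> float:
    """Exponential decay: lr at epoch t is base_lr * gamma ** t."""
    return base_lr * gamma ** epoch

# Illustrative values (assumptions, not from the repo's config):
base_lr, gamma = 1e-4, 0.98
for epoch in range(200):
    lr = decayed_lr(base_lr, gamma, epoch)
    # ...set this lr on the optimizer and train one epoch...
```

With gamma = 0.98 the LR drops to roughly 13% of its starting value by epoch 100, which is often enough to stop the late-training loss creep without stalling progress early on.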
Hey, thanks for the videos and code. I am experimenting with conditional LDMs.
Do you happen to have loss plots or logs of the loss? I have a feeling that the loss is decreasing really slowly, or not decreasing at all.
Could you let me know if you saw a similar loss decrease? Here is a screenshot for your reference.
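One thing that makes diffusion losses hard to read from a raw plot is that the per-step loss is very noisy (it depends on the randomly sampled timestep), so a slow decrease can be invisible. A small sketch of smoothing the curve before plotting, assuming you have per-step losses in a list (the window size is an arbitrary choice):

```python
def moving_average(values, window):
    """Smooth a noisy loss curve with a simple sliding-window mean."""
    if window < 1 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window + 1)
    ]

# Usage idea: plt.plot(moving_average(step_losses, 100)) often reveals
# a trend that the raw per-step plot hides.
```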