From 2fe5e0b4f8bdc5df4fc0def5939d8838618ffe1e Mon Sep 17 00:00:00 2001
From: Zhengkai Jiang
Date: Wed, 28 Feb 2024 23:18:26 +0800
Subject: [PATCH] Update README.md

---
 Large-DiT/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/Large-DiT/README.md b/Large-DiT/README.md
index 8bf2834..5b4bcc8 100644
--- a/Large-DiT/README.md
+++ b/Large-DiT/README.md
@@ -3,7 +3,7 @@
 We release the ***Large Diffusion Transformer*** (**L-DiT-3B** & **L-DiT-7B** 🔥), inspired by the methodologies of [LLaMA](https://github.com/facebookresearch/llama) and [DiT](https://github.com/facebookresearch/DiT).
 
 ![image](./assets/sample_t2i.png)
-![image](./assets/sample2.jpg)
+![image](./assets/sample.png)
 
 Compared to DiT-XL/2, our L-DiT-7B:
 
@@ -26,7 +26,7 @@ We observe instability issues during the training of the original DiT, particula
 ### ImageNet 256x256 Samples and Benchmark
 
-<p align="center">
+<p align="center">